On a Vectorized Version of a Generalized Richardson Extrapolation Process

(1605.02630)
Published May 9, 2016 in math.NA and cs.NA

Abstract

Let $\{\mathbf{x}_m\}$ be a vector sequence that satisfies
$$\mathbf{x}_m \sim \mathbf{s} + \sum_{i=1}^{\infty} \alpha_i\, \mathbf{g}_i(m) \quad \text{as } m\to\infty,$$
$\mathbf{s}$ being the limit or antilimit of $\{\mathbf{x}_m\}$ and $\{\mathbf{g}_i(m)\}_{i=1}^{\infty}$ being an asymptotic scale as $m\to\infty$, in the sense that
$$\lim_{m\to\infty} \frac{\|\mathbf{g}_{i+1}(m)\|}{\|\mathbf{g}_i(m)\|} = 0, \quad i=1,2,\ldots.$$
The vector sequences $\{\mathbf{g}_i(m)\}_{m=0}^{\infty}$, $i=1,2,\ldots,$ are known, as well as $\{\mathbf{x}_m\}$. In this work, we analyze the convergence and convergence acceleration properties of a vectorized version of the generalized Richardson extrapolation process that is defined via the equations
$$\sum_{i=1}^{k} \langle \mathbf{y}, \Delta\mathbf{g}_i(m) \rangle\, \widetilde{\alpha}_i = \langle \mathbf{y}, \Delta\mathbf{x}_m \rangle, \quad n \leq m \leq n+k-1; \qquad \mathbf{s}_{n,k} = \mathbf{x}_n + \sum_{i=1}^{k} \widetilde{\alpha}_i\, \mathbf{g}_i(n),$$
$\mathbf{s}_{n,k}$ being the approximation to $\mathbf{s}$. Here $\mathbf{y}$ is some nonzero vector, $\langle\cdot\,,\cdot\rangle$ is an inner product such that $\langle\alpha\mathbf{a}, \beta\mathbf{b}\rangle = \bar{\alpha}\beta\,\langle\mathbf{a}, \mathbf{b}\rangle$, and $\Delta\mathbf{x}_m = \mathbf{x}_{m+1} - \mathbf{x}_m$ and $\Delta\mathbf{g}_i(m) = \mathbf{g}_i(m+1) - \mathbf{g}_i(m)$. By imposing a minimal number of reasonable additional conditions on the $\mathbf{g}_i(m)$, we show that the error $\mathbf{s}_{n,k} - \mathbf{s}$ has a full asymptotic expansion as $n\to\infty$. We also show that actual convergence acceleration takes place and we provide a complete classification of it.
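The defining equations above amount to a small linear solve followed by a linear combination. Below is a minimal NumPy sketch of how one could compute $\mathbf{s}_{n,k}$; this is not code from the paper, and the function name, the array layout, and the choice of the standard Euclidean inner product $\langle\mathbf{a},\mathbf{b}\rangle = \bar{\mathbf{a}}^{T}\mathbf{b}$ are assumptions made for illustration.

```python
import numpy as np

def s_nk(x, g, y, n, k):
    """Illustrative sketch of the vectorized extrapolation step s_{n,k}.

    x : (M, d) array    -- the vector sequence x_0, ..., x_{M-1}
    g : (k, M, d) array -- the known shape sequences g_1(m), ..., g_k(m)
    y : (d,) array      -- a fixed nonzero vector
    Requires n + k < M so that all forward differences below exist.
    """
    dx = np.diff(x, axis=0)   # Delta x_m = x_{m+1} - x_m
    dg = np.diff(g, axis=1)   # Delta g_i(m) = g_i(m+1) - g_i(m)

    # Build the k x k system  sum_i <y, Delta g_i(m)> alpha_i = <y, Delta x_m>
    # for m = n, ..., n+k-1.  np.vdot conjugates its first argument, matching
    # the convention <alpha a, beta b> = conj(alpha) beta <a, b>.
    A = np.array([[np.vdot(y, dg[i, m]) for i in range(k)]
                  for m in range(n, n + k)])
    b = np.array([np.vdot(y, dx[m]) for m in range(n, n + k)])
    alpha = np.linalg.solve(A, b)

    # s_{n,k} = x_n + sum_i alpha_i g_i(n)
    return x[n] + np.tensordot(alpha, g[:, n, :], axes=1)
```

Note that taking inner products with the single vector $\mathbf{y}$ collapses each vector equation to a scalar one, so only a $k\times k$ system has to be solved regardless of the dimension of the vectors $\mathbf{x}_m$.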
