Polynomial approximation of derivatives by the constrained mock-Chebyshev least squares operator (2209.09822v1)

Published 20 Sep 2022 in math.NA and cs.NA

Abstract: The constrained mock-Chebyshev least squares operator is a linear approximation operator based on an equispaced grid of points. Like other polynomial or rational approximation methods, it was recently introduced to counteract the Runge phenomenon that occurs when interpolating with polynomials on large sets of equally spaced points. The idea is to improve the mock-Chebyshev subset interpolation, in which the function $f$ is interpolated only on a proper subset of the uniform grid, formed by nodes that mimic the behavior of Chebyshev--Lobatto nodes. In the mock-Chebyshev subset interpolation all remaining nodes are discarded, whereas in the constrained mock-Chebyshev least squares interpolation they are used in a simultaneous regression, with the aim of further improving the accuracy of the approximation provided by the mock-Chebyshev subset interpolation. The goal of this paper is twofold. We discuss some theoretical aspects of the constrained mock-Chebyshev least squares operator and present new results; in particular, we introduce explicit representations of the error and its derivatives. Moreover, for a sufficiently smooth function $f$ on $[-1,1]$, we present a method for approximating the successive derivatives of $f$ at a point $x\in [-1,1]$, based on the constrained mock-Chebyshev least squares operator, and we provide estimates for these approximations. Numerical tests demonstrate the effectiveness of the proposed method.
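
To make the construction in the abstract concrete, the following is a minimal sketch (not the authors' code) of the constrained mock-Chebyshev least squares idea: pick the equispaced nodes closest to the Chebyshev–Lobatto nodes, interpolate $f$ there exactly, and fit the remaining nodes in the least squares sense via the KKT system of an equality-constrained least squares problem. The subset-size heuristic `m ≈ π√n/2`, the Chebyshev basis, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as Ch

def mock_chebyshev_subset(n):
    """Equispaced grid on [-1, 1] and indices of the nodes closest to the
    Chebyshev-Lobatto nodes (subset degree m ~ pi*sqrt(n)/2, an assumed heuristic)."""
    x_uni = np.linspace(-1.0, 1.0, n + 1)
    m = int(np.floor(np.pi * np.sqrt(n) / 2))
    x_cl = np.cos(np.pi * np.arange(m + 1) / m)  # Chebyshev-Lobatto nodes
    idx = np.unique(np.abs(x_uni[None, :] - x_cl[:, None]).argmin(axis=1))
    return x_uni, idx

def constrained_mock_cheb_ls(f, n, r):
    """Degree-r polynomial (Chebyshev coefficients) that interpolates f on the
    mock-Chebyshev subset and fits the remaining equispaced nodes in the
    least squares sense:  min ||A c - b||^2  subject to  C c = d."""
    x_uni, idx = mock_chebyshev_subset(n)
    mask = np.zeros(n + 1, dtype=bool)
    mask[idx] = True
    x_c, x_r = x_uni[mask], x_uni[~mask]          # constraint / regression nodes
    C = Ch.chebvander(x_c, r)                     # interpolation constraints
    A = Ch.chebvander(x_r, r)                     # least squares conditions
    d, b = f(x_c), f(x_r)
    k = C.shape[0]
    KKT = np.block([[2 * A.T @ A, C.T],
                    [C, np.zeros((k, k))]])       # KKT system of the constrained LS problem
    rhs = np.concatenate([2 * A.T @ b, d])
    return np.linalg.solve(KKT, rhs)[: r + 1]     # drop the Lagrange multipliers

if __name__ == "__main__":
    # Runge's function, the classical test case for equispaced interpolation.
    runge = lambda x: 1.0 / (1.0 + 25.0 * x**2)
    c = constrained_mock_cheb_ls(runge, n=100, r=30)
    # Derivatives of f are approximated by differentiating the fitted polynomial,
    # in the spirit of the derivative approximation discussed in the paper.
    print(Ch.chebval(0.5, Ch.chebder(c)))         # approximation of f'(0.5)
```

In this sketch the derivative approximation is simply the derivative of the fitted polynomial evaluated at the target point; the paper additionally provides explicit error representations and estimates for such approximations, which the sketch does not reproduce.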

Citations (1)
