Modular-proximal gradient algorithms in variable exponent Lebesgue spaces (2112.05480v2)

Published 10 Dec 2021 in math.OC, cs.NA, and math.NA

Abstract: We consider structured optimisation problems defined in terms of the sum of a smooth and convex function, and a proper, l.s.c., convex (typically non-smooth) one in reflexive variable exponent Lebesgue spaces $L_{p(\cdot)}(\Omega)$. Due to their intrinsic space-variant properties, such spaces can be naturally used as solution space and combined with space-variant functionals for the solution of ill-posed inverse problems. For this purpose, we propose and analyse two instances (primal and dual) of proximal gradient algorithms in $L_{p(\cdot)}(\Omega)$, where the proximal step, rather than depending on the natural (non-separable) $L_{p(\cdot)}(\Omega)$ norm, is defined in terms of its modular function, which, thanks to its separability, allows for the efficient computation of algorithmic iterates. Convergence in function values is proved for both algorithms, with convergence rates depending on problem/space smoothness. To show the effectiveness of the proposed modelling, some numerical tests highlighting the flexibility of the space $L_{p(\cdot)}(\Omega)$ are shown for exemplar deconvolution and mixed noise removal problems. Finally, a numerical verification on the convergence speed and computational costs of both algorithms in comparison with analogous ones defined in standard $L_{p}(\Omega)$ spaces is presented.
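To make the role of the modular concrete: because the (discretised) modular $\rho_{p(\cdot)}(u)=\sum_i |u_i|^{p_i}/p_i$ is separable, a proximal step defined with respect to it decouples into independent one-dimensional problems, one per component, unlike a proximal step defined with respect to the non-separable $L_{p(\cdot)}(\Omega)$ norm. The sketch below is only an illustration of that idea and is not the paper's exact primal or dual scheme; the problem instance, step size, and all names (`A`, `b`, `lam`, `tau`, `p`, `modular_prox_l1`, `modular_prox_gradient`) are illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's exact algorithm): a proximal-gradient-type
# iteration where the prox step is taken with respect to the separable modular
#   rho_{p(.)}(u) = sum_i |u_i|^{p_i} / p_i
# of a discretised L^{p(.)} space, so it reduces to independent 1-D problems.
# Model problem: min_u  0.5*||A u - b||^2 + lam*||u||_1 with a space-variant p_i >= 1.

import numpy as np
from scipy.optimize import minimize_scalar


def modular_prox_l1(z, p, tau, lam):
    """Componentwise prox of lam*|.|_1 with the modular penalty (1/tau)*|t - z_i|^{p_i}/p_i.

    Each scalar subproblem is solved numerically; its minimiser lies in
    [min(0, z_i), max(0, z_i)] since both terms grow outside that interval.
    """
    out = np.empty_like(z)
    for i, (zi, pi) in enumerate(zip(z, p)):
        if abs(zi) < 1e-15:          # degenerate interval: prox is 0
            out[i] = 0.0
            continue
        obj = lambda t: lam * abs(t) + abs(t - zi) ** pi / (tau * pi)
        lo, hi = min(0.0, zi), max(0.0, zi)
        out[i] = minimize_scalar(obj, bounds=(lo, hi), method="bounded").x
    return out


def modular_prox_gradient(A, b, p, lam=0.1, tau=0.5, n_iter=200):
    """Forward (gradient) step on the smooth term, modular prox step on the l1 term."""
    u = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ u - b)                     # gradient of 0.5*||A u - b||^2
        u = modular_prox_l1(u - tau * grad, p, tau, lam)
    return u


# Tiny usage example with a spatially varying exponent p(x) in [1.3, 2.0].
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60))
x_true = np.zeros(60)
x_true[[5, 20, 45]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
p = np.linspace(1.3, 2.0, 60)                        # space-variant exponent
u_hat = modular_prox_gradient(A, b, p, lam=0.05,
                              tau=1.0 / np.linalg.norm(A, 2) ** 2)
```

The point of the sketch is the inner loop of `modular_prox_l1`: with the modular in place of the norm, each component is handled by a cheap scalar minimisation, which is what makes the algorithmic iterates efficiently computable in this setting.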

Citations (4)
