
The Mutual Information in Random Linear Estimation Beyond i.i.d. Matrices (1802.08963v2)

Published 25 Feb 2018 in cs.IT, math-ph, math.IT, and math.MP

Abstract: There has been definite recent progress in rigorously proving the variational single-letter formulas predicted by the heuristic replica method for various estimation problems. In particular, the replica formula for the mutual information in noisy linear estimation with random i.i.d. matrices, a problem with applications ranging from compressed sensing to statistics, has been proven rigorously. In this contribution we go beyond the restrictive i.i.d. matrix assumption and consider the formula obtained with the replica method by Takeda, Uda and Kabashima, and later by Tulino, Verdu, Caire and Shamai. Using the recently introduced adaptive interpolation method together with random matrix theory, we prove this formula for a large, relevant sub-class of rotationally invariant matrices.
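As a brief sketch of the setting (the abstract does not state the model explicitly; the form below is the standard noisy linear estimation setup used in this literature, so take the notation as an assumption):

```latex
% Standard Gaussian-noise linear estimation model (assumed notation):
% an unknown signal x* is observed through a random measurement matrix Phi,
% here rotationally invariant rather than i.i.d.
\[
  \mathbf{y} \;=\; \frac{1}{\sqrt{n}}\,\boldsymbol{\Phi}\,\mathbf{x}^{*}
  \;+\; \sqrt{\Delta}\,\mathbf{z},
  \qquad \mathbf{z} \sim \mathcal{N}(0,\mathrm{I}_m),
\]
% The replica prediction proven in the paper is that the mutual information
% per variable, (1/n) I(X; Y), converges as n -> infinity to the extremum of
% a finite-dimensional ("single-letter") variational potential.
```

The point of such a single-letter formula is that a high-dimensional quantity, the mutual information of an $n$-dimensional inference problem, is characterized by an optimization over only a few scalar order parameters.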

Authors (4)
  1. Jean Barbier (60 papers)
  2. Nicolas Macris (66 papers)
  3. Antoine Maillard (24 papers)
  4. Florent Krzakala (179 papers)
Citations (57)