Bilinear Generalized Vector Approximate Message Passing (2009.06854v1)
Abstract: We introduce the bilinear generalized vector approximate message passing (BiG-VAMP) algorithm, which jointly recovers two matrices U and V from their noisy product through a probabilistic observation model. BiG-VAMP provides computationally efficient approximate implementations of both max-sum and sum-product loopy belief propagation (BP). We show how the proposed BiG-VAMP algorithm recovers different types of structured matrices and overcomes the fundamental limitations of other state-of-the-art approaches to the bilinear recovery problem, such as BiG-AMP, BAd-VAMP, and LowRAMP. In essence, BiG-VAMP applies to a broader class of practical applications that involve a general form of structured matrices. For theoretical performance prediction, we also conduct a state evolution (SE) analysis of the proposed algorithm and show its consistency with the asymptotic empirical mean-squared error (MSE). Numerical results on various applications, such as matrix factorization, dictionary learning, and matrix completion, unambiguously demonstrate the effectiveness of the proposed BiG-VAMP algorithm and its superiority over state-of-the-art algorithms. Using the developed SE framework, we also examine (as one example) the phase transition diagrams of the matrix completion problem, thereby unveiling a low-detectability region corresponding to the low signal-to-noise ratio (SNR) regime.
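To make the observation model concrete, below is a minimal NumPy sketch of the bilinear setup the abstract describes: two low-rank factors U and V observed through their noisy product, with an optional entry-wise mask for the matrix-completion variant. The dimensions, SNR, observation fraction, and the truncated-SVD recovery step are all illustrative assumptions, not the paper's method; BiG-VAMP replaces the one-shot SVD with iterative message passing between the two factors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem dimensions (illustrative values, not taken from the paper).
n, m, r = 200, 150, 5       # U is n x r, V is m x r, rank r
snr_db = 20.0               # per-entry signal-to-noise ratio

# Ground-truth factors with i.i.d. Gaussian entries (one common prior choice;
# the paper's point is that BiG-VAMP handles more general structured priors).
U = rng.standard_normal((n, r))
V = rng.standard_normal((m, r))
Z = U @ V.T                 # noiseless bilinear product

# Additive white Gaussian noise scaled to the target SNR.
signal_power = np.mean(Z**2)
noise_var = signal_power / 10**(snr_db / 10)
Y = Z + np.sqrt(noise_var) * rng.standard_normal((n, m))

# Matrix-completion variant: only a random fraction of entries is observed.
obs_frac = 0.3
mask = rng.random((n, m)) < obs_frac
Y_obs = np.where(mask, Y, 0.0)

# Stand-in baseline for the recovery step: rank-r truncated SVD of the
# zero-filled, rescaled observations. This is NOT BiG-VAMP, only a simple
# reference point for the per-entry MSE that the SE analysis tracks.
Us, s, Vts = np.linalg.svd(Y_obs / obs_frac, full_matrices=False)
Z_hat = (Us[:, :r] * s[:r]) @ Vts[:r, :]

mse = np.mean((Z_hat - Z)**2)
print(f"per-entry MSE of rank-{r} SVD baseline: {mse:.4f}")
```

Running this at lower SNR or smaller observation fractions illustrates the regime the abstract's phase-transition discussion is about: below a certain SNR, no estimator can reliably detect the low-rank component.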