PADDLE: Proximal Algorithm for Dual Dictionaries LEarning (1011.3728v1)

Published 16 Nov 2010 in cs.LG, cs.IT, math.IT, and stat.ML

Abstract: Recently, considerable research efforts have been devoted to the design of methods to learn overcomplete dictionaries for sparse coding from data. However, learned dictionaries require the solution of an optimization problem for coding new data. In order to overcome this drawback, we propose an algorithm aimed at learning both a dictionary and its dual: a linear mapping directly performing the coding. By leveraging proximal methods, our algorithm jointly minimizes the reconstruction error of the dictionary and the coding error of its dual; the sparsity of the representation is induced by an $\ell_1$-based penalty on its coefficients. The results obtained on synthetic data and real images show that the algorithm is capable of recovering the expected dictionaries. Furthermore, on a benchmark dataset, we show that the image features obtained from the dual matrix yield state-of-the-art classification performance while being much less computationally intensive.
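
The abstract describes a joint objective with a reconstruction term for the dictionary, a coding term for its dual, and an $\ell_1$ penalty on the sparse codes, optimized with proximal steps. The following is a minimal NumPy sketch of that kind of alternating proximal/projected-gradient scheme; the exact objective form, constraints, step sizes, and names (`paddle_sketch`, `soft_threshold`) are illustrative assumptions based on the abstract, not the algorithm as specified in the paper.

```python
import numpy as np

def soft_threshold(V, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(V) * np.maximum(np.abs(V) - t, 0.0)

def project_columns(M):
    """Project each column of M onto the unit Euclidean ball."""
    norms = np.maximum(np.linalg.norm(M, axis=0), 1.0)
    return M / norms

def paddle_sketch(X, k, tau=0.1, eta=1.0, n_iter=100, lr=1e-2, rng=None):
    """Alternating sketch for jointly learning a dictionary D and its dual C.

    Assumed objective (illustrative, inferred from the abstract):
        ||X - D U||_F^2 + eta * ||U - C X||_F^2 + tau * ||U||_1
    with unit-norm-ball constraints on the columns of D and the rows of C.
    X is (features x samples), D is (features x k), C is (k x features).
    """
    rng = np.random.default_rng(rng)
    d, n = X.shape
    D = project_columns(rng.standard_normal((d, k)))
    C = project_columns(rng.standard_normal((d, k))).T  # rows in unit ball
    U = C @ X

    for _ in range(n_iter):
        # Proximal (ISTA-style) step on the sparse codes U.
        grad_U = -2 * D.T @ (X - D @ U) + 2 * eta * (U - C @ X)
        U = soft_threshold(U - lr * grad_U, lr * tau)

        # Projected gradient step on the dictionary D.
        grad_D = -2 * (X - D @ U) @ U.T
        D = project_columns(D - lr * grad_D)

        # Projected gradient step on the dual map C.
        grad_C = -2 * eta * (U - C @ X) @ X.T
        C = project_columns((C - lr * grad_C).T).T  # constrain rows of C

    return D, C, U
```

Given data `X`, a call such as `D, C, U = paddle_sketch(X, k=64)` would return the dictionary, its dual, and the sparse codes; features for new data can then be computed directly as `C @ X_new`, which reflects the abstract's point that the dual mapping avoids solving an optimization problem at coding time.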

Authors (4)
  1. Curzio Basso (1 paper)
  2. Matteo Santoro (5 papers)
  3. Alessandro Verri (9 papers)
  4. Silvia Villa (43 papers)
Citations (21)
