
A Tutorial on Dual Decomposition and Lagrangian Relaxation for Inference in Natural Language Processing (1405.5208v1)

Published 23 Jan 2014 in cs.CL and cs.AI

Abstract: Dual decomposition, and more generally Lagrangian relaxation, is a classical method for combinatorial optimization; it has recently been applied to several inference problems in NLP. This tutorial gives an overview of the technique. We describe example algorithms, describe formal guarantees for the method, and describe practical issues in implementing the algorithms. While our examples are predominantly drawn from the NLP literature, the material should be of general relevance to inference problems in machine learning. A central theme of this tutorial is that Lagrangian relaxation is naturally applied in conjunction with a broad class of combinatorial algorithms, allowing inference in models that go significantly beyond previous work on Lagrangian relaxation for inference in graphical models.
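To make the idea concrete, here is a minimal sketch of dual decomposition with subgradient ascent, in the style the tutorial describes: two subproblems with toy linear scores (the scores and helper names below are illustrative assumptions, not taken from the paper) must agree on a binary assignment, and Lagrange multipliers are updated until they do.

```python
# Toy sketch of dual decomposition (illustrative, not the paper's exact setup).
# Maximize f(y) + g(z) subject to y == z, where f and g are linear score
# vectors over binary assignments y, z in {0,1}^n.
import numpy as np

def solve_sub(scores):
    # Each subproblem decomposes over coordinates: set y_i = 1 iff score_i > 0.
    return (scores > 0).astype(float)

def dual_decompose(f, g, n_iters=100, step=1.0):
    u = np.zeros_like(f)  # Lagrange multipliers for the constraint y == z
    for t in range(1, n_iters + 1):
        y = solve_sub(f + u)  # argmax_y f(y) + u . y
        z = solve_sub(g - u)  # argmax_z g(z) - u . z
        if np.array_equal(y, z):
            return y, True    # agreement: certificate of exact optimality
        u = u - (step / t) * (y - z)  # subgradient step on the dual
    return y, False           # no agreement within the iteration budget

f = np.array([1.0, -2.0, 0.5])
g = np.array([-0.5, 1.0, 2.0])
y, converged = dual_decompose(f, g)
```

On this toy instance the two subproblems reach agreement after a few iterations, matching the joint optimum of the combined scores `f + g`; in realistic NLP applications each `solve_sub` would instead be a combinatorial algorithm such as dynamic programming over a parse or tagging model.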

Authors (2)
  1. Alexander M. Rush (115 papers)
  2. Michael Collins (46 papers)
Citations (120)