
Anderson Acceleration for Nonconvex ADMM Based on Douglas-Rachford Splitting (2006.14539v2)

Published 25 Jun 2020 in math.OC and cs.GR

Abstract: The alternating direction method of multipliers (ADMM) is widely used in computer graphics for solving optimization problems that can be nonsmooth and nonconvex. It converges quickly to an approximate solution, but can take a long time to converge to a high-accuracy solution. Previously, Anderson acceleration has been applied to ADMM by treating it as a fixed-point iteration for the concatenation of the dual variables and a subset of the primal variables. In this paper, we note that the equivalence between ADMM and Douglas-Rachford splitting reveals that ADMM is in fact a fixed-point iteration in a lower-dimensional space. By applying Anderson acceleration to this lower-dimensional fixed-point iteration, we obtain a more effective approach for accelerating ADMM. We analyze the convergence of the proposed acceleration method on nonconvex problems, and verify its effectiveness on a variety of computer graphics problems, including geometry processing and physical simulation.
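To illustrate the core idea, here is a minimal sketch of (type-II) Anderson acceleration for a generic fixed-point map x = g(x), the kind of iteration the paper accelerates. This is a generic textbook formulation, not the paper's method: the paper applies acceleration to the Douglas-Rachford fixed-point map underlying ADMM and adds safeguarding for nonconvex problems, neither of which is shown here.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, max_iter=200, tol=1e-10):
    """Type-II Anderson acceleration for the fixed-point iteration x = g(x).

    Keeps a window of the last m residuals, solves a small least-squares
    problem over their differences, and extrapolates the next iterate.
    Generic sketch only; no safeguarding against divergence is included.
    """
    x = np.asarray(x0, dtype=float)
    G_hist, F_hist = [], []  # histories of g(x) values and residuals
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x  # residual of the fixed-point map
        if np.linalg.norm(f) < tol:
            return gx
        G_hist.append(gx)
        F_hist.append(f)
        if len(F_hist) > m + 1:  # keep at most m differences
            G_hist.pop(0)
            F_hist.pop(0)
        if len(F_hist) == 1:
            x = gx  # plain fixed-point step until history builds up
            continue
        # Least-squares combination of residual differences
        dF = np.column_stack([F_hist[i + 1] - F_hist[i]
                              for i in range(len(F_hist) - 1)])
        dG = np.column_stack([G_hist[i + 1] - G_hist[i]
                              for i in range(len(G_hist) - 1)])
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
        x = gx - dG @ gamma  # accelerated update
    return x
```

For example, applied to g(x) = cos(x), this reaches the fixed point near 0.7391 in far fewer iterations than plain Picard iteration.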

Authors (5)
  1. Wenqing Ouyang (10 papers)
  2. Yue Peng (8 papers)
  3. Yuxin Yao (12 papers)
  4. Juyong Zhang (85 papers)
  5. Bailin Deng (34 papers)
Citations (29)
