Analyzing the Weighted Nuclear Norm Minimization and Nuclear Norm Minimization based on Group Sparse Representation (1702.04463v5)

Published 15 Feb 2017 in cs.CV

Abstract: Rank minimization methods have attracted considerable interest in areas such as computer vision and machine learning. The most representative approach is nuclear norm minimization (NNM), which can recover the matrix rank exactly under certain restrictive theoretical conditions. However, in many real applications NNM fails to approximate the matrix rank accurately, since it tends to over-shrink the rank components. To rectify this weakness, recent work has shown that weighted nuclear norm minimization (WNNM), which heuristically sets the weights inversely proportional to the singular values, achieves a better rank approximation than NNM. However, a sound mathematical explanation of why WNNM is more feasible than NNM has been lacking. In this paper, we propose a scheme to analyze WNNM and NNM from the perspective of group sparse representation. Specifically, we design an adaptive dictionary to bridge the gap between the group sparse representation and rank minimization models. Based on this scheme, we provide a mathematical derivation explaining why WNNM is more feasible than NNM. Moreover, because the weights are set heuristically, WNNM can produce errors in the SVD step, so we present an adaptive weight-setting scheme to avoid this error. We then apply the proposed scheme to two low-level vision tasks: image denoising and image inpainting. Experimental results demonstrate that WNNM is more feasible than NNM and that the proposed scheme outperforms many current state-of-the-art methods.
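
To make the NNM/WNNM contrast concrete, below is a minimal sketch (not the authors' implementation) of the proximal steps behind the two models: NNM applies uniform soft-thresholding to the singular values, while WNNM uses weights set inversely proportional to the singular values, so dominant singular values are shrunk less than small (noise-dominated) ones. The weight rule c / (s + eps), the parameter values, and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nnm_proximal(Y, lam):
    """NNM step: uniform soft-thresholding of the singular values of Y
    (standard singular value thresholding)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

def wnnm_proximal(Y, c=1.0, eps=1e-8):
    """WNNM step: weights set inversely proportional to the singular values,
    so large singular values (dominant structure) are shrunk less than
    small ones. With this weight choice the weights are non-descending,
    and weighted soft-thresholding gives the closed-form solution."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    w = c / (s + eps)                 # heuristic inverse-singular-value weights (assumed form)
    s_shrunk = np.maximum(s - w, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

if __name__ == "__main__":
    # Toy example: denoise a noisy low-rank matrix with both operators.
    rng = np.random.default_rng(0)
    low_rank = rng.standard_normal((64, 5)) @ rng.standard_normal((5, 64))
    noisy = low_rank + 0.1 * rng.standard_normal((64, 64))
    x_nnm = nnm_proximal(noisy, lam=1.0)
    x_wnnm = wnnm_proximal(noisy, c=1.0)
```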

Citations (12)