Error Bounds for Generalized Group Sparsity

(2008.04734)
Published Aug 8, 2020 in stat.ML, cs.LG, and stat.ME

Abstract

In high-dimensional statistical inference, sparsity regularizations have shown advantages in consistency and convergence rates for coefficient estimation. We consider a generalized version of the Sparse-Group Lasso which captures both element-wise sparsity and group-wise sparsity simultaneously. We state and prove one universal theorem that yields consistency and convergence-rate results for different forms of double sparsity regularization. The universality of the results lies in their generalization of convergence rates for single-regularization cases such as the LASSO and group LASSO, as well as double-regularization cases such as the sparse-group LASSO. Our analysis identifies a generalized $\epsilon$-norm, which provides a dual formulation for our double sparsity regularization.
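For context, the standard sparse-group Lasso that this work generalizes combines an element-wise $\ell_1$ penalty with a group-wise $\ell_2$ penalty; the paper's generalized penalty and its $\epsilon$-norm dual are defined in the paper itself. A minimal sketch of the standard estimator (Simon et al., 2013) is

$$\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \sum_{g=1}^{G} \sqrt{p_g}\, \|\beta^{(g)}\|_2,$$

where $\beta^{(g)}$ denotes the sub-vector of coefficients in group $g$ and $p_g$ is the group size. Setting $\lambda_2 = 0$ recovers the LASSO, while $\lambda_1 = 0$ recovers the group LASSO, which is why error bounds for the double-penalty case can subsume both single-regularization rates.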
