An error bound for Lasso and Group Lasso in high dimensions

(1912.11398)
Published Dec 21, 2019 in stat.ML, cs.LG, math.ST, and stat.TH

Abstract

We leverage recent advances in high-dimensional statistics to derive new $\ell_2$ estimation upper bounds for Lasso and Group Lasso in high dimensions. For Lasso, our bounds scale as $(k^*/n) \log(p/k^*)$, where $n \times p$ is the size of the design matrix and $k^*$ the dimension of the ground truth $\boldsymbol{\beta}^*$, and match the optimal minimax rate. For Group Lasso, our bounds scale as $(s^*/n) \log\left( G / s^* \right) + m^* / n$, where $G$ is the total number of groups and $m^*$ the number of coefficients in the $s^*$ groups which contain $\boldsymbol{\beta}^*$, and improve over existing results. We additionally show that when the signal is strongly group-sparse, Group Lasso is superior to Lasso.
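
As an illustration of the Lasso rate above, the following is a minimal sketch (not taken from the paper): it generates a synthetic $k^*$-sparse signal under a Gaussian design, fits Lasso with scikit-learn, and prints the squared $\ell_2$ estimation error alongside the $(k^*/n)\log(p/k^*)$ quantity. The noise level, design distribution, and the regularization choice $\lambda \approx \sigma\sqrt{\log(p)/n}$ are assumptions for demonstration only, not the paper's exact setup or tuning.

```python
# Minimal sketch (assumptions, not the paper's experiment): k*-sparse signal,
# Gaussian design, Lasso fit, and comparison to the (k*/n) log(p/k*) rate.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k_star, sigma = 200, 1000, 10, 0.5

# k*-sparse ground truth beta*
beta_star = np.zeros(p)
support = rng.choice(p, size=k_star, replace=False)
beta_star[support] = rng.choice([-1.0, 1.0], size=k_star)

# Gaussian design and noisy observations y = X beta* + eps
X = rng.standard_normal((n, p))
y = X @ beta_star + sigma * rng.standard_normal(n)

# Theory-inspired regularization level (an assumption): lambda ~ sigma * sqrt(log(p)/n)
lam = sigma * np.sqrt(np.log(p) / n)
model = Lasso(alpha=lam, max_iter=50_000).fit(X, y)

sq_l2_error = np.sum((model.coef_ - beta_star) ** 2)
rate = (k_star / n) * np.log(p / k_star)
print(f"squared L2 error:        {sq_l2_error:.4f}")
print(f"(k*/n) log(p/k*) rate:   {rate:.4f}")
```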
