Emergent Mind

Self-concordant analysis for logistic regression

(0910.4627)
Published Oct 24, 2009 in cs.LG, math.ST, and stat.TH

Abstract

Most of the non-asymptotic theoretical work in regression is carried out for the square loss, where estimators can be obtained through closed-form expressions. In this paper, we use and extend tools from the convex optimization literature, namely self-concordant functions, to provide simple extensions of theoretical results for the square loss to the logistic loss. We apply the extension techniques to logistic regression with regularization by the $\ell_2$-norm and regularization by the $\ell_1$-norm, showing that new results for binary classification through logistic regression can be easily derived from corresponding results for least-squares regression.
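As a brief illustrative computation (ours, not quoted from the paper, with notation $\varphi$ and $\sigma$ introduced here), the reason a self-concordance-style analysis applies to the logistic loss is that its third derivative is controlled by its second derivative:

% Logistic loss and sigmoid (notation introduced for this sketch)
\[
\varphi(u) = \log\bigl(1 + e^{-u}\bigr), \qquad
\sigma(u) = \frac{1}{1 + e^{-u}},
\]
% Successive derivatives of the logistic loss
\[
\varphi'(u) = \sigma(u) - 1, \qquad
\varphi''(u) = \sigma(u)\bigl(1 - \sigma(u)\bigr), \qquad
\varphi'''(u) = \sigma(u)\bigl(1 - \sigma(u)\bigr)\bigl(1 - 2\sigma(u)\bigr).
\]
% Since |1 - 2\sigma(u)| \le 1, the third derivative is bounded by the second,
% a self-concordance-type inequality (compare the classical barrier condition
% |F'''| \le 2 (F'')^{3/2}).
\[
\bigl|\varphi'''(u)\bigr| \le \varphi''(u).
\]

This kind of bound lets curvature estimates established around the optimum for the square loss be transferred, up to controlled constants, to the logistic loss.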
