Emergent Mind

Abstract

The multinomial logistic regression (MLR) model is widely used in statistics and machine learning. Stochastic gradient descent (SGD) is the most common approach for estimating the parameters of an MLR model in big-data scenarios, but SGD converges at a slow, sub-linear rate. One way to improve this rate of convergence is to use manifold optimization. Along this line, stochastic natural gradient descent (SNGD), proposed by Amari, was proven to be Fisher efficient when it converges. However, SNGD is not guaranteed to converge, and it is computationally too expensive for MLR models with a large number of parameters. Here, we propose a stochastic optimization method for MLR based on manifold-optimization concepts that (i) has per-iteration computational complexity linear in the number of parameters and (ii) provably converges. To achieve (i), we establish that the family of joint distributions for MLR is a dually flat manifold, and we exploit that structure to speed up calculations. Sánchez-López and Cerquides have recently introduced convergent stochastic natural gradient descent (CSNGD), a variant of SNGD whose convergence is guaranteed. To obtain (ii), our algorithm uses the fundamental idea from CSNGD, relying on an independent sequence to build a bounded approximation of the natural gradient. We call the resulting algorithm dual stochastic natural gradient descent (DSNGD). By generalizing a result from Sunehag et al., we prove that DSNGD converges. Furthermore, we prove that the computational complexity of DSNGD iterations is linear in the number of variables of the model.
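To make the cost comparison in the abstract concrete, here is a minimal NumPy sketch, not the authors' DSNGD: a plain SGD step for MLR alongside a naive natural-gradient (SNGD-style) step that explicitly builds and inverts the Fisher information. The explicit inversion is the cubic-cost bottleneck that the dually flat structure of the MLR manifold is used to avoid; the function names, the damping term, and the single-sample setting are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def sgd_step(W, x, y, lr):
    """Plain SGD on the MLR negative log-likelihood for one sample.

    W : (d, k) weight matrix, x : (d,) features, y : integer class label.
    """
    p = softmax(W.T @ x)        # class probabilities, shape (k,)
    p[y] -= 1.0                 # p - one_hot(y)
    g = np.outer(x, p)          # gradient, shape (d, k)
    return W - lr * g

def naive_sngd_step(W, x, y, lr, damping=1e-3):
    """Naive natural-gradient step: precondition the gradient with the
    inverse Fisher information. Building and solving with the dense
    (d*k) x (d*k) Fisher matrix costs O((d*k)^3) per iteration, which
    is the expense the paper's DSNGD is designed to avoid."""
    p = softmax(W.T @ x)
    err = p.copy()
    err[y] -= 1.0
    g = np.outer(x, err).ravel()            # flattened gradient, row-major
    # Fisher of the conditional MLR model at x, in the same flattening:
    # (x x^T) kron (diag(p) - p p^T); damped to keep it invertible.
    F = np.kron(np.outer(x, x), np.diag(p) - np.outer(p, p))
    F += damping * np.eye(F.shape[0])
    ng = np.linalg.solve(F, g)              # F^{-1} g: the natural gradient
    return W - lr * ng.reshape(W.shape)
```

Both steps update the same parameters; they differ only in whether the gradient is preconditioned by the inverse Fisher metric, which is what makes natural-gradient methods Fisher efficient but naively expensive.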
