Learning nonlinear dynamical systems from a single trajectory

(2004.14681)
Published Apr 30, 2020 in cs.LG, math.OC, math.ST, stat.ML, and stat.TH

Abstract

We introduce algorithms for learning nonlinear dynamical systems of the form $x_{t+1}=\sigma(\Theta_{\star}x_t)+\varepsilon_t$, where $\Theta_{\star}$ is a weight matrix, $\sigma$ is a nonlinear link function, and $\varepsilon_t$ is a mean-zero noise process. We give an algorithm that recovers the weight matrix $\Theta_{\star}$ from a single trajectory with optimal sample complexity and linear running time. The algorithm succeeds under weaker statistical assumptions than in previous work, and in particular i) does not require a bound on the spectral norm of the weight matrix $\Theta_{\star}$ (rather, it depends on a generalization of the spectral radius) and ii) enjoys guarantees for non-strictly-increasing link functions such as the ReLU. Our analysis has two key components: i) we give a general recipe whereby global stability for nonlinear dynamical systems can be used to certify that the state-vector covariance is well-conditioned, and ii) using these tools, we extend well-known algorithms for efficiently learning generalized linear models to the dependent setting.
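
To make the setup concrete, here is a minimal NumPy sketch, not the paper's algorithm, that simulates the model $x_{t+1}=\sigma(\Theta_{\star}x_t)+\varepsilon_t$ with a ReLU link and then runs a GLMtron-style update, i.e. the kind of generalized-linear-model learner the abstract says is extended to the dependent, single-trajectory setting. The dimension, horizon, step size, noise level, and stability rescaling below are all illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the authors' exact algorithm):
# simulate x_{t+1} = relu(Theta_* x_t) + eps_t from a single trajectory,
# then estimate Theta_* with GLMtron-style updates that never use the
# derivative of the link (so a non-strictly-increasing ReLU is fine).

import numpy as np

rng = np.random.default_rng(0)
d, T, lr, noise = 5, 20000, 0.05, 0.1   # illustrative choices

relu = lambda z: np.maximum(z, 0.0)

# Ground-truth weights, rescaled so the system is stable (spectral radius 0.5).
Theta_star = rng.standard_normal((d, d))
Theta_star *= 0.5 / np.max(np.abs(np.linalg.eigvals(Theta_star)))

# Roll out one trajectory with mean-zero Gaussian noise.
X = np.zeros((T + 1, d))
for t in range(T):
    X[t + 1] = relu(Theta_star @ X[t]) + noise * rng.standard_normal(d)

# GLMtron-style updates: residual through the link times the current state.
Theta_hat = np.zeros((d, d))
for t in range(T):
    residual = relu(Theta_hat @ X[t]) - X[t + 1]
    Theta_hat -= lr * np.outer(residual, X[t])

err = np.linalg.norm(Theta_hat - Theta_star) / np.linalg.norm(Theta_star)
print("relative parameter error:", err)
```

The single pass over the trajectory is linear in its length, matching the running-time claim in spirit; how small the final error is depends on the stability and covariance conditions the paper analyzes, which this sketch does not certify.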
