Neural Networks with Complex-Valued Weights Have No Spurious Local Minima

arXiv:2103.07287
Published Jan 31, 2021 in cs.LG and stat.ML

Abstract

We study the benefits of complex-valued weights for neural networks. We prove that shallow complex neural networks with quadratic activations have no spurious local minima. In contrast, shallow real neural networks with quadratic activations have infinitely many spurious local minima under the same conditions. In addition, we provide specific examples to demonstrate that complex-valued weights turn poor local minima into saddle points. The activation function CReLU is also discussed to illustrate the superiority of analytic activations in complex-valued neural networks.
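To make the setting concrete, here is a minimal sketch of the kind of model the abstract describes: a shallow network with quadratic activation z ↦ z² and complex-valued first-layer weights, trained against targets produced by a planted teacher network. All dimensions, the teacher-student setup, and the squared-modulus loss are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative, not from the paper)
d, k, n = 5, 3, 20  # input dim, hidden width, number of samples

# Complex-valued hidden-layer weights
W = rng.standard_normal((k, d)) + 1j * rng.standard_normal((k, d))

def forward(W, X):
    """Shallow net with quadratic activation: f(x) = sum_j (w_j . x)^2."""
    return np.sum((W @ X.T) ** 2, axis=0)

# Teacher-student setup: targets come from the planted weights W
X = rng.standard_normal((n, d))
y_true = forward(W, X)

def loss(W_hat):
    """Mean squared modulus of the residual (complex-valued outputs)."""
    r = forward(W_hat, X) - y_true
    return np.mean(np.abs(r) ** 2)

# The planted weights achieve zero loss, i.e. a global minimum;
# the paper's claim concerns the absence of spurious LOCAL minima
# elsewhere on this loss surface when the weights are complex.
print(loss(W))  # -> 0.0
```

The sketch only sets up the loss landscape the result is about; it does not prove anything. The contrast in the abstract is that for real-valued `W` the same landscape can contain infinitely many spurious local minima, while allowing complex entries turns those points into saddle points.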
