
Riemannian Residual Neural Networks (2310.10013v1)

Published 16 Oct 2023 in stat.ML and cs.LG

Abstract: Recent methods in geometric deep learning have introduced various neural networks to operate over data that lie on Riemannian manifolds. Such networks are often necessary to learn well over graphs with a hierarchical structure or to learn over manifold-valued data encountered in the natural sciences. These networks are often inspired by and directly generalize standard Euclidean neural networks. However, extending Euclidean networks is difficult and has only been done for a select few manifolds. In this work, we examine the residual neural network (ResNet) and show how to extend this construction to general Riemannian manifolds in a geometrically principled manner. Originally introduced to help solve the vanishing gradient problem, ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks. We find that our Riemannian ResNets mirror these desirable properties: when compared to existing manifold neural networks designed to learn over hyperbolic space and the manifold of symmetric positive definite matrices, we outperform both kinds of networks in terms of relevant testing metrics and training dynamics.
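To make the abstract's central idea concrete: a Euclidean ResNet updates via x_{l+1} = x_l + f(x_l), and the natural Riemannian generalization replaces vector addition with the exponential map, x_{l+1} = exp_{x_l}(v_l), where v_l is a tangent vector at x_l. The sketch below illustrates one such residual block on the hyperboloid (Lorentz) model of hyperbolic space. It is a minimal, hypothetical illustration, not the authors' exact parameterization: the module names and the choice of a small Euclidean MLP (projected onto the tangent space) for the tangent-vector map are assumptions made here for clarity.

```python
import torch
import torch.nn as nn

def lorentz_inner(u, v):
    # Minkowski inner product <u, v>_L = -u_0 v_0 + sum_{i>0} u_i v_i
    prod = u * v
    return -prod[..., :1] + prod[..., 1:].sum(dim=-1, keepdim=True)

def project_to_tangent(x, u):
    # Orthogonal projection of an ambient vector u onto the tangent
    # space T_x H^n, using <x, x>_L = -1 on the hyperboloid.
    return u + lorentz_inner(x, u) * x

def exp_map(x, v, eps=1e-8):
    # Riemannian exponential map on the hyperboloid model:
    # exp_x(v) = cosh(|v|_L) x + sinh(|v|_L) v / |v|_L
    vnorm = torch.clamp(lorentz_inner(v, v), min=0.0).sqrt().clamp_min(eps)
    return torch.cosh(vnorm) * x + torch.sinh(vnorm) * v / vnorm

class HyperbolicResidualBlock(nn.Module):
    # One residual step x -> exp_x(f(x)). Here f is an assumed small
    # Euclidean MLP whose output is projected onto T_x H^n so that the
    # update stays on the manifold, mirroring how a Euclidean residual
    # x + f(x) stays in R^n.
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(dim + 1, dim + 1), nn.ReLU(),
                                 nn.Linear(dim + 1, dim + 1))

    def forward(self, x):
        v = project_to_tangent(x, self.mlp(x))
        return exp_map(x, v)

# Usage: lift Euclidean points onto the hyperboloid H^8 and apply the block.
y = torch.randn(4, 8)
x = torch.cat([(1 + y.pow(2).sum(-1, keepdim=True)).sqrt(), y], dim=-1)
out = HyperbolicResidualBlock(dim=8)(x)  # remains on the hyperboloid
```

Projecting the MLP output onto the tangent space before applying exp_x is what keeps the update geometrically valid: for any tangent v, exp_x(v) satisfies <exp_x(v), exp_x(v)>_L = -1, so no post-hoc retraction is needed.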

Authors (7)
  1. Isay Katsman (12 papers)
  2. Eric Ming Chen (3 papers)
  3. Sidhanth Holalkere (3 papers)
  4. Anna Asch (4 papers)
  5. Aaron Lou (13 papers)
  6. Ser-Nam Lim (116 papers)
  7. Christopher De Sa (77 papers)
Citations (6)
