
Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits (2004.06231v1)

Published 13 Apr 2020 in cs.LG and stat.ML

Abstract: Probabilistic circuits (PCs) are a promising avenue for probabilistic modeling, as they permit a wide range of exact and efficient inference routines. Recent "deep-learning-style" implementations of PCs strive for better scalability, but are still difficult to train on real-world data due to their sparsely connected computational graphs. In this paper, we propose Einsum Networks (EiNets), a novel implementation design for PCs that improves on prior art in several regards. At their core, EiNets combine a large number of arithmetic operations in a single monolithic einsum operation, leading to speedups and memory savings of up to two orders of magnitude compared to previous implementations. As an algorithmic contribution, we show that the implementation of Expectation-Maximization (EM) can be simplified for PCs by leveraging automatic differentiation. Furthermore, we demonstrate that EiNets scale well to datasets which were previously out of reach, such as SVHN and CelebA, and that they can be used as faithful generative image models.
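
To illustrate the core vectorization idea, here is a minimal sketch (not the authors' implementation) of how a single einsum layer can evaluate many product and sum nodes at once. All shapes and names are hypothetical, and a real implementation would operate in log-space for numerical stability:

```python
import torch

# Hypothetical shapes: B batch, K components per node, P nodes in the layer.
B, K, P = 32, 8, 16
left = torch.rand(B, K, P)               # densities from left child partition
right = torch.rand(B, K, P)              # densities from right child partition
W = torch.rand(K, K, K, P)               # one weight per (i, j, out, node)
W = W / W.sum(dim=(0, 1), keepdim=True)  # normalize over child pairs (i, j)

# One monolithic einsum fuses all pairwise products (outer products over i, j)
# and all weighted sums, for every output node and batch element at once.
out = torch.einsum('bip,bjp,ijop->bop', left, right, W)  # shape (B, K, P)
```

Fusing the product and sum layers into one einsum call is what lets EiNets avoid the sparsely connected graphs that slow down earlier PC implementations.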

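The EM simplification can be sketched similarly: for normalized sum weights w, the expected sufficient statistics of EM equal w times the gradient of the log-likelihood with respect to w, so a single backward pass yields the E-step. A hedged, self-contained toy example (hypothetical shapes; linear domain for brevity):

```python
import torch

# Toy sum node: p(x) = sum_k w_k * c_k(x), with normalized weights w.
torch.manual_seed(0)
K, B = 4, 32
c = torch.rand(B, K)                                      # child densities
w = torch.softmax(torch.randn(K), dim=0).requires_grad_(True)

ll = (c @ w).log().sum()     # batch log-likelihood
ll.backward()                # one reverse-mode pass through the circuit

with torch.no_grad():
    stats = w * w.grad       # expected sufficient statistics (E-step)
    w_em = stats / stats.sum()  # M-step: renormalize to get the EM update
```

Since automatic differentiation already computes these gradients, EM for PCs reduces to a forward pass, a backward pass, and a renormalization.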
Authors (9)
  1. Robert Peharz (27 papers)
  2. Steven Lang (3 papers)
  3. Antonio Vergari (46 papers)
  4. Karl Stelzner (8 papers)
  5. Alejandro Molina (20 papers)
  6. Martin Trapp (25 papers)
  7. Guy Van den Broeck (104 papers)
  8. Kristian Kersting (205 papers)
  9. Zoubin Ghahramani (108 papers)
Citations (111)
