
Limitation of capsule networks

(arXiv:1905.08744)
Published May 21, 2019 in cs.LG and stat.ML

Abstract

A recently proposed method in deep learning groups multiple neurons into capsules such that each capsule represents an object or a part of an object. Routing algorithms route the output of capsules from lower-level layers to upper-level layers. In this paper, we prove that state-of-the-art routing procedures decrease the expressivity of capsule networks. More precisely, it is shown that EM-routing and routing-by-agreement prevent capsule networks from distinguishing an input from its negative counterpart. Consequently, capsule networks can only express symmetric functions, from which it follows that they are not universal approximators. We also theoretically motivate and empirically show that this limitation negatively affects the training of deep capsule networks. We therefore present an incremental improvement to state-of-the-art routing algorithms that removes this limitation and stabilizes the training of capsule networks.
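The symmetry claim can be checked numerically with a small sketch. Below is a minimal NumPy implementation of routing-by-agreement in the style of Sabour et al.'s dynamic routing; the transformation matrices `W`, the capsule dimensions, and all variable names are illustrative assumptions rather than the paper's code. Because the prediction vectors are linear in the lower-level capsule outputs and the squashing non-linearity is an odd function, negating the input only flips the sign of the upper-level poses, leaving their lengths (and thus the class activations) unchanged.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing non-linearity: keeps the direction of s, maps its norm into [0, 1).
    # Note squash(-s) = -squash(s), i.e. it is an odd function.
    norm = np.linalg.norm(s, axis=axis, keepdims=True)
    return (norm**2 / (1.0 + norm**2)) * s / (norm + eps)

def routing_by_agreement(u_hat, n_iters=3):
    # u_hat: prediction vectors with shape (n_in, n_out, dim_out).
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))  # routing logits, initialised to zero
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over output capsules
        s = (c[..., None] * u_hat).sum(axis=0)                 # weighted sum per output capsule
        v = squash(s)                                          # output capsule poses
        b = b + (u_hat * v[None]).sum(axis=-1)                 # agreement update
    return v

rng = np.random.default_rng(0)
n_in, n_out, dim_in, dim_out = 8, 4, 6, 6
W = rng.normal(size=(n_in, n_out, dim_out, dim_in))  # transformation matrices (hypothetical)
x = rng.normal(size=(n_in, dim_in))                  # lower-level capsule outputs

# Prediction vectors are linear in x, so negating x negates them.
u_hat_pos = np.einsum('iojk,ik->ioj', W, x)
u_hat_neg = np.einsum('iojk,ik->ioj', W, -x)

v_pos = routing_by_agreement(u_hat_pos)
v_neg = routing_by_agreement(u_hat_neg)

# The poses flip sign, but their lengths coincide, so capsule activations
# (and any classification read off from them) cannot distinguish x from -x.
print(np.allclose(v_neg, -v_pos))                    # True
print(np.allclose(np.linalg.norm(v_pos, axis=-1),
                  np.linalg.norm(v_neg, axis=-1)))   # True
```

Running this sketch prints `True` twice: the routing logits evolve identically for both inputs, the output poses differ only in sign, and the capsule lengths agree exactly, which is the symmetry the paper formalizes for routing-by-agreement and EM-routing.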
