Residual and Attentional Architectures for Vector-Symbols

arXiv:2207.08953
Published Jul 18, 2022 in cs.LG and cs.NE

Abstract

Vector-symbolic architectures (VSAs) provide highly flexible methods for computing that carry unique advantages. Concepts in VSAs are represented by 'symbols': long vectors of values that exploit properties of high-dimensional spaces to represent and manipulate information. In this work, we combine the efficiency of the operations provided by the Fourier Holographic Reduced Representation (FHRR) VSA with the power of deep networks to construct novel VSA-based residual and attention-based neural network architectures. Using an attentional FHRR architecture, we demonstrate that the same network architecture can address problems from different domains (image classification and molecular toxicity prediction) by encoding different information into the network's inputs, similar to the Perceiver model. This demonstrates a novel application of VSAs and a potential path to implementing state-of-the-art neural models on neuromorphic hardware.
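To make the FHRR operations the abstract refers to concrete, here is a minimal NumPy sketch of the standard FHRR primitives (not code from the paper): symbols are vectors of unit-magnitude complex phasors, binding is elementwise complex multiplication, unbinding multiplies by the conjugate, and similarity is a normalized inner product. The dimensionality and function names are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4096  # illustrative symbol dimensionality

def random_symbol(d=D):
    """FHRR symbol: unit-magnitude complex phasors with i.i.d. random phases."""
    phases = rng.uniform(-np.pi, np.pi, size=d)
    return np.exp(1j * phases)

def bind(a, b):
    """Binding is elementwise complex multiplication (phases add)."""
    return a * b

def unbind(c, a):
    """Unbinding multiplies by the complex conjugate (phases subtract)."""
    return c * np.conj(a)

def similarity(a, b):
    """Normalized real inner product: ~1 for matching symbols,
    ~0 for independent random symbols in high dimensions."""
    return np.real(np.vdot(a, b)) / len(a)

role, filler = random_symbol(), random_symbol()
bound = bind(role, filler)
recovered = unbind(bound, role)
```

Because every component has unit magnitude, unbinding with the same role recovers the filler exactly, while the bound vector itself is nearly orthogonal to both constituents; this quasi-orthogonality of random high-dimensional vectors is the property the abstract alludes to.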
