Beyond Nyströmformer -- Approximation of self-attention by Spectral Shifting

(2103.05638)
Published Mar 9, 2021 in cs.LG and cs.CL

Abstract

The Transformer is a powerful tool for many natural language tasks. It is based on self-attention, a mechanism that encodes, for each token, its dependence on all other tokens, but computing self-attention is a bottleneck due to its quadratic time complexity. Various approaches reduce this cost, and matrix approximation is one of them. In Nyströmformer, the authors use a Nyström-based method to approximate the softmax attention matrix. The Nyström method produces a fast approximation to any large-scale symmetric positive semidefinite (SPSD) matrix using only a few of its columns. However, because the Nyström approximation is low-rank, its accuracy is poor when the spectrum of the SPSD matrix decays slowly. Here an alternative approximation method is proposed that has a much stronger error bound than the Nyström method, while its time complexity remains the same as Nyströmformer's, namely $O(n)$.
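To ground the comparison, here is a minimal NumPy sketch of the standard Nyström approximation that Nyströmformer builds on. It is not the paper's spectral-shifting estimator; the function name nystrom_approximation, the uniform landmark sampling, and the toy RBF kernel are illustrative assumptions.

import numpy as np

def nystrom_approximation(K, landmark_idx):
    # Standard Nystrom approximation of an SPSD matrix K from a few of its columns.
    # K            : (n, n) symmetric positive semidefinite matrix
    # landmark_idx : indices of the m sampled columns (m << n)
    # Returns the rank-at-most-m approximation C @ pinv(W) @ C.T.
    C = K[:, landmark_idx]                      # n x m block of sampled columns
    W = K[np.ix_(landmark_idx, landmark_idx)]   # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy usage: approximate a 256 x 256 RBF kernel matrix with 16 of its columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 8))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq_dists)                     # SPSD kernel matrix
idx = rng.choice(256, size=16, replace=False)
K_hat = nystrom_approximation(K, idx)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))  # relative Frobenius error

Because the approximation has rank at most m, its error cannot fall below the tail eigenvalues of K; when the spectrum decays slowly that relative error stays large, which is the failure mode the proposed spectral-shifting alternative targets.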
