
On Algorithms for and Computing with the Tensor Ring Decomposition

(arXiv:1807.02513)
Published Jul 6, 2018 in math.NA and cs.NA

Abstract

Tensor decompositions such as the canonical format and the tensor train format have been widely used to reduce storage costs and operational complexities for high-dimensional data, achieving linear scaling with the input dimension instead of exponential scaling. In this paper, we investigate representations with even lower storage cost in the tensor ring format, an extension of the tensor train format with variable end-ranks. First, we introduce two algorithms for converting a tensor in full format to tensor ring format with low storage cost. Second, we detail a rounding operation for tensor rings and show how it requires new definitions of the common linear algebra operations in the format in order to retain the storage-cost savings. Last, we introduce algorithms for transforming the graph structure of graph-based tensor formats, with orders of magnitude lower complexity than existing approaches. The efficiency of all algorithms is demonstrated on a number of numerical examples, and in certain cases we demonstrate significantly higher compression ratios than previous approaches to using the tensor ring format.
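To make the format concrete, here is a minimal, illustrative sketch (not code from the paper) of how a tensor in tensor ring format is stored and evaluated: the tensor is kept as cores G_k of shape (r_k, n_k, r_{k+1}), with the last rank wrapping around to the first so the ring closes, and the tensor train format is recovered as the special case r_0 = r_d = 1. The helper names `tr_entry` and `tr_full` are hypothetical.

```python
import numpy as np

def tr_entry(cores, idx):
    """Evaluate one entry of a tensor stored in tensor ring format.

    cores[k] has shape (r_k, n_k, r_{k+1}), with r_d equal to r_0;
    the entry is trace(G_1[:, i_1, :] @ ... @ G_d[:, i_d, :]).
    """
    M = cores[0][:, idx[0], :]
    for G, i in zip(cores[1:], idx[1:]):
        M = M @ G[:, i, :]
    return np.trace(M)  # the trace closes the ring

def tr_full(cores):
    """Contract all cores back into the full tensor (exponential cost)."""
    M = cores[0]                              # shape (r_0, n_1, r_1)
    for G in cores[1:]:
        M = np.tensordot(M, G, axes=(-1, 0))  # (r_0, n_1, ..., n_k, r_k)
    return np.trace(M, axis1=0, axis2=-1)     # trace over r_0 = r_d

# Random ring cores for a 4-way tensor with all ranks equal to 3.
rng = np.random.default_rng(0)
cores = [rng.standard_normal((3, n, 3)) for n in (4, 5, 6, 7)]
X = tr_full(cores)
assert np.isclose(X[1, 2, 3, 4], tr_entry(cores, (1, 2, 3, 4)))
```

For fixed ranks, the cores hold only sum_k r_k * n_k * r_{k+1} entries, linear in the dimension d, versus the prod_k n_k entries of the full tensor; this is the storage saving the abstract refers to.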
