Multi-Branch Tensor Network Structure for Tensor-Train Discriminant Analysis (1904.06788v2)

Published 15 Apr 2019 in eess.SP, cs.LG, and eess.IV

Abstract: Higher-order data with high dimensionality arise in a diverse set of application areas such as computer vision, video analytics and medical imaging. Tensors provide a natural tool for representing these types of data. Although there has been extensive work on tensor decomposition and low-rank tensor approximation, extensions to supervised learning, feature extraction and classification are still limited. Moreover, most of the existing supervised tensor learning approaches are based on the orthogonal Tucker model. However, this model has some limitations for large tensors, including high memory and computational costs. In this paper, we introduce a supervised learning approach for tensor classification based on the tensor-train model. In particular, we propose a multi-branch tensor network structure for efficient implementation of tensor-train discriminant analysis (TTDA). The proposed approach takes advantage of the flexibility of the tensor-train structure to implement various computationally efficient versions of TTDA. This approach is then evaluated on image and video classification tasks with respect to computation time, storage cost and classification accuracy, and is compared to both vector- and tensor-based discriminant analysis methods.
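For readers unfamiliar with the tensor-train (TT) model the abstract builds on, below is a minimal sketch of the standard TT-SVD decomposition (Oseledets, 2011), not the paper's multi-branch TTDA construction. The function names, the fixed rank list, and the plain truncated-SVD implementation are illustrative assumptions, not code from the paper.

```python
# Minimal TT-SVD sketch: decompose a d-way tensor into a chain of
# 3-way cores. Illustrative only; not the paper's multi-branch method.
import numpy as np

def tt_svd(x, ranks):
    """Decompose `x` (shape n_1 x ... x n_d) into TT cores.

    ranks : [r_1, ..., r_{d-1}] internal TT-ranks; r_0 = r_d = 1.
    Returns a list of d cores; core k has shape (r_{k-1}, n_k, r_k).
    """
    dims = x.shape
    d = len(dims)
    cores, r_prev = [], 1
    c = x.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        r = min(ranks[k], len(s))                  # truncate to requested rank
        cores.append(u[:, :r].reshape(r_prev, dims[k], r))
        c = (s[:r, None] * vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))   # last core absorbs remainder
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor (sanity check)."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

# Usage: a tensor with exact TT-ranks (3, 3) is recovered to
# machine precision when decomposed at those ranks.
rng = np.random.default_rng(0)
low_rank = tt_reconstruct([rng.standard_normal(s)
                           for s in [(1, 8, 3), (3, 9, 3), (3, 10, 1)]])
approx = tt_reconstruct(tt_svd(low_rank, [3, 3]))
print(np.allclose(low_rank, approx))  # True
```

As a point of reference for the abstract's memory argument: with mode size n and rank r, a d-way TT representation stores O(d n r^2) parameters, whereas a Tucker core plus factors stores O(r^d + d n r), which grows exponentially in d.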

Citations (6)
