CaltechAUTHORS
  A Caltech Library Service

Tensor Regression Networks

Kossaifi, Jean and Lipton, Zachary and Khanna, Aran and Furlanello, Tommaso and Anandkumar, Anima (2017) Tensor Regression Networks. (Unpublished) http://resolver.caltech.edu/CaltechAUTHORS:20190327-085728859

PDF - Submitted Version (791 kB)
See Usage Policy.


Abstract

Convolutional neural networks typically consist of many convolutional layers followed by several fully-connected layers. While convolutional layers map between high-order activation tensors, the fully-connected layers operate on flattened activation vectors. Despite its success, this approach has notable drawbacks. Flattening discards multilinear structure in the activations, and fully-connected layers require many parameters. We address these problems by incorporating tensor algebraic operations that preserve multilinear structure at every layer. First, we introduce Tensor Contraction Layers (TCLs) that reduce the dimensionality of their input while preserving their multilinear structure using tensor contraction. Next, we introduce Tensor Regression Layers (TRLs), to express outputs through a low-rank multilinear mapping from a high-order activation tensor to an output tensor of arbitrary order. We learn the contraction and regression factors end-to-end, and by imposing low rank on both, we produce accurate nets with few parameters. Additionally, our layers regularize networks by imposing low-rank constraints on the activations (TCL) and regression weights (TRL). Experiments on ImageNet show that, applied to VGG and ResNet architectures, TCLs and TRLs reduce the number of parameters compared to fully-connected layers by more than 65% without impacting accuracy.
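The two layers described in the abstract can be sketched with plain mode-wise contractions. The following is a minimal numpy illustration, not the authors' implementation: all shapes, factor matrices, and rank choices below are assumptions chosen for illustration. The TCL contracts each non-batch mode of the activation tensor with a small factor matrix, and the TRL maps the result to the outputs through Tucker-structured (low-rank) regression weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy activation tensor: batch of 8, modes of size (16, 16, 32) (assumed shapes).
X = rng.standard_normal((8, 16, 16, 32))

# --- Tensor Contraction Layer (TCL) ---
# One factor matrix per non-batch mode projects that mode to a smaller size,
# reducing dimensionality while keeping the multilinear structure.
V1 = rng.standard_normal((8, 16))   # mode 1: 16 -> 8
V2 = rng.standard_normal((8, 16))   # mode 2: 16 -> 8
V3 = rng.standard_normal((16, 32))  # mode 3: 32 -> 16

# X'_{b,i,j,k} = sum_{p,q,r} X_{b,p,q,r} V1_{i,p} V2_{j,q} V3_{k,r}
Xc = np.einsum('bpqr,ip,jq,kr->bijk', X, V1, V2, V3)
print(Xc.shape)  # (8, 8, 8, 16)

# --- Tensor Regression Layer (TRL) ---
# Regression weights are kept in low-rank Tucker form: a small core tensor G
# plus one factor matrix per mode, instead of a dense flattened weight matrix.
n_outputs, rank = 10, (4, 4, 6)
G  = rng.standard_normal(rank + (n_outputs,))  # core tensor
U1 = rng.standard_normal((8, 4))
U2 = rng.standard_normal((8, 4))
U3 = rng.standard_normal((16, 6))
b  = np.zeros(n_outputs)

# y_{b,o} = sum_{i,j,k,a,c,d} X'_{b,i,j,k} U1_{i,a} U2_{j,c} U3_{k,d} G_{a,c,d,o} + b_o
y = np.einsum('bijk,ia,jc,kd,acdo->bo', Xc, U1, U2, U3, G) + b
print(y.shape)  # (8, 10)
```

In this toy setting the low-rank weights use 4*4*6*10 + 8*4 + 8*4 + 16*6 = 1120 parameters, versus 8*8*16*10 = 10240 for a dense fully-connected layer on the flattened activations, which is the parameter saving the abstract refers to (in a trained network, the factor matrices and core would be learned end-to-end rather than sampled randomly).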


Item Type: Report or Paper (Discussion Paper)
Related URLs: http://arxiv.org/abs/1707.08308 (arXiv, Discussion Paper)
Subject Keywords: Machine Learning, Tensor Methods, Tensor Regression Networks, Low-Rank Regression, Tensor Regression Layers, Tensor Contraction
Record Number: CaltechAUTHORS:20190327-085728859
Persistent URL: http://resolver.caltech.edu/CaltechAUTHORS:20190327-085728859
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 94168
Collection: CaltechAUTHORS
Deposited By: George Porter
Deposited On: 28 Mar 2019 22:05
Last Modified: 28 Mar 2019 22:05
