
Convolutional Dictionary Learning through Tensor Factorization

Huang, Furong and Anandkumar, Animashree (2015) Convolutional Dictionary Learning through Tensor Factorization. (Unpublished)



Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis, and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher-order input moments. However, many domains exhibit additional invariances, such as shift invariance, which are enforced through models such as convolutional dictionary learning. In this paper, we develop novel tensor decomposition algorithms for parameter estimation of convolutional models. Our algorithm is based on the popular alternating least squares method, but with efficient projections onto the space of stacked circulant matrices. The method is embarrassingly parallel and consists of simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to the dictionary much faster and more accurately than alternating minimization over the filters and activation maps.
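The abstract's key primitive — projecting onto circulant matrices using only FFTs and elementwise operations — can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration of the standard Frobenius-norm projection of a single matrix onto the set of circulant matrices, not the authors' exact stacked-circulant ALS implementation; the function names are hypothetical.

```python
import numpy as np

def project_to_circulant(M):
    """Nearest circulant matrix to a square matrix M in Frobenius norm.

    A circulant matrix is constant along its wrapped diagonals, so the
    projection averages each wrapped diagonal of M to obtain the first
    column c, then rebuilds the circulant matrix from c.
    """
    n = M.shape[0]
    # c[k] = mean over i of M[(i + k) % n, i]  (the k-th wrapped diagonal)
    c = np.array([np.diagonal(np.roll(M, -k, axis=0)).mean() for k in range(n)])
    # Column j of the circulant matrix is c cyclically shifted down by j.
    return np.column_stack([np.roll(c, j) for j in range(n)])

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by x in O(n log n),
    using the FFT diagonalization of circulant matrices: C x equals the
    circular convolution of c and x."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
```

Because circulant matrices are diagonalized by the DFT, both the projection and all multiplications against circulant factors reduce to FFTs, which is consistent with the abstract's claim that the method needs only fast Fourier transforms and matrix multiplications.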

Item Type: Report or Paper (Discussion Paper)
Related URLs: Paper
Additional Information: We thank Cris Cecka for helpful discussions on fast implementation of block matrix inversion, and Majid Janzamin and Hanie Sedghi for initial discussions on Toeplitz matrices.
Subject Keywords: Tensor CP decomposition, convolutional dictionary learning, convolutional ICA, blind deconvolution
Record Number: CaltechAUTHORS:20190401-123253238
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 94319
Deposited By: George Porter
Deposited On: 01 Apr 2019 22:59
Last Modified: 03 Oct 2019 21:02
