
InfoCNF: An Efficient Conditional Continuous Normalizing Flow with Adaptive Solvers

Nguyen, Tan M. and Garg, Animesh and Baraniuk, Richard G. and Anandkumar, Anima (2019) InfoCNF: An Efficient Conditional Continuous Normalizing Flow with Adaptive Solvers. . (Unpublished)



Continuous Normalizing Flows (CNFs) have emerged as promising deep generative models for a wide range of tasks thanks to their invertibility and exact likelihood estimation. However, conditioning CNFs on signals of interest for conditional image generation and downstream predictive tasks is inefficient due to the high-dimensional latent code generated by the model, which needs to be of the same size as the input data. In this paper, we propose InfoCNF, an efficient conditional CNF that partitions the latent space into a class-specific supervised code and an unsupervised code that is shared among all classes, making efficient use of labeled information. Since this partitioning strategy slightly increases the number of function evaluations (NFEs), InfoCNF also employs gating networks to learn the error tolerances of its ordinary differential equation (ODE) solvers for better speed and performance. We show empirically that InfoCNF improves test accuracy over the baseline while yielding comparable likelihood scores and reducing the NFEs on CIFAR10. Furthermore, applying the same partitioning strategy in InfoCNF to time-series data helps improve extrapolation performance.
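The partitioning idea in the abstract can be illustrated with a minimal sketch: split the flat latent code into a small class-specific supervised slice and a large shared unsupervised remainder. The split size (`sup_dim`) and layout below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def partition_latent(z, sup_dim):
    """Split a flattened latent code into a class-specific supervised part
    and an unsupervised part shared among all classes.

    Note: `sup_dim` and the contiguous-slice layout are assumptions made
    for illustration; the paper's exact partitioning may differ.
    """
    z_sup = z[:sup_dim]      # supervised code, conditioned on the class label
    z_unsup = z[sup_dim:]    # unsupervised code, shared across classes
    return z_sup, z_unsup

# Example: a CIFAR10-sized latent code (3 * 32 * 32 = 3072 dims),
# with a small supervised slice for the 10 classes.
z = np.random.randn(3072)
z_sup, z_unsup = partition_latent(z, sup_dim=64)
```

Only the small supervised slice needs label supervision (e.g. via a classifier head), which is what makes the use of labeled information efficient despite the latent code matching the input dimensionality.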

Item Type: Report or Paper (Discussion Paper)
Related URLs: Paper
ORCID: Baraniuk, Richard G. (0000-0002-0721-8999)
Record Number: CaltechAUTHORS:20200109-083652974
Persistent URL:
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 100576
Deposited By: Tony Diaz
Deposited On: 09 Jan 2020 21:23
Last Modified: 09 Jan 2020 21:23
