A Caltech Library Service

Guaranteed Scalable Learning of Latent Tree Models

Huang, Furong and Naresh, Niranjan Uma and Perros, Ioakeim and Chen, Robert and Sun, Jimeng and Anandkumar, Animashree (2019) Guaranteed Scalable Learning of Latent Tree Models. Proceedings of Machine Learning Research, 115. pp. 883-893. ISSN 2640-3498.

PDF - Published Version (see Usage Policy)
PDF - Submitted Version (see Usage Policy)
PDF - Supplemental Material (see Usage Policy)


We present an integrated approach to structure and parameter estimation in latent tree graphical models, where some nodes are hidden. Our overall approach follows a “divide-and-conquer” strategy that learns models over small groups of variables and iteratively merges them into a global solution. The structure learning involves combinatorial operations such as minimum spanning tree construction and local recursive grouping; the parameter learning is based on the method of moments and tensor decompositions. Our method is guaranteed to correctly recover the unknown tree structure and the model parameters with low sample complexity for the class of linear multivariate latent tree models, which includes discrete distributions, Gaussian distributions, and Gaussian mixtures. Our bulk asynchronous parallel algorithm scales logarithmically with the number of variables and linearly with the dimensionality of each variable.
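The structure-learning stage summarized above starts by building a minimum spanning tree over pairwise information distances between observed variables, before local recursive grouping introduces hidden nodes. The following is a minimal sketch of that first stage only, assuming the standard additive tree metric d_ij = −log|ρ_ij| used for linear latent tree models; the function names and the toy distance matrix are illustrative, not taken from the paper's implementation:

```python
import math

def information_distance(rho):
    """Additive tree metric for linear latent tree models:
    d_ij = -log |rho_ij|, where rho_ij is the pairwise correlation
    between observed variables i and j."""
    return -math.log(abs(rho))

def minimum_spanning_tree(dist):
    """Prim's algorithm over a dense symmetric distance matrix.
    Returns the MST as a list of (i, j) edges."""
    n = len(dist)
    in_tree = {0}          # start the tree at an arbitrary node
    edges = []
    while len(in_tree) < n:
        best = None
        # pick the cheapest edge crossing the (tree, non-tree) cut
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (
                    best is None or dist[i][j] < dist[best[0]][best[1]]
                ):
                    best = (i, j)
        edges.append(best)
        in_tree.add(best[1])
    return edges

# Toy example: two tight pairs {0,1} and {2,3}, loosely connected.
dist = [[0, 1, 4, 4],
        [1, 0, 4, 4],
        [4, 4, 0, 1],
        [4, 4, 1, 0]]
mst_edges = minimum_spanning_tree(dist)
```

In the paper's divide-and-conquer scheme, subtrees of such an MST define the small variable groups on which local models are learned and then merged; the O(n²) inner loop here is for clarity, not the scalable parallel variant the paper describes.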

Item Type: Article
Related URLs:
URL | URL Type | Description
Paper
Alternate Titles: Integrated Structure and Parameters Learning in Latent Tree Graphical Models; Scalable Latent Tree Model and its Application to Health Analytics
Additional Information: © 2020 by the author(s). Huang is supported by a startup fund from the Department of Computer Science, University of Maryland, National Science Foundation IIS-1850220 CRII Award 030742-00001, and Adobe, Capital One, and JP Morgan faculty fellowships. Sun is supported by National Science Foundation awards IIS-1418511, CCF-1533768, and IIS-1838042, and National Institutes of Health awards 1R01MD011682-01 and R56HL138415. Anandkumar is supported in part by the Bren endowed chair; DARPA PAI; Raytheon; and Microsoft, Google, and Adobe faculty fellowships.
Funding Agency | Grant Number
University of Maryland | UNSPECIFIED
Bren Professor of Computing and Mathematical Sciences | UNSPECIFIED
Defense Advanced Research Projects Agency (DARPA) | UNSPECIFIED
Raytheon Company | UNSPECIFIED
Record Number: CaltechAUTHORS:20220107-163918011
Persistent URL:
Official Citation: Huang, F., Naresh, N. U., Perros, I., Chen, R., Sun, J. & Anandkumar, A. (2020). Guaranteed Scalable Learning of Latent Tree Models. Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, in Proceedings of Machine Learning Research, 115:883-893. Available from
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 112776
Deposited By: Tony Diaz
Deposited On: 09 Jan 2022 03:36
Last Modified: 09 Jan 2022 22:15
