Published September 17, 2025 | Supplemental material
Journal Article | Open Access

Covariance Alignment: From Maximum Likelihood Estimation to Gromov–Wasserstein

  • 1. New York University
  • 2. Massachusetts Institute of Technology
  • 3. California Institute of Technology

Abstract

Feature alignment methods are used in many scientific disciplines for data pooling, annotation, and comparison. As an instance of a permutation learning problem, feature alignment presents significant statistical and computational challenges. In this work, we propose the covariance alignment model to study and compare various alignment methods, and we establish a minimax lower bound for covariance alignment that exhibits a nonstandard dimension scaling because of the presence of a nuisance parameter. This lower bound is in fact tight: it is achieved by a natural quasi-maximum likelihood estimator. However, this estimator involves a search over all permutations, which is computationally infeasible even for problems of moderate size. To overcome this limitation, we show that the celebrated Gromov–Wasserstein algorithm from optimal transport, which is more amenable to fast implementation even on large-scale problems, is also minimax optimal. These results give the first statistical justification for the deployment of the Gromov–Wasserstein algorithm in practice.
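As a rough illustration of the setup described in the abstract, the sketch below draws two Gaussian samples that share a covariance up to an unknown shuffling of the features, and aligns them by running the gromov_wasserstein solver from the open-source POT library on the two sample covariances, rounding the resulting coupling to a permutation with a linear assignment step. This is an assumption on our part about how such an alignment could be coded, not the paper's estimator or code; the library choice and names such as pi_star and pi_hat are illustrative, and exact recovery depends on the sample size and the structure of the covariance.

# Illustrative sketch (not the paper's code): Gromov-Wasserstein alignment
# of two sample covariances, followed by rounding to a permutation.
import numpy as np
from scipy.optimize import linear_sum_assignment
import ot  # Python Optimal Transport (POT)

rng = np.random.default_rng(0)

# Two samples of a d-dimensional Gaussian with the same covariance; the
# second sample has its features shuffled by an unknown permutation pi_star.
d, n = 30, 2000
A = rng.normal(size=(d, d))
Sigma = A @ A.T / d + np.eye(d)                      # generic covariance
X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
pi_star = rng.permutation(d)
Y = rng.multivariate_normal(np.zeros(d), Sigma, size=n)[:, pi_star]

# Sample covariances play the role of the two intra-domain structure matrices.
C1 = np.cov(X, rowvar=False)
C2 = np.cov(Y, rowvar=False)

# Uniform weights on features; GW returns a soft coupling between the
# features of X and the features of Y.
p = np.ones(d) / d
q = np.ones(d) / d
T = ot.gromov.gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss')

# Round the coupling to a permutation; pi_hat estimates the inverse of pi_star.
_, pi_hat = linear_sum_assignment(-T)
print("fraction of features recovered:", np.mean(pi_hat == np.argsort(pi_star)))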

Copyright and License

© 2025 Society for Industrial and Applied Mathematics.

Funding

Yanjun Han was generously supported by the Norbert Wiener postdoctoral fellowship in statistics at MIT IDSS. Philippe Rigollet was supported by NSF grants IIS-1838071, DMS-2022448, and CCF-2106377. George Stepaniants was supported through a National Science Foundation Graduate Research Fellowship under grant 1745302.

Supplemental Material

Supplementary Materials

Files

supplement.pdf (529.0 kB, md5:77ae5c869a2df667bf70f6b33d5fa261)

Additional details

Created: September 25, 2025
Modified: September 25, 2025