Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space
papers
ICML
We devise a novel algorithm for Gaussian variational inference that comes with state-of-the-art convergence guarantees.
Abstract
Variational inference (VI) seeks to approximate a target distribution $\pi$ by an element of a tractable family of distributions. Of key interest in statistics and machine learning is Gaussian VI, which approximates $\pi$ by minimizing the Kullback-Leibler (KL) divergence to $\pi$ over the space of Gaussians. In this work, we develop the (Stochastic) Forward-Backward Gaussian Variational Inference (FB-GVI) algorithm to solve Gaussian VI. Our approach exploits the composite structure of the KL divergence, which can be written as the sum of a smooth term (the potential) and a non-smooth term (the entropy) over the Bures-Wasserstein (BW) space of Gaussians endowed with the Wasserstein distance. For our proposed algorithm, we obtain state-of-the-art convergence guarantees when $\pi$ is log-smooth and log-concave, as well as the first convergence guarantees to first-order stationary solutions when $\pi$ is only log-smooth.
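As a rough sketch of the idea (the notation here is ours: write $\pi \propto e^{-V}$ and fix a step size $h > 0$), the objective decomposes as

$$
\mathrm{KL}(\mu \,\|\, \pi)
= \underbrace{\mathcal{V}(\mu)}_{\text{potential (smooth)}}
+ \underbrace{\mathcal{H}(\mu)}_{\text{entropy (non-smooth)}}
+ \mathrm{const},
\qquad
\mathcal{V}(\mu) := \int V \, \mathrm{d}\mu,
\quad
\mathcal{H}(\mu) := \int \mu \log \mu,
$$

and each FB-GVI iteration alternates a gradient step on the potential term with a JKO (proximal) step on the entropy, both taken over the Bures-Wasserstein space $\mathrm{BW}(\mathbb{R}^d)$ of Gaussians:

$$
\mu_{k+1/2} = \bigl(\mathrm{id} - h \, \nabla_{\mathrm{BW}} \mathcal{V}(\mu_k)\bigr)_{\#} \, \mu_k,
\qquad
\mu_{k+1} = \mathop{\mathrm{arg\,min}}_{\mu \in \mathrm{BW}(\mathbb{R}^d)}
\Bigl\{ \mathcal{H}(\mu) + \tfrac{1}{2h} W_2^2(\mu, \mu_{k+1/2}) \Bigr\}.
$$

This is only a schematic; see the paper for the exact form of the Bures-Wasserstein gradient over Gaussians, the closed-form solution of the proximal step, the stochastic-gradient variant, and the step-size conditions.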
Links
Published at ICML 2023.
For a quick overview, check out our ICML poster and video.
And for more detail, see our arXiv preprint (2023) and our ICML paper (2023). :)
References
Diao, Michael Ziyang, Krishna Balasubramanian, Sinho Chewi, and Adil Salim. 2023. “Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space.” In Proceedings of the 40th International Conference on Machine Learning, 202:7960–91. Proceedings of Machine Learning Research. PMLR. https://proceedings.mlr.press/v202/diao23a.html.
Diao, Michael, Krishnakumar Balasubramanian, Sinho Chewi, and Adil Salim. 2023. “Forward-Backward Gaussian Variational Inference via JKO in the Bures-Wasserstein Space.” arXiv preprint arXiv:2304.05398. https://arxiv.org/abs/2304.05398.