Special Session 73: Data-driven methods in dynamical systems

Gradient flows for sampling: affine invariance and numerical approximations

Yifan Chen
Caltech
USA
Co-Author(s):    Yifan Chen, Daniel Huang, Jiaoyang Huang, Sebastian Reich, Andrew Stuart
Abstract:
Sampling a target distribution with an unknown normalization constant is a fundamental problem in data-driven inference. Using dynamical systems whose solutions gradually approach the target has been a compelling idea. In this talk, we focus on gradient flows of probability distributions as the dynamical system and study several related foundational questions in sampling. Any implementation of a gradient flow requires an energy functional, a metric, and a numerical approximation scheme. We show how the KL divergence is a special and unique choice of energy functional, and how affine invariance of the metric can improve convergence. We also discuss numerical approximations that lead to implementable methods, such as interacting particle systems, parametric variational inference, and Kalman-type approaches.
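
To make the abstract's ingredients concrete, here is a minimal sketch (not part of the abstract, and not the specific method of the talk) of an interacting-particle discretization of a Langevin-type flow toward a target pi(x) proportional to exp(-V(x)), with the empirical ensemble covariance used as a preconditioner in the spirit of affine invariance. The Gaussian target, ensemble size, and step size are illustrative assumptions, and the finite-ensemble correction term used in exact affine-invariant samplers is omitted for brevity.

```python
import numpy as np

# Illustrative target: anisotropic 2D Gaussian, pi(x) ∝ exp(-V(x)).
# (Assumed example; the abstract does not specify a target.)
Sigma = np.array([[10.0, 0.0], [0.0, 0.1]])
Sigma_inv = np.linalg.inv(Sigma)

def grad_V(x):
    """Gradient of V(x) = 0.5 x^T Sigma^{-1} x, applied row-wise to an ensemble."""
    return x @ Sigma_inv  # Sigma_inv is symmetric

rng = np.random.default_rng(0)
J, d = 200, 2                # ensemble size and dimension (assumed)
dt, n_steps = 0.05, 2000     # step size and number of steps (assumed)
X = rng.normal(size=(J, d))  # initial particle ensemble

for _ in range(n_steps):
    # Empirical covariance of the ensemble acts as a preconditioner,
    # which makes the update equivariant under affine changes of variables.
    C = np.cov(X, rowvar=False) + 1e-8 * np.eye(d)
    L = np.linalg.cholesky(C)
    drift = -grad_V(X) @ C                 # preconditioned drift, C symmetric
    noise = rng.normal(size=(J, d)) @ L.T  # each row ~ N(0, C)
    X = X + dt * drift + np.sqrt(2.0 * dt) * noise

print("empirical mean:", X.mean(axis=0))
print("empirical covariance:\n", np.cov(X, rowvar=False))
```

Because the preconditioner is built from the ensemble itself, applying an invertible affine map to the target and to the initial particles transforms the particle dynamics equivariantly (in distribution), which is the practical benefit of affine invariance for badly scaled targets. The exact interacting-particle systems discussed in the talk additionally involve correction terms that this sketch drops.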