Special Session 62: Group invariant machine learning

Numerically stable group invariants

Joseph W Iverson
Iowa State University
USA
Co-authors: Jameson Cahill, Dustin G. Mixon, Daniel Packer
Abstract:
Machine learning algorithms are designed for Euclidean space, and the first step of many applications is to represent data in a finite-dimensional Hilbert space $V$. Frequently, many elements of $V$ represent the same data point, and furthermore, the equivalence classes are orbits of a finite group $G \leq \operatorname{O}(V)$. For instance, a point cloud may be represented as columns of a matrix, but the columns could be permuted in any order. To properly model such data, we seek a Euclidean embedding of the quotient metric space $V/G$. This amounts to finding a $G$-invariant function $f \colon V \to \mathbb{R}^n$ that separates orbits. With applications in mind, we would like to choose $f$ so that the resulting map $\tilde{f} \colon V/G \to \mathbb{R}^n$ is bilipschitz. In this talk, we share some bad news and some good news about such numerically stable group invariants. The bad news is that there usually does \textbf{not} exist an invariant polynomial $f$ for which $\tilde{f}$ is bilipschitz, even when restricted to the quotient image of the unit sphere. In fact, if $\tilde{f}$ is lower Lipschitz, then $f$ fails to be differentiable at any point fixed by any non-identity element of $G$. The good news is that we offer a large and flexible class of bilipschitz invariants, which we call \textit{max filter banks}. Max filters are easy to implement, and many special cases are familiar from the literature. Furthermore, when $G$ is finite, a sufficiently large bank of generically chosen max filters provides a bilipschitz Euclidean embedding of $V/G$. Based on joint work with Jameson Cahill, Dustin G. Mixon, and Daniel Packer.
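The abstract does not spell out the definition of a max filter, but following the max filtering literature, the max filter with template $a \in V$ sends $x \mapsto \max_{g \in G} \langle a, gx \rangle$, and a max filter bank stacks several such invariants. The sketch below (an illustration, not the authors' implementation) takes the abstract's point cloud example: $G$ is the group of column permutations acting on $d \times n$ matrices, the inner product is the Frobenius inner product, and the maximum is computed by brute force over all permutations, which is only feasible for small $n$.

```python
import itertools
import numpy as np

def max_filter(template, x):
    """Max filter <<template, x>> = max over g in G of <template, g.x>,
    where G = S_n acts by permuting the columns of a d x n matrix and
    <.,.> is the Frobenius inner product. G-invariant by construction."""
    n = x.shape[1]
    return max(np.sum(template * x[:, perm])
               for perm in itertools.permutations(range(n)))

def max_filter_bank(templates, x):
    """Apply a bank of max filters, giving a G-invariant map V -> R^m."""
    return np.array([max_filter(a, x) for a in templates])

rng = np.random.default_rng(0)
d, n, m = 2, 4, 8                              # small n: brute force over 4! = 24 permutations
templates = rng.standard_normal((m, d, n))      # generically chosen templates
x = rng.standard_normal((d, n))                 # a point cloud as matrix columns
g_x = x[:, rng.permutation(n)]                  # another representative of the same orbit

# Both representatives of the orbit map to the same point in R^m.
print(np.allclose(max_filter_bank(templates, x), max_filter_bank(templates, g_x)))  # True
```

The invariance checked at the end holds for any template count $m$; the abstract's stronger claim, that a sufficiently large bank of generically chosen max filters is bilipschitz on $V/G$ when $G$ is finite, is the main result of the talk and is not verified by this sketch.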