Special Session 122: Topological Data Analysis Theory, Algorithms, and Applications

TopoFormer: Topology Meets Attention for Graph Learning
Baris Coskunuzer
University of Texas at Dallas
USA
Co-Author(s):    Md Joshem Uddin, Astrit Tola, Cuneyt Gurcan Akcora
Abstract:
In this talk, we show that reshaping topological ideas to fit modern deep learning pipelines can substantially improve graph learning. We present \textbf{TopoFormer}, a lightweight framework that injects topological inductive bias into transformer-based models by converting each graph into a short sequence of topology-aware tokens. The core module, \textbf{Topo-Scan}, slices a graph using node or edge filtrations to produce an ordered token sequence that captures multi-scale structure, from local motifs to global organization, in a form naturally suited to attention. Unlike classical persistent homology pipelines, Topo-Scan is parallelizable, avoids costly persistence diagram computations, and integrates cleanly into end-to-end training. We provide stability guarantees for the proposed encodings and demonstrate state-of-the-art performance on graph classification and molecular property prediction benchmarks, matching or surpassing strong GNN and topology-based baselines while remaining scalable and predictable in compute cost.
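To make the filtration-slicing idea concrete, here is a minimal sketch of a Topo-Scan-style tokenizer. It is a hypothetical illustration, not the authors' implementation: the function name `topo_scan_tokens`, the choice of degree as the node filtration, and the per-slice summary (node count, edge count, connected components of the sublevel subgraph) are all assumptions made for this example.

```python
def _num_components(nodes, edges):
    # Count connected components of the sublevel subgraph via union-find.
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    return len({find(v) for v in nodes})


def topo_scan_tokens(edges, node_filter, num_slices=4):
    """Hypothetical Topo-Scan-style tokenizer (illustration only).

    Slices the graph by a node filtration at evenly spaced thresholds
    and emits one token per slice: (num_nodes, num_edges, num_components)
    of the induced sublevel subgraph, ordered by threshold.
    """
    nodes = sorted({v for e in edges for v in e})
    values = sorted(node_filter(v) for v in nodes)
    # Evenly spaced thresholds across the sorted filtration values.
    thresholds = [values[int(i * (len(values) - 1) / (num_slices - 1))]
                  for i in range(num_slices)]
    tokens = []
    for t in thresholds:
        sub_nodes = {v for v in nodes if node_filter(v) <= t}
        sub_edges = [(u, v) for u, v in edges
                     if u in sub_nodes and v in sub_nodes]
        tokens.append((len(sub_nodes), len(sub_edges),
                       _num_components(sub_nodes, sub_edges)))
    return tokens


# Example: a 4-node path graph with a degree filtration. Low-degree
# endpoints enter first; the full path appears at the top threshold.
edges = [(0, 1), (1, 2), (2, 3)]
deg = {0: 1, 1: 2, 2: 2, 3: 1}
print(topo_scan_tokens(edges, deg.get, num_slices=4))
```

The resulting fixed-length token sequence is exactly the kind of ordered, multi-scale summary that can be fed to a transformer, and because each slice is computed independently, the per-slice work is trivially parallelizable, consistent with the claims in the abstract.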