Special Session 113: Recent Advances in Uncertainty Quantification and Scientific Machine Learning with Applications to Complex Dynamical Systems

Assimilative Causal Inference: Tracing Causes from Effects to Predict and Attribute Significant Events
Marios Andreou
University of Wisconsin-Madison
USA
Co-Author(s): Nan Chen, Erik Bollt
Abstract:
Causal inference is fundamental across scientific disciplines, yet state-of-the-art methods often struggle to capture instantaneous causal relationships in high-dimensional systems. This work introduces assimilative causal inference (ACI), a paradigm-shifting framework that reframes causality as a Bayesian inverse problem using data assimilation. Rather than measuring forward influence from causes to effects, ACI instead traces causality backwards by quantifying how incorporating future information from effects reduces the uncertainty in the estimated system state. As such, effects are interpolated onto causes, in contrast to classical predictive approaches, which extrapolate causes forward to identify effects. ACI dynamically determines causal interactions without observing candidate causes, accommodates short datasets, and scales efficiently to high dimensions. Crucially, it provides online tracking of causal roles, which may reverse intermittently, and establishes rigorous criteria for the causal influence range (CIR) of a relationship. The ACI-based CIR metric is objectively defined, without empirical thresholds, and admits both forward- and backward-in-time formulations. The forward CIR quantifies the temporal reach of a cause, while the backward CIR traces the onset of triggers for an observed effect, enabling causal predictability and attribution in transient regimes. The effectiveness of ACI and its CIR framework is demonstrated on nonlinear dynamical systems showcasing intermittency, extreme events, and tipping points.
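The core idea — that future observations of an effect reduce uncertainty about an unobserved cause — can be sketched in the simplest data-assimilation setting. The toy example below is not the authors' ACI implementation; it is a minimal linear-Gaussian illustration in which a Kalman filter (past data only) is compared against an RTS smoother (past and future data), and the relative drop in posterior variance of the unobserved candidate cause is read as a proxy for causal influence. The system matrices, coupling parameter `c`, and all function names are illustrative assumptions.

```python
import numpy as np

def simulate(A, Q, H, R, T, rng):
    """Simulate x_{k+1} = A x_k + w_k and observations y_k = H x_k + v_k."""
    n = A.shape[0]
    x = np.zeros(n)
    ys = []
    for _ in range(T):
        x = A @ x + rng.multivariate_normal(np.zeros(n), Q)
        ys.append(H @ x + rng.multivariate_normal(np.zeros(H.shape[0]), R))
    return np.array(ys)

def filter_and_smooth(A, Q, H, R, ys):
    """Kalman filter + RTS smoother; return filter and smoother covariances."""
    n = A.shape[0]
    m, P = np.zeros(n), np.eye(n)
    ms_f, Ps_f, ms_p, Ps_p = [], [], [], []
    for y in ys:
        mp, Pp = A @ m, A @ P @ A.T + Q              # predict
        K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)
        m, P = mp + K @ (y - H @ mp), Pp - K @ H @ Pp  # update
        ms_p.append(mp); Ps_p.append(Pp); ms_f.append(m); Ps_f.append(P)
    ms_s, Ps_s = list(ms_f), [P.copy() for P in Ps_f]
    for k in range(len(ys) - 2, -1, -1):             # backward RTS pass
        G = Ps_f[k] @ A.T @ np.linalg.inv(Ps_p[k + 1])
        ms_s[k] = ms_f[k] + G @ (ms_s[k + 1] - ms_p[k + 1])
        Ps_s[k] = Ps_f[k] + G @ (Ps_s[k + 1] - Ps_p[k + 1]) @ G.T
    return np.array(Ps_f), np.array(Ps_s)

def uncertainty_reduction(c):
    """x drives y with coupling c; only the effect y is observed."""
    A = np.array([[0.9, 0.0], [c, 0.8]])
    Q = np.diag([0.5, 0.1])
    H = np.array([[0.0, 1.0]])
    R = np.array([[0.05]])
    ys = simulate(A, Q, H, R, 500, np.random.default_rng(0))
    Pf, Ps = filter_and_smooth(A, Q, H, R, ys)
    var_f = Pf[:, 0, 0].mean()  # uncertainty in the cause from past data only
    var_s = Ps[:, 0, 0].mean()  # uncertainty after assimilating future effects
    return 1.0 - var_s / var_f  # relative reduction from future information

print(uncertainty_reduction(0.8))  # coupled: future effects inform the cause
print(uncertainty_reduction(0.0))  # uncoupled: essentially no reduction
```

When `c` is nonzero, future observations of the effect carry information about the earlier state of the cause, so the smoother variance drops well below the filter variance; when `c = 0`, the two coincide. ACI builds on this interpolation-versus-extrapolation contrast while handling nonlinear, high-dimensional, and intermittent regimes that a fixed linear-Gaussian model like this one cannot.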