Topology-Aware Attention Framework for Time-Series Forecasting
A new research paper introduces a topology-aware attention framework that augments time-series forecasting with geometric structure derived from persistent homology and anchored Euler characteristic transforms. Evaluated across three architecture families, the framework uses a validation-gated local residual that applies local topological corrections only when they are supported by held-out data. The study follows a strict no-leakage protocol with separate train-only calibration, validation-only selection, and test-only reporting stages.
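The abstract does not specify how the gate is implemented; a minimal sketch of one plausible interpretation, in which a local residual correction is accepted only if it lowers held-out error, follows (the function name, the MSE criterion, and the synthetic data are all illustrative assumptions, not the paper's actual rule):

```python
import numpy as np

def validation_gated_residual(base_pred_val, residual_pred_val, y_val):
    """Accept a local residual correction only when it lowers validation MSE.

    Illustrative sketch: the paper's actual gating criterion may differ.
    """
    mse_base = np.mean((y_val - base_pred_val) ** 2)
    mse_gated = np.mean((y_val - (base_pred_val + residual_pred_val)) ** 2)
    # Gate: keep the residual only if held-out error strictly improves.
    return mse_gated < mse_base

# Usage: synthetic validation set where the residual is a perfect correction.
y_val = np.array([1.0, 2.0, 3.0])
base = np.array([0.8, 1.9, 3.3])
resid = y_val - base
print(validation_gated_residual(base, resid, y_val))  # True
```

The point of gating on held-out data rather than training data is that a local topological signal is admitted only when it generalizes, which is consistent with the paper's stated aim.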
Key facts
- arXiv:2605.03163
- Announce Type: cross
- Abstract introduces topology-aware attention framework
- Uses persistent homology (H0-H2)
- Uses anchored Euler characteristic transforms
- Uses kernel-Hilbert channels
- Validation-gated local residual captures local topological signals
- Includes Zeng-style local H0 component
- Evaluates both exact Vietoris-Rips computations and smooth topological surrogates
- No-leakage protocol: train-only calibration, validation-only selection, test-only reporting
- Evaluated on three architecture families: lightweight attention/Ridge, PatchTSTForRegression, TimeSeriesTra
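The three-stage no-leakage protocol listed above can be sketched as follows. The ridge regressor, chronological split sizes, and normalization step are illustrative assumptions; only the stage separation (calibrate on train, select on validation, report on test) comes from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic features/targets, split chronologically (sizes are arbitrary).
X = rng.normal(size=(300, 4))
y = X @ np.array([0.5, -1.0, 0.2, 0.0]) + 0.1 * rng.normal(size=300)
X_tr, X_va, X_te = X[:200], X[200:250], X[250:]
y_tr, y_va, y_te = y[:200], y[200:250], y[250:]

# Stage 1: train-only calibration (normalization stats from training data only).
mu, sd = X_tr.mean(axis=0), X_tr.std(axis=0)
scale = lambda Z: (Z - mu) / sd

def fit_ridge(X, y, lam):
    # Closed-form ridge regression.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Stage 2: validation-only selection of the ridge penalty.
lams = [0.01, 0.1, 1.0, 10.0]
val_mse = [
    np.mean((y_va - scale(X_va) @ fit_ridge(scale(X_tr), y_tr, lam)) ** 2)
    for lam in lams
]
best_lam = lams[int(np.argmin(val_mse))]

# Stage 3: test-only reporting with the selected model; the test split is
# never touched during calibration or selection.
w = fit_ridge(scale(X_tr), y_tr, best_lam)
test_mse = np.mean((y_te - scale(X_te) @ w) ** 2)
print(f"selected lambda={best_lam}, test MSE={test_mse:.4f}")
```

Keeping each stage on a disjoint split is what prevents information from the test period leaking into either the calibration statistics or the model-selection step.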