Interval POMDP Shielding for Safe Autonomous Systems
A new paper on arXiv proposes a method to enforce safety in autonomous systems with imperfect perception. The approach uses interval Partially Observable Markov Decision Processes (POMDPs) to model perception uncertainty estimated from finite labeled data. From that model, a runtime shield is constructed that blocks unsafe actions and provides finite-horizon safety guarantees with high probability, provided the true perception rates lie within the learned confidence intervals.
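The interval-construction step can be sketched as follows. This is a minimal illustration using a Hoeffding-style concentration bound; the function name, sample counts, and `delta` parameter are assumptions for illustration, not the paper's exact construction.

```python
import math

def perception_rate_interval(correct, total, delta):
    """Two-sided Hoeffding-style confidence interval for a perception rate.

    With probability >= 1 - delta over the `total` labeled samples, the
    true rate lies in [lo, hi]. (Illustrative; the paper may use a
    different concentration bound.)
    """
    p_hat = correct / total
    eps = math.sqrt(math.log(2.0 / delta) / (2.0 * total))
    return max(0.0, p_hat - eps), min(1.0, p_hat + eps)

# E.g., 950 correct detections out of 1000 labeled frames, 99% confidence:
lo, hi = perception_rate_interval(950, 1000, delta=0.01)
```

Wider intervals (smaller `delta` or fewer samples) make the downstream shield more conservative, trading permissiveness for a stronger probabilistic guarantee.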
Key facts
- arXiv paper 2604.20728
- Interval POMDP shielding for imperfect-perception agents
- System dynamics are known, but perception uncertainty is estimated from finite labeled data
- Confidence intervals built for perception outcome probabilities
- Conservative belief set computed consistent with observations
- Finite-horizon safety guarantee with high probability over training data
- Shield blocks actions that could violate safety
- Assumes true perception uncertainty rates lie within learned intervals
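Taken together, the steps above amount to a pessimistic belief update over the learned intervals followed by a worst-case finite-horizon reachability check. The sketch below is a two-state toy illustration; all probabilities, the per-step risk model, and the function names are assumptions, not the paper's construction.

```python
def pessimistic_posterior_unsafe(prior_unsafe, lik_obs_safe, lik_obs_unsafe):
    # Upper bound on P(unsafe | observation) over all observation
    # probabilities consistent with the learned intervals: take the
    # high end for the unsafe state and the low end for the safe state.
    lo_safe, _ = lik_obs_safe
    _, hi_unsafe = lik_obs_unsafe
    num = prior_unsafe * hi_unsafe
    den = num + (1.0 - prior_unsafe) * lo_safe
    return num / den if den > 0.0 else 1.0

def worst_case_reach_unsafe(b_hi_unsafe, step_risk, horizon):
    # Pessimistic probability of entering the unsafe region within
    # `horizon` steps, assuming known dynamics with per-step
    # transition risk `step_risk` (simple monotone recursion).
    p = b_hi_unsafe
    for _ in range(horizon):
        p = p + (1.0 - p) * step_risk
    return min(1.0, p)

def shield_allows(action_step_risk, b_hi_unsafe, horizon, threshold):
    # The shield blocks any action whose worst-case finite-horizon
    # violation probability exceeds the safety threshold.
    return worst_case_reach_unsafe(b_hi_unsafe, action_step_risk, horizon) <= threshold

# Assumed interval likelihoods of observing "clear" in each hidden state,
# as learned from labeled data:
lik_clear_safe = (0.93, 1.0)    # [lo, hi] for P("clear" | safe)
lik_clear_unsafe = (0.0, 0.09)  # [lo, hi] for P("clear" | unsafe)
b_hi = pessimistic_posterior_unsafe(0.05, lik_clear_safe, lik_clear_unsafe)
print(shield_allows(0.001, b_hi, horizon=10, threshold=0.05))  # True: action permitted
```

Because every interval is resolved pessimistically, the guarantee holds for any perception rates inside the learned intervals, which is exactly the assumption the paper makes explicit.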
Entities
Institutions
- arXiv