Distill-Belief: Efficient Closed-Loop Source Localization via Teacher-Student Framework
A recent arXiv preprint (2604.26095) presents Distill-Belief, a teacher-student framework for closed-loop inverse source localization and characterization (ISLC). The approach addresses the tension between precise uncertainty estimation and computational efficiency in belief-space planning. A Bayes-correct particle-filter teacher generates rich information-gain signals, while a lightweight student distills the posterior into belief statistics for control and an uncertainty certificate for termination. At deployment only the student runs, giving a constant per-step cost. Experiments on seven field modalities demonstrate source localization and inference of latent field parameters under tight time budgets.
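The preprint's internals are not spelled out here, but the teacher's role can be illustrated with a minimal sketch: a particle filter over a hypothetical 2-D source position, with a Monte-Carlo estimate of expected entropy reduction serving as the information-gain signal for measurement selection. The field model, noise level, and candidate measurement sites below are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def field(src, x):
    # Hypothetical field model (assumption): intensity decays with
    # distance from the source.
    d = np.linalg.norm(x - src, axis=-1)
    return 1.0 / (1.0 + d**2)

def pf_update(particles, weights, x, z, sigma=0.05):
    # Bayes-correct particle-filter update: reweight each particle by the
    # Gaussian measurement likelihood p(z | particle, x).
    pred = field(particles, x)
    lik = np.exp(-0.5 * ((z - pred) / sigma) ** 2)
    w = weights * lik
    return w / w.sum()

def entropy(weights):
    w = weights[weights > 0]
    return -np.sum(w * np.log(w))

def expected_info_gain(particles, weights, x, sigma=0.05, n_samples=20):
    # Monte-Carlo estimate of expected posterior-entropy reduction from
    # measuring at location x -- the kind of information-gain signal a
    # particle-filter teacher could supply.
    h0 = entropy(weights)
    gains = []
    for _ in range(n_samples):
        src = particles[rng.choice(len(particles), p=weights)]
        z = field(src, x) + sigma * rng.standard_normal()
        gains.append(h0 - entropy(pf_update(particles, weights, x, z, sigma)))
    return float(np.mean(gains))

# Toy closed loop: 2-D source, uniform prior over the unit square, greedy
# selection among a few candidate measurement sites.
true_src = np.array([0.7, 0.3])
particles = rng.uniform(0, 1, size=(500, 2))
weights = np.full(500, 1 / 500)
candidates = [np.array([0.2, 0.2]), np.array([0.7, 0.3]), np.array([0.9, 0.9])]

for _ in range(10):
    gains = [expected_info_gain(particles, weights, c) for c in candidates]
    x = candidates[int(np.argmax(gains))]
    z = field(true_src, x) + 0.05 * rng.standard_normal()
    weights = pf_update(particles, weights, x, z)

estimate = (weights[:, None] * particles).sum(axis=0)
```

In this sketch the expensive part is `expected_info_gain`, which resimulates the filter for every candidate; the paper's distillation step is precisely what removes that cost from the deployed loop.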
Key facts
- Distill-Belief is a teacher-student framework for ISLC
- Uses a Bayes-correct particle-filter teacher to generate information-gain signals
- Student distills the posterior into belief statistics and an uncertainty certificate
- Deployment uses only the student with constant per-step cost
- Tested on seven field modalities
- Addresses reward hacking in learned belief models
- Published on arXiv with ID 2604.26095
- Closed-loop mobile agent selects measurements under time constraints
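To make the student side of the key facts concrete, here is a hand-coded stand-in (not the paper's learned student) for condensing a particle posterior into compact belief statistics and a scalar uncertainty certificate used as a termination rule. The choice of mean/covariance as the statistics and trace-of-covariance with threshold `tol` as the certificate are illustrative assumptions.

```python
import numpy as np

def belief_statistics(particles, weights):
    # Condense the particle posterior into compact belief statistics of the
    # kind a distilled student could emit: posterior mean and covariance.
    mean = (weights[:, None] * particles).sum(axis=0)
    centered = particles - mean
    cov = (weights[:, None] * centered).T @ centered
    return mean, cov

def uncertainty_certificate(cov, tol=1e-2):
    # Scalar certificate: total posterior variance (trace of covariance).
    # Termination rule (assumption): stop once the certificate drops
    # below `tol`.
    cert = float(np.trace(cov))
    return cert, cert < tol

# Toy posterior: particles tightly clustered around a located source.
rng = np.random.default_rng(1)
particles = np.array([0.7, 0.3]) + 0.01 * rng.standard_normal((200, 2))
weights = np.full(200, 1 / 200)

mean, cov = belief_statistics(particles, weights)
cert, done = uncertainty_certificate(cov)
```

Because these statistics are cheap fixed-size summaries, evaluating them gives the constant per-step deployment cost the summary attributes to the student.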
Entities
Institutions
- arXiv