Latent Grammar Flow Framework Introduced for Neuro-Symbolic ODE Discovery
A novel framework called Latent Grammar Flow (LGF) has been introduced for extracting ordinary differential equations from observational data. The method embeds equations as grammar-based representations in a discrete latent space, where a behavioral loss places semantically similar equations close together. A discrete flow model then guides sampling, generating candidate equations that fit the observed data. Domain knowledge and constraints, such as stability, can be incorporated into the grammar rules or act as conditional predictors. By producing interpretable and transferable symbolic formulations, the framework aims to go beyond black-box models in the understanding of both natural and engineered systems. The research, identified as arXiv:2604.16232, contributes to computer science and machine learning, emphasizing greater transparency in symbolic representations than traditional neural-network methods.
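To make the idea of a grammar-based representation in a discrete latent space concrete, here is a minimal sketch: a toy context-free grammar for ODE right-hand sides, where an equation is encoded as a sequence of production-rule indices. The rule set and encoding below are illustrative assumptions, not the grammar actually used by LGF.

```python
# Hypothetical toy grammar for right-hand sides dx/dt = f(x).
# Each candidate equation is a discrete sequence of rule indices --
# this sequence plays the role of the discrete latent code.
GRAMMAR = {
    "E": [
        ("E", "+", "E"),        # rule 0: addition
        ("E", "*", "E"),        # rule 1: multiplication
        ("sin", "(", "E", ")"),  # rule 2: sine
        ("x",),                  # rule 3: state variable
        ("c",),                  # rule 4: constant
    ],
}

def expand(rule_choices):
    """Expand the start symbol 'E', consuming rule indices left to right."""
    choices = iter(rule_choices)

    def helper(symbol):
        if symbol != "E":
            return symbol
        production = GRAMMAR["E"][next(choices)]
        return "".join(helper(s) for s in production)

    return helper("E")

# E -> E+E -> x+E -> x+E*E -> x+c*E -> x+c*x
print(expand([0, 3, 1, 4, 3]))  # -> x+c*x
```

Because every latent code decodes through the grammar, every sampled candidate is a syntactically valid equation by construction, which is one way domain constraints can be baked into the rules.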
Key facts
- Latent Grammar Flow (LGF) is a neuro-symbolic generative framework
- LGF discovers ordinary differential equations from data
- Equations are embedded as grammar-based representations in discrete latent space
- Behavioral loss positions semantically similar equations closer together
- Discrete flow model guides sampling to generate candidate equations
- Domain knowledge and constraints can be embedded into rules
- Framework provides interpretability beyond black-box models
- Research published on arXiv with identifier 2604.16232
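The behavioral loss mentioned above can be illustrated with a small sketch: two candidate equations are compared by the trajectories they generate rather than by their syntax, so behaviorally equivalent equations end up at distance zero even when written differently. The integrator and distance below are simplifying assumptions, not the paper's actual loss.

```python
import numpy as np

def trajectory(f, x0=1.0, dt=0.01, steps=200):
    """Integrate dx/dt = f(x) from x0 with forward Euler (illustrative)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return np.array(xs)

def behavioral_distance(f, g):
    """Mean squared gap between the trajectories of two candidate RHSs."""
    return float(np.mean((trajectory(f) - trajectory(g)) ** 2))

# Syntactically different but behaviorally identical: 2*x vs x + x.
d_same = behavioral_distance(lambda x: 2 * x, lambda x: x + x)
# Behaviorally different: exponential growth vs exponential decay.
d_diff = behavioral_distance(lambda x: 2 * x, lambda x: -2 * x)
print(d_same, d_diff)  # d_same is 0.0; d_diff is strictly positive
```

Training the latent space so that this kind of distance is small between nearby codes is what lets semantically similar equations cluster together.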
Entities
Institutions
- arXiv