SBSE and AI Foundation Models: A Research Roadmap
A recent research roadmap investigates the convergence of search-based software engineering (SBSE) and artificial intelligence foundation models (FMs), such as large language models (LLMs). For approximately 25 years, SBSE has been a vibrant field, employing metaheuristic search methods to tackle software engineering challenges across the software lifecycle. The roadmap surveys the current landscape, highlights open challenges, and proposes three pathways for collaboration: leveraging FMs to improve SBSE, using SBSE to advance FMs, and integrating the two approaches. The paper is available on arXiv under identifier 2505.19625.
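To make the core SBSE idea concrete, here is a minimal, illustrative sketch (not taken from the roadmap paper) of a metaheuristic, in this case a simple hill climber, searching for a test input that satisfies a hard-to-hit branch condition. The fitness function follows the classic "branch distance" style used in search-based test generation; the target value 42 and the step scheme are hypothetical choices for the example.

```python
import random

def branch_distance(x):
    """Toy branch-distance fitness: how far input x is from
    triggering the hypothetical branch `if x == 42:` (0 means covered)."""
    return abs(x - 42)

def hill_climb(start, steps=5000, seed=0):
    """Simple hill climber: repeatedly try a neighbouring input and
    keep it if it gets us closer to covering the target branch."""
    rng = random.Random(seed)
    current = start
    for _ in range(steps):
        neighbour = current + rng.choice([-1, 1])
        if branch_distance(neighbour) < branch_distance(current):
            current = neighbour
        if branch_distance(current) == 0:
            break  # branch covered; stop searching
    return current

print(hill_climb(0))
```

Real SBSE tools use richer representations (whole test suites, program patches, schedules) and stronger metaheuristics such as genetic algorithms, but the pattern is the same: encode the engineering goal as a fitness function and let search optimize it.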
Key facts
- Search-based software engineering (SBSE) integrates metaheuristic search with software engineering.
- SBSE has been active for about 25 years.
- The roadmap addresses the evolution of SBSE alongside AI foundation models (FMs).
- Foundation models include large language models (LLMs).
- Three core aspects are analyzed: using FMs to enhance SBSE, applying SBSE to advance FMs, and integrating SBSE and FMs.
- The paper is published on arXiv with identifier 2505.19625.
- The roadmap articulates current landscape, open challenges, and research directions.
- SBSE has demonstrated versatility across the software lifecycle, from requirements and design to testing and maintenance.