Ctx2Skill: Self-Evolving Framework for Context Learning in Language Models
A new arXiv paper (2604.27660) introduces Ctx2Skill, a self-evolving framework that enables language models to autonomously discover, refine, and select context-specific skills without human supervision or external feedback. The framework addresses context learning, in which LMs must reason over complex contexts that exceed their parametric knowledge. Manual annotation of such skills is prohibitively expensive, and automatic feedback signals are typically unavailable; Ctx2Skill sidesteps both problems with a multi-agent self-play loop in which a Challenger generates probing tasks that extract rules and procedures from the context into natural-language skills.
Key facts
- arXiv paper number: 2604.27660
- Title: From Context to Skills: Can Language Models Learn from Context Skillfully?
- Proposes Ctx2Skill framework
- Ctx2Skill is self-evolving and unsupervised
- Uses multi-agent self-play loop with a Challenger
- Addresses context learning for complex contexts
- Extracts skills as natural-language rules and procedures
- No human supervision or external feedback required
Entities
Institutions
- arXiv