Model Editing Technique Addresses Cold-Start Collapse in Generative Recommendation
A new approach called GenRecEdit adapts model editing techniques from NLP to generative recommendation (GR) systems, targeting cold-start collapse, in which accuracy on newly introduced items drops to near zero. Existing remedies retrain on cold-start interactions, which is computationally expensive and delayed by sparse feedback. GenRecEdit instead enables training-free knowledge injection, though GR complicates this: unlike natural language, it lacks explicit subject-object binding and stable token co-occurrence patterns. The method aims to improve recommendation accuracy for cold-start items in rapidly evolving catalogs.
Key facts
- Generative recommendation (GR) models suffer from cold-start collapse with accuracy dropping to near zero.
- Current solutions require retraining with cold-start interactions, hindered by sparse feedback and high cost.
- GenRecEdit is inspired by model editing in NLP for training-free knowledge injection.
- GR lacks explicit subject-object binding common in natural language.
- GR does not exhibit stable token co-occurrence patterns.
- The approach targets rapidly evolving recommendation catalogs.
- The paper is available on arXiv with ID 2603.14259.
- The arXiv announcement type is replace-cross (a revised, cross-listed submission).
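This digest does not describe GenRecEdit's actual mechanism. As a rough illustration only of what "training-free knowledge injection via model editing" can mean in general (a ROME-style rank-one weight update, not the paper's method), one can nudge a single linear layer so that a chosen key vector maps to a desired value vector while changing the layer as little as possible elsewhere; all names below are hypothetical:

```python
import numpy as np

def rank_one_edit(W, k, v_target, C=None):
    """Edit W so that W_new @ k == v_target, via a rank-one update.

    C is the (assumed) covariance of key activations; using the identity
    here for simplicity. This is a generic model-editing sketch, not
    GenRecEdit itself.
    """
    if C is None:
        C = np.eye(W.shape[1])
    Cinv_k = np.linalg.solve(C, k)
    residual = v_target - W @ k              # what the layer currently gets wrong
    return W + np.outer(residual, Cinv_k) / (k @ Cinv_k)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))                  # toy weight matrix of one layer
k = rng.normal(size=3)                       # "key": representation of a new cold-start item
v = rng.normal(size=4)                       # "value": desired output for that key
W2 = rank_one_edit(W, k, v)
print(np.allclose(W2 @ k, v))                # edited layer now maps k to v
```

The appeal of such edits in a cold-start setting is that they avoid retraining entirely: one closed-form update injects the new item's association. The caveats the summary raises (no explicit subject-object binding, unstable token co-occurrence in GR) are precisely what makes choosing the key and value vectors nontrivial outside NLP.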