Consistency Distillation Reduces Memorization in Diffusion Models
A new arXiv preprint (2604.23552) investigates how consistency distillation affects memorization in diffusion models. The study shows that distilling a teacher model that has memorized training data significantly reduces the memorization transferred to the student, while preserving or even improving sample quality. The authors explain this behavior with a theoretical analysis built on a random framework. The work addresses a critical gap in understanding how distillation, a common deployment step, reshapes memorization dynamics in generative models.
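The paper's code and exact setup aren't reproduced here, but the mechanism under discussion is easy to sketch. Below is a minimal PyTorch illustration of one consistency-distillation step, in which a student is trained so that its outputs at adjacent noise levels agree, with the teacher supplying the one-step denoising trajectory between them. The `Denoiser` toy network, the x0-prediction parameterization, the Euler teacher step, and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class Denoiser(nn.Module):
    """Toy MLP denoiser standing in for both teacher and student networks."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        # Condition on the scalar noise level t by concatenating it to the input.
        t_col = torch.full((x.shape[0], 1), float(t))
        return self.net(torch.cat([x, t_col], dim=1))

def consistency_distillation_step(student, teacher, ema_student, x0, t, dt, opt):
    """One consistency-distillation step: the student at noise level t is
    pulled toward the (frozen) EMA student evaluated one teacher step earlier,
    at level t - dt."""
    noise = torch.randn_like(x0)
    x_t = x0 + t * noise                          # noisy sample at level t
    with torch.no_grad():
        # One Euler step of the teacher's probability-flow ODE from t to t - dt,
        # using the x0-prediction parameterization d = (x - D(x, t)) / t.
        d = (x_t - teacher(x_t, t)) / t
        x_prev = x_t - dt * d
        target = ema_student(x_prev, t - dt)      # self-consistency target
    pred = student(x_t, t)
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Slow EMA update of the target network, as in consistency training.
    with torch.no_grad():
        for p_ema, p in zip(ema_student.parameters(), student.parameters()):
            p_ema.mul_(0.999).add_(p, alpha=0.001)
    return loss.item()

# Usage on a stand-in 2-D "training" batch:
teacher, student, ema = Denoiser(), Denoiser(), Denoiser()
ema.load_state_dict(student.state_dict())
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x0 = torch.randn(128, 2)
loss = consistency_distillation_step(student, teacher, ema, x0, t=1.0, dt=0.1, opt=opt)
```

Note that in this setup the student's regression targets come only from the teacher's one-step trajectory and the EMA copy of itself, never from the training data directly; that indirection is the channel through which memorization may, or per the paper largely may not, transfer.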
Key facts
- arXiv preprint 2604.23552 analyzes how consistency distillation affects memorization in diffusion models.
- Consistency distillation reduces the memorization transferred from teacher to student; a simple way to quantify this is sketched after this list.
- Sample quality is preserved or improved after distillation.
- The theoretical analysis uses a random framework to explain this behavior.
- The study examines how distillation, as an additional training phase, reshapes memorization.
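The paper's memorization metric isn't specified in this summary. A common proxy, used here purely for illustration, is the fraction of generated samples that fall unusually close to some training example; the function below, with its L2 metric and threshold, is a hypothetical sketch rather than the authors' protocol.

```python
import torch

def memorization_score(samples, train_data, threshold=0.1):
    """Fraction of generated samples whose nearest training example lies
    within `threshold` (a crude proxy for near-verbatim memorization)."""
    dists = torch.cdist(samples, train_data)   # (num_samples, num_train) L2 distances
    nearest = dists.min(dim=1).values          # distance to the closest training point
    return (nearest < threshold).float().mean().item()

# Comparing teacher and distilled student under this proxy:
#   memorization_score(teacher_samples, train_data)   # expected: higher
#   memorization_score(student_samples, train_data)   # expected: lower, per the paper
```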