DMGD: Training-Free Dataset Distillation via Diffusion Models
Dual Matching Guided Diffusion (DMGD) is a framework for training-free dataset distillation that combines semantic matching with distribution matching. It avoids fine-tuning the diffusion model: semantic alignment is obtained by optimizing the conditional likelihood during sampling, while an optimal-transport objective aligns the distribution of the synthetic samples with that of the original dataset. A dynamic guidance mechanism further improves the diversity of the synthetic data while keeping it aligned with the original dataset.
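The guided-sampling idea can be illustrated with a toy sketch: during generation, each update follows the gradient of the conditional log-likelihood, grad_x log p(c|x), steering samples toward the target class without any model fine-tuning. Everything below (the 2-D Gaussian class model, the step size, the noise scale) is an illustrative assumption, not the paper's actual diffusion setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical class-conditional model: p(x | c) = N(mu_c, I), uniform prior.
class_means = {0: np.array([2.0, 0.0]), 1: np.array([-2.0, 0.0])}

def semantic_guidance(x, c):
    """grad_x log p(c | x) for the toy Gaussian model.

    grad log p(c|x) = grad log p(x|c) - grad log p(x), where the marginal
    gradient is the posterior-weighted average of the class gradients.
    """
    logits = np.array([-0.5 * np.sum((x - m) ** 2) for m in class_means.values()])
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    grad_class = class_means[c] - x                      # grad log p(x | c) for N(mu_c, I)
    grad_marginal = sum(p * (m - x) for p, m in zip(probs, class_means.values()))
    return grad_class - grad_marginal

def guided_step(x, c, step=0.5, noise=0.05):
    """One guided update: follow the semantic gradient plus a little noise."""
    return x + step * semantic_guidance(x, c) + noise * rng.normal(size=x.shape)

x = rng.normal(size=2)          # start from noise
for _ in range(50):
    x = guided_step(x, c=0)
print(x)                        # drifts toward the class-0 mean (positive x-axis)
```

In the real method the analytic Gaussian gradient would be replaced by a guidance signal derived from the diffusion model itself, which is what removes the need for an auxiliary classifier.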
Key facts
- DMGD stands for Dual Matching Guided Diffusion.
- The framework is training-free, requiring no fine-tuning.
- Semantic Matching uses conditional likelihood optimization.
- A dynamic guidance mechanism improves synthetic data diversity.
- Optimal transport (OT) based Distribution Matching aligns distributions.
- The approach addresses limitations of prior diffusion-based dataset distillation methods, such as the need for fine-tuning or auxiliary classifiers.
- The paper is available on arXiv with ID 2605.03877v1.
- The method eliminates the need for auxiliary classifiers.
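The OT-based distribution matching above can be sketched with entropic optimal transport (Sinkhorn iterations) between real and synthetic feature sets: a lower transport cost indicates better alignment of the synthetic distribution. The feature dimensions, the regularization strength, and the Sinkhorn formulation itself are assumptions for illustration; this summary does not specify the paper's exact OT objective.

```python
import numpy as np

def sinkhorn_cost(X, Y, eps=5.0, iters=200):
    """Entropic OT cost between the empirical distributions of rows of X and Y."""
    n, m = len(X), len(Y)
    # Pairwise squared-L2 ground cost.
    C = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)                      # Gibbs kernel
    a = np.full(n, 1.0 / n)                   # uniform weights on X
    b = np.full(m, 1.0 / m)                   # uniform weights on Y
    v = np.ones(m)
    for _ in range(iters):                    # Sinkhorn scaling iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]           # transport plan
    return np.sum(P * C)                      # transport cost under the plan

rng = np.random.default_rng(1)
real = rng.normal(loc=0.0, size=(64, 8))      # stand-in "real" features
close = rng.normal(loc=0.0, size=(16, 8))     # synthetic set near the real distribution
far = rng.normal(loc=3.0, size=(16, 8))       # synthetic set far from it

print(sinkhorn_cost(real, close) < sinkhorn_cost(real, far))  # True
```

In a distillation loop, a cost like this would serve as the distribution-matching signal pulling the synthetic set toward the real feature distribution; production code would typically use a dedicated OT library rather than this minimal implementation.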