MetaMoE: Privacy-Preserving Mixture-of-Experts Unification via Diversity-Aware Proxy Selection
MetaMoE is a privacy-preserving framework that unifies separately trained, domain-specific experts into a single Mixture-of-Experts (MoE) model, using public proxy data as surrogates for private data that cannot be accessed. It targets the setting where data is distributed across clients and cannot be shared due to privacy regulations. Its core component, diversity-aware proxy selection, identifies samples from public datasets that are both relevant to the private data distributions and diverse enough to cover them; these proxies supervise router training and align expert training for better unification. A context-aware router then adapts expert selection to heterogeneous inputs. MetaMoE is validated through experiments on computer vision and natural language processing tasks.
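The selection algorithm itself is not specified in this summary; the sketch below is one plausible reading, assuming relevance is scored by cosine similarity to a privacy-safe prototype embedding of each private domain and diversity is enforced by greedy max-min selection in embedding space. The function name `select_proxies` and the `relevance_weight` trade-off are illustrative, not from the paper.

```python
import numpy as np

def select_proxies(public_emb, domain_prototype, k, relevance_weight=0.5):
    """Greedy diversity-aware proxy selection (illustrative sketch, not the paper's algorithm).

    public_emb:       (N, d) embeddings of public candidate samples
    domain_prototype: (d,)   privacy-safe summary embedding of a private domain (assumed shareable)
    k:                number of proxies to select
    """
    norm = lambda x: x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-8)
    emb = norm(public_emb)
    # Relevance: cosine similarity of each candidate to the domain prototype.
    relevance = emb @ norm(domain_prototype)

    selected = [int(np.argmax(relevance))]  # seed with the most relevant sample
    for _ in range(k - 1):
        # Diversity: cosine distance to the nearest already-selected proxy.
        dists = 1.0 - emb @ emb[selected].T      # (N, |selected|)
        diversity = dists.min(axis=1)
        score = relevance_weight * relevance + (1 - relevance_weight) * diversity
        score[selected] = -np.inf                # never re-pick a sample
        selected.append(int(np.argmax(score)))
    return selected
```

A facility-location or determinantal-point-process objective would be a natural alternative for the diversity term; the greedy max-min rule above is chosen only for brevity.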
Key facts
- MetaMoE is a privacy-preserving framework for MoE unification.
- Uses public proxy data as surrogates for private data.
- Diversity-aware proxy selection picks relevant and diverse samples.
- Proxies supervise router learning and align expert training.
- Context-aware router improves expert selection (see the sketch after this list).
- Experiments conducted on computer vision and NLP tasks.
- Addresses distributed data privacy constraints.
- arXiv:2605.14289.
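The router architecture is likewise not detailed here; below is a minimal sketch of a context-aware router, assuming it is a small gating MLP over frozen domain experts. All names (`ContextAwareRouter`, `hidden`) are illustrative.

```python
import torch
import torch.nn as nn

class ContextAwareRouter(nn.Module):
    """Illustrative context-aware gate over pre-trained experts (assumed design)."""

    def __init__(self, dim, num_experts, hidden=128):
        super().__init__()
        # A small MLP maps the input context to per-expert logits.
        self.gate = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_experts),
        )

    def forward(self, x, experts):
        # x: (batch, dim) input representation; experts: list of frozen expert modules.
        weights = torch.softmax(self.gate(x), dim=-1)           # (batch, num_experts)
        outputs = torch.stack([e(x) for e in experts], dim=1)   # (batch, num_experts, out_dim)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)     # weighted mixture of experts
```

In MetaMoE's setting, such a gate would be trained only on the selected public proxies, which stand in for the inaccessible private distributions, per the key facts above.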