Mod-CL: Self-Supervised Learning for Automatic Modulation Classification
Mod-CL is a self-supervised learning (SSL) framework that improves automatic modulation classification (AMC) by exploiting intra-instance modulation consistency. Deep learning approaches to AMC are limited by the high cost of labeled data, and existing SSL methods typically rely on task-agnostic pretext objectives that entangle the learned representations with modulation-irrelevant factors such as symbol content, channel effects, and noise. Mod-CL instead constructs positive pairs from different temporal segments of the same signal, which preserves the modulation type while varying the underlying waveform, so the self-supervision signal aligns directly with the downstream classification task. Further details are in the arXiv preprint 2605.11875.
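The core mechanism, positive-pair construction by temporal cropping, can be illustrated with a minimal sketch. This assumes signals are stored as (2, T) I/Q arrays; the function name `sample_segment_pair`, the segment length, and the random-crop policy are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def sample_segment_pair(iq_signal, segment_len, rng=None):
    """Crop two random temporal segments from one I/Q signal.

    Both crops share the signal's modulation type but differ in symbol
    content, so they form a natural positive pair for contrastive
    pretraining. `iq_signal` is expected to have shape (2, T), with the
    I and Q channels along axis 0.
    """
    rng = rng or np.random.default_rng()
    _, total_len = iq_signal.shape
    # Two independent start offsets; the crops may overlap.
    starts = rng.integers(0, total_len - segment_len + 1, size=2)
    view_a = iq_signal[:, starts[0]:starts[0] + segment_len]
    view_b = iq_signal[:, starts[1]:starts[1] + segment_len]
    return view_a, view_b

# Example: a 1024-sample capture yields two 256-sample positive views.
signal = np.random.randn(2, 1024).astype(np.float32)  # stand-in for a real capture
view_a, view_b = sample_segment_pair(signal, segment_len=256)
```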
Key facts
- arXiv preprint 2605.11875 introduces Mod-CL.
- Mod-CL is a Modulation consistency-based Contrastive Learning framework.
- It addresses the high cost of labeled data in deep learning-based AMC.
- Existing SSL methods rely on task-agnostic pretext objectives.
- Intra-instance modulation consistency is identified as a task-aware prior.
- Positive pairs are constructed from different temporal segments of the same signal (a contrastive-loss sketch over such pairs follows this list).
- The model learns representations invariant to modulation-irrelevant factors: symbol content, channel effects, and noise.
- The paper is available at https://arxiv.org/abs/2605.11875.
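The preprint's exact training objective is not reproduced in this summary, so the sketch below uses a standard NT-Xent (InfoNCE) formulation, a common choice for segment-pair contrastive setups; the function name, temperature, and the absence of a projection head are all illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z_a, z_b, temperature=0.1):
    """NT-Xent (InfoNCE) loss over a batch of segment-pair embeddings.

    z_a[i] and z_b[i] embed two segments cropped from the same signal
    (a positive pair); every other sample in the batch serves as a
    negative. Both inputs have shape (N, D).
    """
    n = z_a.size(0)
    z = F.normalize(torch.cat([z_a, z_b], dim=0), dim=1)  # (2N, D), unit norm
    sim = z @ z.t() / temperature                          # pairwise cosine logits
    # Exclude self-similarity so a sample cannot match itself.
    mask = torch.eye(2 * n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(mask, float("-inf"))
    # The positive for row i is row i + N, and vice versa.
    targets = torch.cat([torch.arange(n, 2 * n, device=z.device),
                         torch.arange(n, device=z.device)])
    return F.cross_entropy(sim, targets)

# Example: embeddings of 32 segment pairs from an encoder (D = 128).
z_a, z_b = torch.randn(32, 128), torch.randn(32, 128)
loss = nt_xent_loss(z_a, z_b)
```

Negatives come from other signals in the batch, which usually carry different modulation types, so minimizing this loss tends to push the encoder toward modulation-discriminative yet waveform-invariant embeddings.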