Multi-Task Transformer with Orthogonal Decomposition for Clinical Data
Orthogonal Task Decomposition (OrthTD), a new multi-task learning framework for multimodal clinical data, is proposed. Built on a unified Transformer, it decomposes patient representations into shared and task-specific subspaces using geometric orthogonality constraints, reducing redundancy between the subspaces and mitigating the negative transfer that arises under hard parameter sharing. The method is evaluated on a cohort of 12,430 surgical patients.
Key facts
- OrthTD is a multi-task framework for multimodal clinical data.
- It uses a unified Transformer with Orthogonal Task Decomposition.
- Orthogonality constraints separate shared and task-specific subspaces.
- Aims to reduce redundancy and isolate task-specific signals.
- Evaluated on 12,430 surgical patients.
- Addresses negative transfer in hard parameter sharing.
- Proposed to balance shared and task-specific learning.
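The orthogonality constraint above can be sketched as a soft penalty that pushes the shared and task-specific representation matrices toward orthogonal subspaces. This is a minimal illustration, assuming the constraint is realized as a squared Frobenius-norm penalty on the cross-product of the two matrices; the paper's exact loss formulation is not given here.

```python
import numpy as np

def orthogonality_penalty(shared: np.ndarray, specific: np.ndarray) -> float:
    """Squared Frobenius norm of the cross-correlation between the shared
    and task-specific representation matrices (rows = patients, columns =
    latent dimensions). Driving this term toward zero encourages the two
    subspaces to be mutually orthogonal, so task-specific signal is not
    duplicated in the shared representation.

    NOTE: illustrative sketch only; the actual OrthTD loss may differ.
    """
    return float(np.linalg.norm(shared.T @ specific, ord="fro") ** 2)

# Toy example with 3 patients: shared spans latent dims 0-1,
# task-specific spans dim 2, so the subspaces are orthogonal.
shared = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
specific = np.array([[0.0], [0.0], [1.0]])
print(orthogonality_penalty(shared, specific))   # zero: no overlap

# Overlapping subspaces incur a positive penalty.
print(orthogonality_penalty(shared, shared[:, :1]))
```

Adding this penalty (weighted by a hyperparameter) to the sum of per-task losses is one common way such geometric constraints are imposed during training.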
- Published on arXiv with ID 2605.03570.