Cross-Window Knowledge Distillation Boosts Pulmonary CT Analysis
A new cross-window knowledge distillation method improves pulmonary CT analysis by transferring learned representations from a teacher encoder, trained on the most informative CT window, to student encoders that handle the remaining windows. Researchers evaluated it on three datasets: COPD-CT-DF (719 cases), RSNA PE (1,433 cases), and an in-house CTEPD set (161 cases), reporting significant AUC gains. On COPD-CT-DF, for example, per-window AUC rose from 0.75-0.81 to 0.90-0.94, a statistically significant gain of 10.1-16.5 percentage points. Improvements also held on RSNA PE and CTEPD, making the approach a promising tool for multi-window pulmonary CT analysis.
Key facts
- Cross-window knowledge distillation framework proposed for multi-window pulmonary CT
- Student encoders learn latent clinical priors from a teacher trained on the most informative window
- Evaluated on three cohorts: COPD-CT-DF (n=719), RSNA PE (n=1,433), CTEPD (n=161)
- Per-window AUC on COPD-CT-DF improved from 0.75-0.81 to 0.90-0.94 (10.1-16.5 percentage points, P<0.001)
- Ensemble AUC on COPD-CT-DF reached 0.9960
- RSNA PE AUC improved from 0.80-0.83 to 0.90-0.92
- CTEPD AUC improved from 0.6264 to 0.7481
- Distillation lets student encoders internalize latent pathological signatures that purely supervised training misses
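The article does not give implementation details, but the two core ingredients of such a framework, CT intensity windowing and a distillation loss that aligns student features to teacher features, can be sketched as follows. This is a minimal illustration, not the authors' method: the window settings, the MSE feature-alignment term, and the `alpha` weighting are all assumptions.

```python
import numpy as np

def window_clip(hu, center, width):
    # Apply a CT display window (center/width in Hounsfield units),
    # mapping the clipped range to [0, 1]. Window settings here are
    # illustrative, not taken from the paper.
    lo, hi = center - width / 2.0, center + width / 2.0
    return (np.clip(hu, lo, hi) - lo) / (hi - lo)

def distillation_loss(student_feat, teacher_feat, logits, labels, alpha=0.5):
    # Hypothetical student objective: supervised cross-entropy on the
    # student's own window, plus an MSE term pulling its latent features
    # toward the (frozen) teacher's features.
    align = np.mean((student_feat - teacher_feat) ** 2)
    # Numerically stable softmax cross-entropy.
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    ce = -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-9))
    return alpha * ce + (1.0 - alpha) * align

# Example: a lung window (center -600 HU, width 1500 HU) applied to raw HU values.
hu = np.array([-1000.0, 0.0, 1000.0])
lung = window_clip(hu, center=-600, width=1500)
```

In a full pipeline, the teacher would be trained on the most informative window first, then frozen while each student minimizes `distillation_loss` on its own window.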