ARTFEED — Contemporary Art Intelligence

Cross-Architecture Proxy Tuning Enables Training-Free Clinical LLM Adaptation

ai-technology · 2026-04-30

Cross-Architecture Proxy Tuning (CAPT) adapts new-generation general-domain large language models (LLMs) to clinical tasks without any additional training. It is a model-ensembling strategy that works across models with disjoint vocabularies, using contrastive decoding to selectively inject clinically relevant signals from a legacy clinical model while preserving the general-domain model's fluency and reasoning. Across six clinical classification and text-generation tasks, CAPT paired a new-generation general-domain model with an older clinical model and consistently outperformed both each model on its own and state-of-the-art ensembling baselines, with average gains of +17.6% over UniTE and +41.4% over proxy tuning. The method was validated through token-level analysis and physician-led case studies, removing the need for costly retraining with each new model generation in specialized fields such as healthcare.
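The contrastive-decoding idea the summary describes can be sketched in a few lines: steer the large general-domain model's next-token distribution by the logit difference between a clinical expert model and its untuned base. This is a minimal toy illustration of the proxy-tuning-style arithmetic, not the paper's implementation; all numbers and the shared 5-token vocabulary are hypothetical (CAPT's contribution is precisely that it drops the shared-vocabulary requirement).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a logit vector."""
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def proxy_tuned_logits(general, expert, base, alpha=1.0):
    """Proxy-tuning-style combination: shift the general model's logits
    by the expert-minus-base delta, scaled by alpha. All three vectors
    share one vocabulary in this toy; CAPT generalizes past that."""
    return general + alpha * (expert - base)

# Hypothetical logits over a toy 5-token vocabulary.
general = np.array([2.0, 1.0, 0.5, 0.1, -1.0])  # general-domain model
expert  = np.array([0.5, 2.5, 0.2, 0.0, -0.5])  # clinical expert
base    = np.array([0.5, 0.8, 0.2, 0.1, -0.5])  # expert's untuned base

probs = softmax(proxy_tuned_logits(general, expert, base))
# The expert-vs-base delta shifts the argmax from token 0 to token 1.
print(int(probs.argmax()))  # → 1
```

Only where the expert and its base disagree does the delta move probability mass, which is why the injection is selective: tokens the clinical fine-tune did not change leave the general model's distribution untouched.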

Key facts

  • CAPT enables training-free adaptation of new-generation LLMs using legacy clinical models.
  • CAPT supports models with disjoint vocabularies.
  • Contrastive decoding selectively injects clinically relevant signals.
  • Evaluated on six clinical classification and text-generation tasks.
  • CAPT outperforms both individual models and state-of-the-art ensembling approaches.
  • Average +17.6% over UniTE, +41.4% over proxy tuning across tasks.
  • Validated through token-level analysis and physician case studies.
  • Eliminates costly retraining for each new model generation.
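The disjoint-vocabularies bullet above implies some token-level alignment between the two models before their logits can be combined. A minimal exact-string-match projection is sketched below; this is our simplification for illustration, not CAPT's actual alignment procedure, and the vocabularies and fill value are hypothetical.

```python
def map_logits(src_logits, src_vocab, tgt_vocab, fill=0.0):
    """Project logits from a source vocabulary onto a target vocabulary
    by exact token-string match; target tokens absent from the source
    vocabulary receive the neutral value `fill`."""
    src_index = {tok: i for i, tok in enumerate(src_vocab)}
    return [src_logits[src_index[t]] if t in src_index else fill
            for t in tgt_vocab]

# Hypothetical vocabularies from two models with different tokenizers.
src_vocab = ["the", "patient", "has", "sepsis"]
tgt_vocab = ["the", "has", "patient", "fever"]

print(map_logits([0.1, 2.0, 0.3, 1.5], src_vocab, tgt_vocab))
# → [0.1, 0.3, 2.0, 0.0]
```

In practice subword tokenizers rarely align one-to-one, so a real cross-vocabulary ensemble must also handle tokens that split differently across the two models; the fill value above simply makes unmatched tokens contribute nothing to the contrastive delta.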
