ARTFEED — Contemporary Art Intelligence

US threatens action against Chinese AI 'distillation' copycats

ai-technology · 2026-04-24

The Trump administration has threatened to crack down on Chinese AI companies using 'distillation' to replicate US models cheaply. Analysts predict weaker Chinese start-ups could be forced out of the market within six to 12 months. Distillation trains a smaller 'student' model on the outputs of a more advanced 'teacher' model, allowing a model's capabilities to be replicated at much lower cost. Helen Toner of Georgetown University's Center for Security and Emerging Technology testified before the Senate on Wednesday about the technique. Beijing-based information systems architect Zhang Ruiwang noted that some Chinese firms claiming to 'self-develop' models rely heavily on distillation and lack original research. Even stronger developers use distillation to accelerate iteration; a crackdown could stretch their development cycles from three months to a year or more.
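The core of the technique mentioned above is simple: a student model is trained to match the teacher's output probability distribution rather than hard labels. A minimal numerical sketch of that loss is below; the logits, class count, and temperature are illustrative assumptions, not details from any company's actual system.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing more of the teacher's relative preferences between classes.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's and student's softened
    # distributions -- the quantity a student model minimizes during
    # distillation training.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -sum(p * math.log(q) for p, q in zip(p_teacher, p_student))

# Hypothetical logits for a 3-class example (illustrative values only).
teacher = [4.0, 1.0, -1.0]
student = [2.5, 0.5, -0.5]
loss = distillation_loss(student, teacher)
```

In practice this loss is computed over a teacher's responses to large batches of prompts, which is why querying a deployed model's API can substitute for the original training data and research effort.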

Key facts

  • Trump administration threatens action against AI distillation by Chinese rivals.
  • Distillation trains a smaller 'student' model on outputs from a 'teacher' model.
  • Helen Toner testified before the Senate on Wednesday.
  • Toner is interim executive director at Georgetown University's Center for Security and Emerging Technology.
  • Zhang Ruiwang is a Beijing-based information systems architect.
  • Weaker Chinese start-ups could be forced out within six to 12 months.
  • Some Chinese firms claim to 'self-develop' models but rely on distillation.
  • Development cycles may lengthen from three months to a year or more.

Entities

Institutions

  • Georgetown University
  • Center for Security and Emerging Technology
  • Senate

Locations

  • United States
  • China
  • Beijing

Sources