ARTFEED — Contemporary Art Intelligence

China's race to build 10,000-card AI computing clusters

ai-technology · 2026-05-05

Cities and tech giants across China are racing to build massive 10,000-card computing clusters: systems linking 10,000 or more AI accelerator chips to speed model training and cut costs. These clusters function as supercomputers, integrating high-performance GPUs with advanced storage. Domestic champions such as Huawei Technologies, Alibaba Group Holding, and GPU specialist Moore Threads are competing to put their chips at the center of these systems, which are increasingly viewed as a new form of infrastructure and have sparked an arms race among cities and technology companies.
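The scaling logic behind these clusters can be sketched with a simple back-of-the-envelope model. The numbers and the fixed-overhead assumption below are illustrative, not from the article: they show why spreading a training job over 10,000 cards can collapse wall-clock time, while synchronization costs keep the speedup short of perfectly linear.

```python
# Illustrative sketch (hypothetical numbers): estimating wall-clock training
# time when a fixed job is spread across N accelerator cards, assuming
# data-parallel training with an Amdahl's-law-style non-parallel fraction
# (e.g. gradient synchronization and other communication overhead).

def training_days(total_work_days: float, cards: int, overhead: float = 0.05) -> float:
    """Estimate wall-clock days to train on `cards` accelerators.

    total_work_days: time one card would need alone (hypothetical figure).
    overhead: fraction of the work assumed not to parallelize.
    """
    serial = total_work_days * overhead            # part that stays sequential
    parallel = total_work_days * (1 - overhead) / cards  # part split across cards
    return serial + parallel

# A hypothetical job worth ~10,000 card-days of compute:
print(training_days(10_000, 1))       # 10000.0 days on one card
print(training_days(10_000, 10_000))  # 500.95 days on a 10,000-card cluster
```

Under these assumptions the cluster delivers roughly a 20x reduction rather than the ideal 10,000x, which is why vendors compete on interconnect and storage as much as on the chips themselves.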

Key facts

  • Cities and tech giants across China are racing to build 10,000-card computing clusters for AI.
  • Each cluster links 10,000 or more AI accelerator chips.
  • Clusters function as supercomputers, combining high-performance GPUs with advanced storage.
  • They enable faster AI iteration and shorter model training times.
  • Huawei Technologies, Alibaba Group Holding, and Moore Threads are key players.
  • The clusters are viewed as a new form of infrastructure in China.
  • The goal is faster training, lower costs, and wider AI adoption.

Entities

Institutions

  • Huawei Technologies
  • Alibaba Group Holding
  • Moore Threads

Locations

  • China
