ARTFEED — Contemporary Art Intelligence

Google DeepMind unveils Decoupled DiLoCo for distributed AI training

ai-technology · 2026-04-24

Google DeepMind and Google Research have introduced Decoupled DiLoCo, a framework for training large language models across geographically distant data centers. The system improves hardware resilience and reduces bandwidth requirements by splitting a training run into separate 'islands' of compute: a fault in one island stays isolated locally while the remaining islands keep learning. Building on earlier systems such as Pathways and DiLoCo, it removes the tight synchronization that previously limited globally distributed training. In tests with Gemma 4 models, the approach maintained availability and matched the ML performance of conventional methods even during hardware failures. The team trained a 12-billion-parameter model across four U.S. regions over 2-5 Gbps wide-area links, more than 20 times faster than conventional synchronization methods. The project was led by Arthur Douillard and a team across Google DeepMind and Google Research.

Key facts

  • Decoupled DiLoCo is a new distributed architecture for training LLMs across distant data centers.
  • It divides training runs into decoupled 'islands' of compute (learner units) with asynchronous data flow.
  • The approach isolates local disruptions so other parts continue learning efficiently.
  • It builds on earlier systems such as Pathways and DiLoCo.
  • In tests with Gemma 4 models, availability was maintained and ML performance matched conventional training despite hardware failures.
  • Successfully trained a 12 billion parameter model across four U.S. regions using 2-5 Gbps wide-area networking.
  • Achieved results more than 20 times faster than conventional synchronization methods.
  • Enables mixing different hardware generations (e.g., TPU v6e and TPU v5p) in a single training run.
  • The work was done by a team across Google DeepMind and Google Research.
  • Leads include Arthur Douillard, Keith Rush, Yani Donchev, Zachary Charles, Ayush Dubey, Blake Woodworth, Ionel Gog, Josef Dean, Nova Fallen, Zachary Garrett.
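
The 'islands' pattern above follows the DiLoCo family of methods: each island runs many local optimizer steps independently, and islands synchronize only occasionally by exchanging a pseudo-gradient (the difference between the global parameters and each island's locally updated copy). The sketch below illustrates that pattern only; it is not Decoupled DiLoCo's actual implementation. The toy least-squares loss, hyperparameters, and the plain heavy-ball outer optimizer (DiLoCo papers use Nesterov momentum) are all illustrative assumptions.

```python
import numpy as np

def local_steps(w, shard, lr=0.1, steps=10):
    """Inner phase: one island runs `steps` SGD updates on its own data shard
    (toy least-squares loss), with no communication to other islands."""
    X, y = shard
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def diloco_round(w_global, shards, outer_lr=0.7, momentum=None, beta=0.9):
    """Outer phase: islands train independently from the same starting point,
    then the averaged pseudo-gradient (global minus local parameters) drives
    a single momentum step on the global parameters."""
    local_params = [local_steps(w_global.copy(), shard) for shard in shards]
    pseudo_grad = np.mean([w_global - w_l for w_l in local_params], axis=0)
    if momentum is None:
        momentum = np.zeros_like(w_global)
    momentum = beta * momentum + pseudo_grad          # heavy-ball accumulation
    w_global = w_global - outer_lr * momentum          # infrequent global sync
    return w_global, momentum
```

Because only `w_global` and the pseudo-gradients cross island boundaries, and only once per round, the wide-area bandwidth needed is a small fraction of what per-step gradient synchronization would require.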

Entities

Institutions

  • Google DeepMind
  • Google Research

Locations

  • United States
