ARTFEED — Contemporary Art Intelligence

Google Partners with Marvell on Inference AI Chips to Compete with Nvidia

ai-technology · 2026-04-20

In a bid to rival Nvidia, Google is collaborating with Marvell Technology to create AI inference chips. The company is set to unveil its next-generation tensor processing units this week at the Google Cloud Next conference in Las Vegas, with subsequent chips tailored for inference. The initiative lets Alphabet address the increasing demand for AI software. Gartner analyst Chirag Dekate said the competitive battleground is shifting toward inference, the stage where trained models field queries and produce outputs. Google Chief Scientist Jeff Dean highlighted that it is becoming sensible to differentiate between chips for training and chips for inference. Demand for Google’s TPUs has skyrocketed, especially after Meta struck a multibillion-dollar agreement to procure them through Google Cloud. Anthropic has expanded its TPU capacity to as many as 1 million chips and has a separate deal with Broadcom for chips expected in 2027. Supply limitations could still impede Google’s goals, however, as top teams are prioritized for the available chips.

Key facts

  • Google is developing inference AI chips with Marvell Technology
  • The company plans to announce new TPU generations at Google Cloud Next in Las Vegas
  • Inference is the stage where AI models field queries and produce outputs
  • Gartner analyst Chirag Dekate said the battleground is shifting towards inference
  • Google Chief Scientist Jeff Dean noted that specializing chips for training versus inference is becoming sensible
  • Meta struck a multibillion-dollar agreement to procure TPUs via Google Cloud
  • Anthropic signed a deal with Broadcom for chips enabling 3.5 gigawatts of computing power starting in 2027
  • Nvidia CEO Jensen Huang said his company's chips can handle applications "you can't do with TPUs"

Entities

Institutions

  • Google
  • Marvell Technology
  • Nvidia
  • Alphabet
  • Bloomberg
  • Gartner
  • Google Cloud Next
  • Meta
  • Anthropic
  • Broadcom
  • PyTorch
  • GTC

Locations

  • Las Vegas
  • United States

Sources