- Intel and Google signed a multi-year agreement to keep Xeon in the cloud infrastructure
- Google Cloud instances C4 and N4 are already running on Xeon 6 processors
- Intel and Google jointly develop custom IPUs for networking and storage
Intel and Google have announced a multi-year collaboration that will keep Intel Xeon processors at the heart of the Google Cloud infrastructure for the foreseeable future.
The deal spans multiple generations of Xeon chips and includes systems used for AI workloads, inference tasks and general computing across Google’s global data centers.
Google Cloud instances like C4 and N4 already rely on Xeon 6 processors, and this deal ensures that pattern continues.
Why CPUs still matter in an era of specialized AI hardware
“AI is reshaping how infrastructure is built and scaled,” said Lip-Bu Tan, CEO of Intel.
“Scaling AI requires more than accelerators – it requires balanced systems. CPUs and IPUs are central to delivering the performance, efficiency and flexibility modern AI workloads demand.”
The announcement comes at a time when many hyperscalers are accelerating the adoption of custom Arm-based processors for AI tasks.
Counterpoint Research recently claimed that 90% of AI servers running custom silicon will rely on the Arm instruction set architecture, leaving x86 with only a small share of new implementations.
To ensure Xeon remains relevant, Intel and Google are also jointly developing custom infrastructure processing units designed to handle network, storage and security workloads.
These IPUs act as ASIC-based accelerators that shift infrastructure tasks away from host CPUs, freeing up Xeon processors to focus on application execution.
This separation improves system efficiency and resource allocation across large cloud deployments running AI tools, AI agents, and large language models.
“CPUs and infrastructure acceleration remain a cornerstone of AI systems—from training orchestration to inference and deployment,” said Amin Vahdat, SVP and Chief Technologist of AI Infrastructure at Google.
Google currently deploys both Xeon 5 and Xeon 6 processors across multiple service tiers, and these deployments run alongside its own custom Arm-based Axion processors in other parts of its infrastructure stack.
Intel and Google say the collaboration on CPUs and IPUs will extend across future system generations, with integration work continuing throughout the layers of Google's cloud infrastructure. Both companies maintain that CPUs paired with infrastructure accelerators remain a standard design pattern in large-scale distributed cloud systems.
Many workloads running in Google data centers require backward compatibility with x86 architecture, while others need maximum single-threaded performance that Xeon CPUs provide.
These requirements are expected to persist for years, which explains why Intel and Google signed this multi-year agreement.