- Nvidia is investing $2 billion to bring Marvell into the NVLink Fusion ecosystem
- NVLink Fusion enables third-party accelerators to communicate efficiently with Nvidia GPUs
- Marvell provides custom XPUs and scale-out networking for heterogeneous AI infrastructure
Nvidia has invested $2 billion in Marvell Technology and entered into a strategic partnership that connects the custom chip designer to Nvidia’s AI factory ecosystem through NVLink Fusion.
NVLink Fusion enables third-party accelerators to communicate with Nvidia components over a high-bandwidth, low-latency interconnect, while maintaining compatibility with Nvidia’s rack-scale AI platforms.
The move integrates Marvell’s capabilities in high-performance analog, optical DSP, silicon photonics and custom XPUs with Nvidia’s GPU, CPU and network infrastructure.
Nvidia expands its AI ecosystem
“The inference boom has arrived. Demand for token generation is rising and the world is racing to build AI factories,” said Jensen Huang, founder and CEO of Nvidia.
“Together with Marvell, we’re enabling customers to leverage Nvidia’s AI infrastructure ecosystem and scale to build specialized AI computing.”
NVLink Fusion launched in May 2025 as a platform for heterogeneous AI infrastructure.
It allows non-Nvidia accelerators to communicate with Nvidia GPUs over a high-bandwidth fabric.
Marvell will provide custom XPUs and NVLink Fusion-compatible scale-out networks to the partnership.
Nvidia will provide Vera CPUs, ConnectX NICs, BlueField DPUs and NVLink interconnect components.
Each NVLink Fusion platform must include at least one Nvidia product to function, meaning that even customers who deploy Marvell-designed custom ASICs still generate revenue for Nvidia.
“By connecting Marvell’s leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to Nvidia’s expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure,” said Matt Murphy, chairman and CEO of Marvell.
Marvell reported $8.2 billion in revenue for its 2026 fiscal year, which ended January 2026, with data center revenue accounting for more than 74% of the total.
The company’s acquisition of Celestial AI late last year added photonic fabric technology to its portfolio, and the new deal places that capability within Nvidia’s ecosystem.
The two companies will also collaborate on silicon photonics technology and on transforming the world’s telecommunications networks into AI infrastructure using Nvidia’s Aerial AI-RAN for 5G and 6G networks.
Marvell isn’t the only company joining Nvidia’s proprietary ecosystem, as Samsung Foundry joined the NVLink Fusion program last October.
Arm followed soon after, entering the program in November, enabling its licensees to build NVLink-capable CPUs for a wider range of applications.
However, not all major chipmakers have signed up, as Nvidia competitors AMD, Intel and Broadcom remain absent from the program.
These competitors have instead backed the open UALink standard as a rival rack-scale interconnect, drawing a clear dividing line in the industry.
The absence of these rivals matters because Marvell is already helping Amazon develop its Trainium line of AI accelerators, a relationship that predates the Nvidia partnership by several years and puts the company in an unusual position between the two deals.
The announcement from Nvidia and Marvell did not address how the new partnership would affect the Trainium collaboration, leaving that question open.
Either way, Nvidia’s $2 billion investment draws a leading custom silicon designer into a proprietary ecosystem where Nvidia controls the interconnect standard.