- Nvidia controls processors and networks and forms the backbone of AI factories today
- Nvidia could soon control not just chips, but energy, models and applications
- Huang frames AI not as software, but as the foundation of modern industry
Nvidia CEO Jensen Huang recently described artificial intelligence through the metaphor of a multi-layered system.
The framework explains how modern AI systems work as an industrial chain rather than isolated software tools.
The structure consists of five layers: energy, chips, infrastructure, models and applications, which interact with industries and consumers.
How the AI stack works across layers
“Every successful application draws on every layer below it, right down to the power plant that keeps it alive,” Huang wrote, illustrating how intelligence generated in real-time depends on physical resources across the computing ecosystem.
Nvidia already dominates the processor layer, supplies networking technologies, and provides computing platforms for large data centers.
Its infrastructure footprint includes systems that link thousands of processors into machines capable of continuously generating intelligence.
These facilities, sometimes described as AI factories, require land, power supply and network systems to operate at scale.
Huang noted that the construction of new chip manufacturing facilities, computer assembly facilities and data centers is taking place in several regions.
“We’re a few hundred billion dollars into it,” he wrote. “Trillions of dollars in infrastructure still need to be built.”
The expansion reflects one of the largest industrial developments associated with modern computing.
At the top of the stack are applications that convert computing capacity into economic value.
Huang cited examples including drug discovery platforms, industrial robotics, legal analysis tools, and autonomous vehicles that act as physical embodiments of artificial intelligence.
“A self-driving car is an AI application incorporated into a machine,” he wrote. “A humanoid robot is an AI application incorporated into a body.”
These systems rely on models capable of processing language, images, scientific data, and real-world environments, increasing the demand for computing resources across the lower layers of the stack.
The framework also suggests how Nvidia could expand across the layers it described.
Companies that control a core technology sometimes extend into adjacent layers, much as Amazon did after building AWS.
Nvidia has actively expanded into network systems and large-scale computing infrastructure.
The company has also invested in areas such as photonics, which affect how data moves between computer systems.
If Nvidia pushes further into models, power supply or applications, it could operate across most of the layers described in Huang’s framework.
By framing AI as a layered stack, Nvidia isn’t just explaining the industry; it’s staking a claim across it.
From chips to infrastructure to applications, the company wants a stake in every layer.