- AMD says new Instinct MI350 series GPUs are 35x better at inferencing than their predecessors
- AMD claims it has exceeded its energy efficiency targets, and sets bold new goals
- MI400 GPUs will power future Helios AI racks
AMD has revealed its Instinct MI350 series GPUs, which promise a staggering 4x improvement in AI compute compared to the previous generation of chips - enough to have Nvidia worried about the market dominance of its Blackwell chips.
The company’s CEO Lisa Su also revealed details of the Helios AI rack, to be built on next-generation Instinct MI400 series GPUs as well as AMD EPYC Venice CPUs and AMD Pensando Vulcano NICs.
The news came at AMD’s Advancing AI 2025 conference, alongside a number of other hardware, software and AI announcements.
AMD’s latest chips pitted against Nvidia’s Blackwell
In addition to the 4x improvement in AI performance, AMD also boasts a full 35x generational improvement in inferencing, as well as price-performance gains delivering 40% more tokens-per-dollar compared to its key like-for-like rival, Nvidia’s B200.
Despite Nvidia’s market dominance, AMD proudly claims that seven of the 10 largest model builders and AI companies use its Instinct accelerators, including Meta, OpenAI, Microsoft and xAI.
The MI300X has been deployed for Llama 3 and 4 inference at Meta, and for proprietary and open-source models on Azure, among others.
In addition to performance, AMD also touts its environmental credentials, claiming that its MI350 series GPUs exceeded the company’s five-year target of a 30x improvement in the energy efficiency of AI training and high-performance computing nodes, reaching a figure of 38x.
By 2030, the company also aims to increase rack-scale energy efficiency 20x compared to 2024, and it already predicts a 95% reduction in the electricity required to train a typical AI model.
Looking ahead, the Instinct MI400 series GPUs are expected to deliver up to 10x higher inference performance on mixture-of-experts models.
Despite the bold claims, AMD’s market cap remains significantly lower than Nvidia’s, standing at $192.14 billion at the time of writing.
“AMD is driving AI innovation at an unprecedented pace, highlighted by the launch of our AMD Instinct MI350 series accelerators, advances in our next-generation AMD ‘Helios’ rack-scale solutions, and growing momentum for our ROCm open software stack,” Su said.
“We are entering the next phase of AI, driven by open standards, shared innovation and AMD’s expanding leadership across a broad ecosystem of hardware and software partners who are working together to define the future of AI.”
