- Samsung begins commercial HBM4 shipments as AI memory competition heats up
- HBM4 reaches 11.7 Gbps speeds while delivering higher bandwidth and efficiency gains for data centers
- Samsung scales production plans with roadmap extending to HBM4E and custom memory variants
Samsung says it has not only begun mass production of HBM4 memory, but also shipped the first commercial units to customers, which it claims is an industry first for the new generation of high-bandwidth memory.
HBM4 is built on Samsung’s sixth-generation 10nm-class DRAM process and uses a 4nm logic base die, which reportedly helped the South Korean memory giant achieve stable yields without redesigns as production ramped up.
That’s a technical claim that will likely be tested when large-scale deployments begin and independent performance results appear.
Up to 48 GB capacity
The new memory reaches a consistent per-pin transfer rate of 11.7 Gbps, with headroom up to 13 Gbps in certain configurations.
Samsung compares that to an industry baseline of 8 Gbps and puts HBM3E at 9.6 Gbps. Total memory bandwidth rises to 3.3 TB/s per stack, roughly 2.7 times that of the previous generation.
Capacities range from 24 GB to 36 GB in 12-layer stacks, with 16-layer versions coming later, which raises capacity to 48 GB for customers who need denser configurations.
Power consumption is a key issue as HBM interfaces widen: this generation doubles the pin count from 1,024 to 2,048.
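The headline bandwidth figures follow directly from the interface width and per-pin data rate. A quick sanity check (assuming the full 2,048-pin interface is used and 8 bits per byte; the function name is just for illustration):

```python
def stack_bandwidth_tbs(pins: int, gbps_per_pin: float) -> float:
    """Peak per-stack bandwidth in TB/s: pins * per-pin rate, divided by
    8 bits per byte, then by 1000 to go from GB/s to TB/s."""
    return pins * gbps_per_pin / 8 / 1000

# HBM4 at its 13 Gbps headroom rate over 2,048 pins
print(round(stack_bandwidth_tbs(2048, 13.0), 2))  # ~3.33 TB/s
# HBM3E at 9.6 Gbps over 1,024 pins
print(round(stack_bandwidth_tbs(1024, 9.6), 2))   # ~1.23 TB/s
```

The ratio of the two results is about 2.7, which matches the generational uplift Samsung quotes.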
Samsung says it improved power efficiency by about 40% over HBM3E through low-voltage through-silicon via (TSV) technology and power-distribution adjustments, along with thermal changes that improve heat dissipation and lower thermal resistance.
“Instead of taking the conventional path of using existing proven designs, Samsung took the leap and adopted the most advanced nodes such as 1c DRAM and the 4nm logic process for HBM4,” said Sang Joon Hwang, EVP and Head of Memory Development at Samsung Electronics.
“By leveraging our process competitiveness and design optimization, we are able to ensure significant performance freedom, enabling us to satisfy our customers’ increasing demands for higher performance when they need it.”
The company also points to its scale of production and in-house packaging as key reasons it can meet expected growth in demand.
That includes closer coordination between foundry and memory teams, as well as partnerships with GPU manufacturers and hyperscalers that build custom AI hardware.
Samsung says it expects its HBM business to grow strongly in 2026, with HBM4E sampling planned for later this year, and custom HBM samples to follow in 2027.
Whether competitors respond with similar timelines or faster alternatives will shape how long this early lead lasts.