- SOCAMM is a new modular memory form factor exclusive to Nvidia systems
- Micron says SOCAMM offers higher bandwidth, lower power consumption and a smaller footprint
- SK Hynix plans SOCAMM production as demand for AI infrastructure grows
At the recent Nvidia GTC 2025, memory makers Micron and SK Hynix took the wraps off their respective SOCAMM solutions.
This new modular memory form factor is designed to unlock the full potential of AI platforms and has been developed exclusively for Nvidia's Grace Blackwell platform.
SOCAMM, or Small Outline Compression Attached Memory Module, is based on LPDDR5X and aims to meet the growing performance and efficiency requirements of AI servers. The form factor reportedly offers higher bandwidth, lower power consumption and a smaller footprint than traditional memory modules such as RDIMMs and MRDIMMs. SOCAMM is specific to Nvidia's AI architecture and therefore cannot be used in AMD or Intel systems.
More cost-effective
Micron announced that it will be the first to ship SOCAMM products in volume, and its 128GB SOCAMM modules are designed for the Nvidia GB300 Grace Blackwell Ultra Superchip.
According to the company, the modules deliver more than 2.5 times the bandwidth of RDIMMs while using a third of the power.
The compact 14x90mm form factor is intended to support efficient server layouts and thermal management.
“AI is driving a paradigm shift in computing, and memory is at the heart of this evolution,” said Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit.
“Micron’s contributions to the Nvidia Grace Blackwell platform deliver performance and power-saving benefits for AI training and inference applications.”
SK Hynix also presented its own low-power SOCAMM solution at GTC 2025 as part of a wider AI memory portfolio.
Unlike Micron, the company did not go into much detail, but said it is positioning SOCAMM as a key offering for future AI infrastructure and plans to begin mass production “in line with the emergence of the market”.
“We are proud to present our line-up of industry-leading products at GTC 2025,” said Juseon (Justin) Kim, SK Hynix’s President and Head of AI Infra.
“With differentiated competitiveness in AI memory, we are on track to advance our future as a full-stack AI memory provider.”