- Samsung used an AMD Xilinx FPGA to power its SmartSSD storage device
- The drive promised to reduce enterprise dependence on servers
- Computational storage drives, however, faded as generative AI took off
Samsung came up with the concept of the SmartSSD back in 2018, before generative AI took off. This computational storage drive promised server-less computing, bringing compute closer to where the data is stored. The SmartSSD had NAND, HBM, and RDIMM memory sitting next to an FPGA accelerator in the SSD itself. That FPGA was built by Xilinx, which AMD bought in October 2020.
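The appeal of moving compute next to the data can be sketched with some back-of-the-envelope arithmetic. The function below is purely illustrative (it is not the SmartSSD API): it estimates how many bytes must cross the PCIe bus for a scan query, with and without an on-drive filter.

```python
# Purely illustrative sketch of why computational storage helps (this is NOT
# the SmartSSD API): pushing a filter into the drive shrinks the volume of
# data that has to cross the PCIe bus to the host CPU.

def bytes_over_bus(total_records, record_size, selectivity, filter_on_drive):
    """Return how many bytes travel from SSD to host for one scan query.

    selectivity     -- fraction of records matching the filter (0.0 to 1.0)
    filter_on_drive -- True if the drive's accelerator applies the filter
    """
    if filter_on_drive:
        # CSD path: only matching records cross the bus
        return int(total_records * selectivity * record_size)
    # Conventional path: every record crosses the bus; the host filters later
    return total_records * record_size

# Example: scanning 10 million 1 KB records where 1% match the query
host = bytes_over_bus(10_000_000, 1024, 0.01, filter_on_drive=False)
csd = bytes_over_bus(10_000_000, 1024, 0.01, filter_on_drive=True)
print(host // csd)  # the conventional path moves 100x more data
```

With a highly selective query, the drive-side filter cuts bus traffic by two orders of magnitude, which is the core pitch of the CSD idea.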
Fast forward to 2025, and the SmartSSD has all but disappeared from Samsung's portfolio. You can still buy it from Amazon (and others) under the AMD Xilinx brand (rather than Samsung's) for $517.70 with a 3.84 TB capacity.
The fact that it was a Gen3 SSD, plus the novel but complicated nature of the hardware, made it a difficult sell. Then came the double whammy of Covid-19 and AI; the latter, more than anything else, is probably the reason Samsung gave up on CSDs.
Generative AI demanded a different kind of compute resource that CSDs simply couldn't deliver at the time, and while LLMs need SSDs, what mattered was storage capacity rather than compute features.
In short, CSDs represented an interesting but niche market, one closer to traditional servers. It was nice, but lacked the explosive growth potential of AI-related hardware. That, I believe, is why Samsung shelved the product after its second generation, despite the company stating in 2022 that "the computational storage market has great potential."
What's next for CSDs?
The dedicated page on the website of SNIA, the group that oversees the standardization of computational storage, shows little progress since the launch of a CS API in October 2023. A video published in 2024 by the co-chairs of the SNIA CS Technical Working Group mentions a version 1.1 under development.
One of its staunchest advocates, ScaleFlux, has changed its "about us" page to omit computational storage entirely. Instead, it focuses on delivering products that use CS under the hood. Its CSD5000 enterprise SSD, for example, has a physical capacity of 122.88 TB but a logical capacity of 256 TB, with a compression ratio of about 2:1 mentioned in the small print. This is achieved using onboard compute.
Given the growing significance of AI inference, it would make sense to run some of it as close as possible to where the data lives, that is, on the SSD. With ASICs (application-specific integrated circuits) growing more popular thanks to hyperscalers (Google, Microsoft) and AI companies (OpenAI), the market for enterprise inference AI SSDs, especially at the edge, could open up sooner rather than later.



