- John Carmack has shared an idea to use fiber optic cable instead of RAM
- This is a future vision for replacing RAM modules in AI workloads
- Although highly theoretical and far off, there are other possible solutions in the shorter term to reduce the AI’s all-consuming appetite for RAM
John Carmack has floated an idea to effectively use fiber cables as ‘storage’ rather than conventional RAM modules, which is a particularly exciting vision of the future given the current memory crisis and all the chaos it creates.
Tom’s Hardware spotted a post on X from the id Software co-founder, in which Carmack suggests that a very long fiber optic cable – and we’re talking 200 km long – could effectively stand in for the system’s RAM, at least when working with AI models.
Carmack notes, “256 Tb/s data rates over 200 km distance have been demonstrated on single-mode fiber optics, which works out to 32 GB of data in flight, ‘stored’ in the fiber, with 32 TB/s bandwidth. Neural Network Inference and Training [AI] can have deterministic weight reference patterns, so it is amusing to consider a system with no DRAM and weights continuously streamed into an L2 cache by a recycled fiber loop.”
What this means is that the fiber forms a loop in which the necessary data (which would normally sit in RAM) is “continuously streamed,” keeping the AI processor constantly fed – something that only works because AI model weights can be read sequentially. This could also be a more environmentally friendly, power-saving way of performing these tasks compared to traditional RAM.
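Carmack’s figures can be sanity-checked with some back-of-the-envelope arithmetic: light in single-mode fiber travels at roughly two-thirds the speed of light in a vacuum, so a 200 km loop carrying 256 Tb/s holds about a millisecond of data in flight. A minimal sketch (the refractive index is a typical value for single-mode fiber, not a figure from Carmack’s post):

```python
# Sanity check of the fiber-loop figures (refractive index is a
# typical assumed value for single-mode fiber, not from the post).
C = 299_792_458            # speed of light in vacuum, m/s
REFRACTIVE_INDEX = 1.468   # typical single-mode fiber core
FIBER_LENGTH_M = 200_000   # 200 km loop
DATA_RATE_BPS = 256e12     # 256 Tb/s demonstrated rate

speed_in_fiber = C / REFRACTIVE_INDEX              # ~2.04e8 m/s
loop_latency_s = FIBER_LENGTH_M / speed_in_fiber   # ~0.98 ms per lap
bits_in_flight = DATA_RATE_BPS * loop_latency_s
gigabytes_in_flight = bits_in_flight / 8 / 1e9     # ~31 GB

print(f"Loop latency: {loop_latency_s * 1e3:.2f} ms")
print(f"Data 'stored' in the fiber: {gigabytes_in_flight:.1f} GB")
print(f"Effective bandwidth: {DATA_RATE_BPS / 8 / 1e12:.0f} TB/s")
```

The result lands right around Carmack’s quoted 32 GB in flight at 32 TB/s, with the catch that any given byte only comes around once per millisecond-long lap.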
As Carmack points out, this is the “modern equivalent of the old mercury echo tube memories,” or delay line memories – early computer memory in which data was stored as waves circulating through a medium, such as acoustic pulses in a tube of mercury or, later, torsional waves traveling along a coil of wire.
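The delay-line idea can be sketched as a fixed-length queue: a value is only readable at the moment it “arrives” at the output, and it must be re-injected at the input to stay stored. This toy model is purely illustrative – it captures the access pattern, not the physics:

```python
from collections import deque

# Toy model of a recirculating delay line (hypothetical illustration).
# The line holds a fixed number of symbols "in flight"; each tick, one
# symbol emerges at the output and is fed back into the input.
class DelayLineMemory:
    def __init__(self, contents):
        self.line = deque(contents)  # symbols currently in flight

    def tick(self):
        symbol = self.line.popleft()  # read the symbol arriving now
        self.line.append(symbol)      # re-inject it so it stays "stored"
        return symbol

line = DelayLineMemory([0.1, 0.2, 0.3, 0.4])  # stand-ins for weights
stream = [line.tick() for _ in range(8)]
print(stream)
```

Note the limitation this makes obvious: the contents can only be read in order, which is exactly why Carmack ties the idea to the sequential weight-access patterns of neural network inference.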
It’s not an idea that’s feasible now, but a concept for the future – and what Carmack argues is that it’s a conceivable way forward with a potentially “better growth trajectory” than what we’re currently looking at with traditional DRAM.
Analysis: Flash forward
There are very obvious problems with RAM right now in terms of supply and demand, with the latter far outstripping the former thanks to the rise of AI and the huge memory requirements therein. (Not only for servers in data centers that send queries to popular AI models, but also video RAM in AI accelerator cards).
So what Carmack envisions is another way of running AI models, one that uses fiber loops instead of DRAM. In theory, that could free the rest of us from worrying about RAM (or indeed a whole PC, or a graphics card – the list goes on with this memory crisis) costing a ridiculous amount of cash.
The catch, as Carmack acknowledges, is that such a fiber scheme faces serious obstacles, including the sheer amount of fiber required and the difficulty of maintaining signal strength around the loop.
But there are other options along these lines, and other people have talked about similar concepts over the past few years. Carmack mentions, “Much more practically, you should be able to assemble cheap flash memory to provide almost any read bandwidth you need, as long as it’s done one page at a time and pipelined well in advance. It should be viable for end-to-end rendering today if flash and accelerator vendors could agree on a high-speed interface.”
In other words, this would be an army of cheap flash memory chips working massively in parallel, but as Carmack notes, the key would be agreeing on an interface that lets those chips feed the AI accelerator directly.
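A rough back-of-the-envelope shows why parallelism is the whole trick here. The per-die read figure and the accelerator-class bandwidth target below are illustrative assumptions for the sake of the arithmetic, not vendor specifications:

```python
import math

# How many flash dies, read one page at a time in parallel, would it
# take to stream weights at accelerator-class bandwidth? Both numbers
# below are illustrative assumptions, not vendor specs.
PER_DIE_READ_GBPS = 1.2       # assumed sequential read per NAND die, GB/s
TARGET_BANDWIDTH_GBPS = 3350  # assumed HBM-class target, GB/s

dies_needed = math.ceil(TARGET_BANDWIDTH_GBPS / PER_DIE_READ_GBPS)
print(f"~{dies_needed} dies reading in parallel")
```

Thousands of dies sounds like a lot, but flash dies are cheap and already ship many to a package – the hard part, as the article notes, is a common high-speed interface between them and the accelerator, not the raw parallelism.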
This is an interesting proposition in the shorter term, but one that relies on the relevant manufacturers (of AI GPUs and storage) getting together and hammering out a new system along these lines.
The RAM crisis is expected to last through this year, and probably next year as well, potentially dragging on even longer than that, with all the pain that involves for consumers. So exploring alternative ways of supplying memory to AI models could be a worthwhile endeavor toward ensuring that this RAM crisis is the last episode we have to suffer through.
