- Thermodynamic computing uses physical energy flows instead of fixed digital circuits to perform AI calculations
- Image data is allowed to degrade naturally through small fluctuations in the computer’s components
- Scaling to complex image generation will require entirely new hardware designs and approaches
Researchers are exploring a new type of computer that uses natural energy flows to potentially perform AI tasks more efficiently.
Unlike traditional digital computers, which rely on fixed circuits and exact calculations, thermodynamic computing works with randomness, noise, and physical interactions to solve problems.
The idea is that this method could allow AI tools, including image editing programs, to run on far less power than current systems.
How thermodynamic imaging works
The process of thermodynamic imaging is unusual compared to normal computing. It begins with the computer receiving a set of images, which it then allows to “degrade”.
In this context, degradation does not mean that the images are deleted or damaged; rather, the image data is allowed to spread or change naturally due to small fluctuations in the system.
These fluctuations are caused by the physical energy moving through the computer’s components, such as small currents and vibrations.
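In machine learning terms, this degradation resembles the forward pass of a diffusion model. Below is a minimal NumPy sketch of the idea; the physical fluctuations are modeled here as small Gaussian perturbations, which is an illustrative assumption rather than a description of the actual hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

def degrade(image, steps=100, noise_scale=0.05):
    """Let an image 'degrade' step by step: each iteration adds a small
    random fluctuation, standing in for the currents and vibrations
    that perturb a physical system."""
    x = image.copy()
    for _ in range(steps):
        x = x + noise_scale * rng.standard_normal(x.shape)
    return x

# Toy 8x8 "image": a bright square that gradually dissolves into noise.
clean = np.zeros((8, 8))
clean[2:6, 2:6] = 1.0
noisy = degrade(clean)
```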
Over time, these interactions cause the images to become blurry or noisy, creating a kind of natural disturbance. The system then measures the likelihood of reversing this disorder, adjusting its internal settings to make reconstruction more likely.
By running this process many times, the computer gradually restores the original images without following the step-by-step logic used by conventional computers.
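Reversing the disorder is, in software terms, like training a denoiser on many degraded copies of the data. A toy sketch, continuing the assumptions above: a per-pixel weight (the adjustable “internal settings”) is nudged over many repetitions so that applying it to a noisy image reconstructs the clean one more closely.

```python
import numpy as np

rng = np.random.default_rng(1)

clean = np.zeros((8, 8))
clean[2:6, 2:6] = 1.0   # the original image to be recovered

w = np.ones_like(clean)  # adjustable "internal settings"
lr = 0.01                # how strongly each run adjusts them

for _ in range(2000):    # run the process many times
    noisy = clean + 0.5 * rng.standard_normal(clean.shape)
    resid = w * noisy - clean   # reconstruction error on this run
    w -= lr * resid * noisy     # gradient step: make reconstruction more likely

restored = w * (clean + 0.5 * rng.standard_normal(clean.shape))
print(np.abs(restored - clean).mean())  # small: the bright square is roughly recovered
```

This is only a software analogy: in the hardware version, the adjustment would come from the physical system’s own energy flows rather than from explicit gradient steps.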
Stephen Whitelam, a researcher at Lawrence Berkeley National Laboratory, has shown that thermodynamic computing can produce simple images such as handwritten digits.
These outputs are far simpler than those from AI image generators like DALL-E or Google Gemini’s Nano Banana Pro.
Still, the research demonstrates that physical systems can perform basic machine learning tasks, pointing to a new way AI could work.
But scaling this approach to produce high-quality, fully featured images will require new types of hardware.
Proponents claim that thermodynamic computing can reduce the energy required for AI image generation by a factor of ten billion compared to standard computers. For scale, at that ratio a workload that consumes a kilowatt-hour today would need only a few hundred microjoules.
If successful, this approach could greatly reduce energy consumption in data centers running AI models.
Although the first thermodynamic computer chip has been made, the current prototypes are basic and cannot match mainstream AI tools.
Researchers emphasize that the concept has so far been demonstrated only at the level of basic principles; practical implementations will require breakthroughs in both hardware and computer design.
“This research suggests that it is possible to make hardware to do certain types of machine learning … at significantly lower energy costs than we currently do,” Whitelam told IEEE.
“We don’t yet know how to design a thermodynamic computer that would be as good at imaging as, say, DALL-E … it will still be necessary to figure out how to build the hardware to do this.”



