- Floating data centers aim to bypass network bottlenecks using offshore deployment
- The Samsung model plugs directly into shore power for faster AI scaling
- Offshore barges can significantly reduce data center implementation timelines
Samsung Heavy Industries has unveiled a large floating data center ship model that could support AI tools globally.
Samsung and OpenAI signed a letter of intent in October 2025 for a comprehensive partnership, including the development of a floating data center.
The design of this data center is specifically intended to host future versions of systems such as OpenAI’s ChatGPT on a waterborne platform.
Offshore deployment strategy for AI infrastructure
This vessel would sit offshore and connect directly to power and cooling near coastal energy assets.
Samsung says the concept compresses the usual years-long build-out of land-based data centers into a much shorter timeline.
Samsung has also partnered on the project with Mousterian Corp., a Dallas-based infrastructure developer focused on high-density AI compute.
The floating model aims to shorten the time needed to provide power and cooling for AI workloads.
Instead of waiting for new grid connections, the system docks near existing thermal or nuclear plants.
This approach treats the coastline as a digital infrastructure deployment zone, and the barges carry fully liquid-cooled data halls that can scale on demand.
Developers claim that “speed to power is the new moat” for AI tools and cloud operators.
Anyone who can bring compute and power online quickly gains a real advantage over slower rivals.
That’s why the partnership claims this strategy could shift capacity delivery from years to quarters in some locations.
The Dallas-based partner says the floating data center initiative intends to deliver more than 1.5 GW of capacity within about three years.
“Speed to power is the new moat. We have thoughtfully partnered with some of the leading global conglomerates, allowing us to deliver over 1,500 MW of capacity over the next 3 years,” said Min Suh, CEO of Mousterian Corp.
However, this figure implies multiple barge-based projects, each tied to local power and grid constraints.
Each vessel would host thousands of servers designed for AI training and inference workloads.
The 1.5 GW target also depends on approvals, speed of construction and the availability of water adjacent to baseload facilities.
Some analysts doubt that pace can be sustained in practice, and maritime data centers still face technical, regulatory and financial hurdles at scale.
Risks and operational uncertainty at sea
Although floating data centers tackle some of the problems associated with land-based data centers, they also introduce new challenges.
Experts worry that these facilities create new risks around cybersecurity, physical access and long-term reliability.
Saltwater exposure, storms and extended downtime complicate operations significantly.
Maintenance and fiber-optic connections are also more complex offshore than onshore.
Furthermore, the claim of delivering 1.5 GW in 36 months rests on unproven timelines across shipbuilding, permitting and tenant onboarding.
Market demand for AI tools and data centers is real, but execution remains uncertain.
The model may add a niche option rather than overhaul how most AI compute is sited, and the true test will be how many barges actually come online as planned.
Via Dallas Innovates