- Huawei launches CloudMatrix 384 Supernode to rival Nvidia's NVL72 AI system
- CloudMatrix 384 delivers nearly twice the compute of the NVL72, along with greater memory capacity and bandwidth
- It consumes almost four times the power, but energy efficiency is less of a constraint in China
Huawei has positioned itself as the Chinese Nvidia for some time, and now the South China Morning Post reports that the company has launched a new AI infrastructure architecture set to compete with the US chip giant's NVL72 system.
Nvidia's NVL72 links 72 GPUs using NVLink technology so they can act as a single, powerful GPU. It is built for trillion-parameter AI models and delivers real-time inference at speeds up to 30 times faster than previous systems by avoiding traditional data transfer bottlenecks.
SCMP said Huawei's rival to this, the CloudMatrix 384 Supernode, has been described as a "nuclear-level product" by Huawei sources. According to the report, it uses 384 Ascend 910C chips to deliver 300 petaflops of dense BF16 compute, almost twice the 180 petaflops offered by Nvidia's NVL72.
Power hungry
The CloudMatrix 384 Supernode has so far been deployed at Huawei's data centers in Wuhu, a city in Anhui province in eastern China, the report says.
Writing about the new product, SemiAnalysis confirmed that this rack-scale solution competes directly with Nvidia's GB200 NVL72, and on some metrics exceeds it. The site says that despite sanctions, China's domestic semiconductor capabilities are growing, and that Huawei's strength lies in system-level engineering, spanning networking, optics, and software.
While the Ascend chips remain heavily dependent on foreign supply chains, such as HBM from Samsung and wafers from TSMC, Huawei has managed to skirt export controls through complex sourcing strategies.
CloudMatrix 384 not only exceeds the NVL72 in compute; it also offers 3.6x the total memory capacity and 2.1x more memory bandwidth. However, something Huawei is probably less eager to shout about: it consumes almost four times the power.
SemiAnalysis points out that in China this is less of a problem than you might think. Abundant power generation means efficiency is less of a constraint than in the West, and China is expanding its energy grid rapidly, which supports power-hungry AI infrastructure by keeping supply ahead of demand.
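To put the quoted figures side by side, here is a minimal back-of-the-envelope sketch in Python. It uses only the numbers reported above (300 vs 180 petaflops, 3.6x memory, 2.1x bandwidth, roughly 4x power); the 4x power figure is the article's approximation rather than a measured value, so the derived compute-per-watt ratio is indicative only.

```python
# Back-of-the-envelope comparison using only the figures quoted in the article.
# The ~4x power figure is the article's approximation, not a measured value.

cloudmatrix_pflops = 300   # dense BF16 petaflops, CloudMatrix 384 (reported)
nvl72_pflops = 180         # dense BF16 petaflops, Nvidia GB200 NVL72 (reported)

compute_ratio = cloudmatrix_pflops / nvl72_pflops   # ~1.67x, "almost twice"
memory_ratio = 3.6                                  # total memory capacity (reported)
bandwidth_ratio = 2.1                               # memory bandwidth (reported)
power_ratio = 4.0                                   # "almost four times the power" (approx.)

# Compute delivered per unit of power, relative to the NVL72:
# more total petaflops, but spread over far more watts.
compute_per_watt_ratio = compute_ratio / power_ratio  # ~0.42x

print(f"Compute ratio:          {compute_ratio:.2f}x")
print(f"Memory capacity ratio:  {memory_ratio:.1f}x")
print(f"Memory bandwidth ratio: {bandwidth_ratio:.1f}x")
print(f"Power ratio:            {power_ratio:.1f}x")
print(f"Compute per watt:       {compute_per_watt_ratio:.2f}x (vs NVL72)")
```

On these figures, the CloudMatrix 384 delivers well under half the compute per watt of the NVL72, which is exactly the trade-off SemiAnalysis argues matters less in China's power-rich grid.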