AI GPUs may soon need more power than a small country as HBM memory growth spirals out of control


  • Future AI memory chips could require more power than entire industrial zones combined
  • 6TB of memory in a GPU sounds amazing until you see the power draw
  • HBM8 stacks are impressive in theory but frightening in practice for any energy-conscious business

The relentless drive to expand AI processor power is introducing a new era of memory technology, but it comes at a price that raises practical and environmental concerns, experts have warned.

Research from the Korea Advanced Institute of Science & Technology (KAIST) and the Terabyte Interconnection and Package Laboratory (TERA) suggests that by 2035, AI GPU accelerators equipped with 6TB of HBM could become a reality.
