Meta builds a 1700W superchip and custom MTIA chips, passing over Nvidia, AMD, Intel and Arm for inference


  • Meta’s 1700W superchip delivers 30 PFLOPs and 512GB of HBM memory
  • MTIA 450 and 500 prioritize inference over pre-training workloads
  • Future MTIA generations will support GenAI inference and ranking workloads

Meta advances its AI infrastructure with a portfolio of custom MTIA chips designed specifically for inference workloads across its apps.

The company is developing a 1700W superchip capable of 30 PFLOPs with 512GB of HBM, integrated into the same MTIA infrastructure to handle large-scale inference tasks.
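For scale, the reported figures work out to a rough power efficiency as sketched below. This is a back-of-the-envelope calculation using only the numbers in the article; the numeric precision (FP8, FP16, etc.) behind the 30 PFLOPs figure is not stated, so the ratio is not directly comparable across accelerators.

```python
# Rough efficiency estimate for the reported 1700W MTIA superchip.
# Both figures come from the article; precision/format is unspecified.
power_w = 1700          # reported board power in watts
compute_pflops = 30     # reported peak compute in PFLOPs

tflops_per_watt = compute_pflops * 1000 / power_w
print(f"{tflops_per_watt:.1f} TFLOPs per watt")  # ≈ 17.6 TFLOPs/W
```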
