Meta has unveiled four custom AI accelerator chips under its MTIA family, with the MTIA 300 already deployed and the MTIA 400 nearing rollout, as the company simultaneously secures millions of GPUs from Nvidia and AMD to bolster its AI data centers.
Meta's Custom AI Chip Lineup
- MTIA 300: Deployed a few weeks ago for training smaller AI models that power core tasks like content ranking and ads on Facebook and Instagram.
- MTIA 400: Completed testing and is on the path to deployment; optimized for generative AI inference tasks such as image and video generation from text prompts.
- MTIA 450 and MTIA 500: Planned to enter operation in 2027, targeting advanced generative AI inference.
- These chips are not intended for training large language models (LLMs).
Manufacturing and Supply Chain Strategy
- All MTIA chips are manufactured by Taiwan Semiconductor Manufacturing Company (TSMC).
- Meta aims to improve cost per performance and diversify silicon supply to mitigate price volatility and vendor dependency.
- The company acknowledges concerns about high-bandwidth memory (HBM) supply but says it has secured enough for current plans, without detailing contract terms.
- A diversified supply chain approach is emphasized, with hundreds of U.S.-based engineers leading the silicon development.
