
Meta Unveils MTIA 500 Chip for AI Model Acceleration

Meta unveils MTIA 500 chip with higher memory and low‑precision data tweaks


Meta has rolled out four new custom chips aimed at powering its AI models and recommendation engines, a move that underscores the company's push to amass as much compute as it can. The MTIA family, which builds on the earlier MTIA 450, is positioned as the backbone for Meta's internal large-language-model experiments and the massive data-driven pipelines that feed its social feeds. While the MTIA 450 already boasted a sizable memory pool, engineers are now focused on squeezing more capacity into the next iteration and tweaking how the silicon handles low-precision numbers, a trade-off that can boost throughput without a proportional rise in power draw.

The timing is notable: the upcoming version is slated for a rollout later next year, suggesting Meta is planning ahead for the next wave of model scaling. All of this points to a deliberate strategy of hardware‑first development, where incremental gains in memory and data representation are expected to translate into tighter control over AI workloads.


Meta says the MTIA 500, which is slated to arrive later next year, will have even more memory than MTIA 450 and include "innovations in low-precision data." The MTIA chips are part of Meta's broader strategy to hoard as much computing power as possible in order to develop cutting-edge artificial intelligence. Meta first shared details about its chip development plans in 2023, when it released its first product under the MTIA banner. As software companies and AI labs continue to train increasingly powerful AI models, they have begun announcing ambitious plans to build custom chips that serve their own specific AI needs.

OpenAI, for example, has also said it's partnering with Broadcom to build custom accelerators, following a path similar to Meta's. Earlier this year, Meta was reported to be scaling back some of its in-house efforts to make high-end chips that would compete more directly with leading players like Nvidia. The company now appears eager to dispel that narrative by announcing this new road map for MTIA chips.

But making custom silicon remains enormously expensive and technically complex, which means Meta will likely continue purchasing the majority of its AI hardware from other firms, at least in the near future. That reality is reflected in the company's recent chip buying spree: Meta unveiled its new MTIA chips shortly after announcing multibillion-dollar deals with Nvidia and AMD.

Will Meta's new silicon make a dent? The company unveiled four MTIA chips this week, designed to run its generative AI features and content‑ranking models. Built on the open‑source RISC‑V ISA, the processors are a joint effort with Broadcom and will be fabricated by TSMC.

The MTIA 500, slated for delivery later next year, promises more on‑chip memory than its predecessor, the MTIA 450, and incorporates tweaks for low‑precision data handling. These enhancements suggest Meta is pushing for tighter integration of hardware and software across its platforms. Yet the impact of higher memory and low‑precision tricks on real‑world performance remains unclear.
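Meta has not described what its "innovations in low-precision data" actually are. As a general illustration of why low precision matters for throughput and memory, here is a minimal sketch of symmetric int8 quantization, the kind of technique such hardware is typically designed to accelerate. The function names and the NumPy-based approach are this article's illustration, not anything Meta has disclosed.

```python
import numpy as np

def quantize_int8(x: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric per-tensor quantization: map float32 values to int8
    using a single scale factor derived from the largest magnitude."""
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values from int8 codes."""
    return q.astype(np.float32) * scale

weights = np.random.randn(1024).astype(np.float32)
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# int8 storage is 4x smaller than float32 for the same element count
print(weights.nbytes // q.nbytes)  # 4
# worst-case reconstruction error is bounded by half the scale step
print(float(np.max(np.abs(weights - restored))))
```

The memory saving is the point: a chip with a fixed on-chip memory budget can hold four times as many int8 weights as float32 ones, and integer arithmetic units are cheaper in area and power than floating-point ones, which is the trade-off the article alludes to.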

The broader aim, according to the announcement, is to amass as much compute as possible for cutting‑edge artificial‑intelligence work. Whether this hardware push will translate into measurable gains for end users is still an open question. For now, the chips sit in Meta's roadmap, awaiting silicon‑level validation.

Unclear timeline. Meta has not disclosed production volumes or pricing, leaving analysts to wonder how quickly the chips could be deployed across its services.


Common Questions Answered

What specific improvements does the MTIA 500 chip offer over the previous MTIA 450 model?

The MTIA 500 chip promises increased on-chip memory compared to the MTIA 450 and introduces innovations in low-precision data handling. These enhancements are designed to improve Meta's ability to run large language models and data-driven recommendation engines more efficiently.

How is Meta developing its custom AI chips, and who are its manufacturing partners?

Meta is developing its MTIA chips through a collaborative effort with Broadcom, using the open-source RISC-V instruction set architecture (ISA). The chips will be fabricated by TSMC, with the MTIA 500 planned for delivery later next year as part of Meta's strategy to build significant computing capabilities for AI development.

What is the primary purpose of Meta's MTIA chip family in the company's AI strategy?

The MTIA chip family is designed to power Meta's internal large language model experiments and support the massive data-driven pipelines that feed its social media platforms. By developing custom silicon, Meta aims to accumulate more computational power and enhance its AI and recommendation engine capabilities.