Facebook owner Meta joins Google, Microsoft and Amazon in custom AI chip race
Meta is developing its first AI training chip to reduce reliance on external suppliers like Nvidia. Partnering with TSMC, Meta is testing the chip for content recommendation systems on Facebook and Instagram. If testing succeeds, the company plans to expand its use to generative AI products by 2026. Meta's previous custom chip was abandoned due to poor performance.
Facebook owner Meta is reportedly testing its first internally developed chip designed for training artificial intelligence (AI) systems. The development is significant because it puts the company alongside other tech giants, including Google, Microsoft, Amazon and ChatGPT-maker OpenAI, in building custom silicon to decrease dependence on external suppliers such as Nvidia.
Citing two sources, news agency Reuters reports that Meta has begun a small-scale deployment of the chip and plans to expand production if testing proves successful. This initiative is part of Meta's broader strategy to reduce its substantial infrastructure costs while continuing to invest heavily in AI technology.
One source told the agency that Meta's new training chip is a dedicated accelerator specifically designed for AI tasks, potentially offering greater power efficiency than the graphics processing units (GPUs) typically used for AI workloads.
The company is partnering with Taiwan-based TSMC to manufacture the chip. Testing began after Meta completed its first "tape-out", a critical phase in chip development in which an initial design is sent to a chip factory for fabrication.
The new chip is part of Meta's Training and Inference Accelerator (MTIA) series. Last year, the company began using an MTIA chip for inference, the process of running AI systems during user interactions, specifically in the content recommendation systems on Facebook and Instagram.
Meta executives have indicated plans to start using their in-house chips for training by 2026, beginning with recommendation systems before expanding to generative AI products such as the Meta AI chatbot.
Meta’s chief product officer Chris Cox described the company’s chip development as a gradual process but noted that executives consider the first-generation inference chip a “big success.”
“We're working on how would we do training for recommender systems and then eventually how do we think about training and inference for gen AI,” Cox added.
Previously, Meta abandoned an in-house custom inference chip after poor test results, instead ordering billions of dollars' worth of Nvidia GPUs in 2022.
- Published On Mar 12, 2025 at 09:21 AM IST