Meta, the parent company of Facebook, has begun testing its first custom chip designed specifically for training artificial intelligence systems. The move is part of Meta's strategy to reduce its dependence on outside suppliers such as Nvidia and bring more of its silicon design in-house. The company has started a small-scale deployment of the chip and plans to ramp up production for wider use if the initial tests go well.
Developing chips in-house is central to Meta's long-term plans, as the company seeks to lower its substantial infrastructure costs while investing heavily in AI tools to drive growth. Meta has forecast total 2025 expenses of $114 billion to $119 billion, including about $65 billion in capital expenditures driven largely by AI infrastructure.
The new chip is a dedicated accelerator designed to handle only AI-specific tasks, which can make it more power-efficient than the general-purpose graphics processing units (GPUs) commonly used for AI workloads. Meta is working with Taiwan Semiconductor Manufacturing Company (TSMC) to manufacture the chip. The test deployment began after Meta completed the chip's first "tape-out," the milestone at which a finished design is sent to the factory for fabrication. The stakes are substantial: a tape-out can cost millions of dollars, with no assurance that the resulting silicon will work.
The training chip is the latest in the Meta Training and Inference Accelerator (MTIA) series. The program has had a rocky start, with earlier projects shelved, but Meta recently began using an MTIA chip to run the recommendation systems that decide which content appears on Facebook and Instagram. The company aims to use its own chips extensively by 2026 for both training and inference across a range of AI applications, from recommendation systems to future generative AI products.
Despite setbacks in earlier chip efforts, Meta views the work as a gradual but promising path toward a more self-sufficient AI infrastructure.