According to The Information, Meta Platforms Inc. is in talks to spend billions of dollars on Google’s artificial intelligence chips, a sign that the internet search giant has made progress in building products that can compete with the industry’s best-selling AI accelerators. Nvidia Corp. shares fell on the news.
The Information, citing a person familiar with the matter, reported that Meta is in talks to deploy the chips, tensor processing units (TPUs), in its data centers in 2027, and that Meta may also rent TPU capacity from Google Cloud as early as next year.
Shares of Google parent Alphabet Inc. rose as much as 2.7% in late trading, while Nvidia shares fell as much as 2.7%.
Such an agreement would help establish TPUs as an alternative to Nvidia’s chips, which are currently the gold standard for big technology companies and startups alike, from Meta to OpenAI, that need massive computing power to develop and run AI platforms. Google has already agreed to supply Anthropic PBC with up to one million TPU chips. Even so, Nvidia still dominates the market.
After the Anthropic deal was announced, Seaport analyst Jay Goldberg called it a “strong validation” for the TPU. “Many people were already considering TPUs,” he said, “and now there are likely even more thinking about them.”
A Meta representative declined to comment; Google did not immediately respond to a request for comment.
Meta’s likely adoption of Google TPUs (Anthropic already uses them) suggests that third-party providers of large language models may, in the near term, turn to Google as a secondary supplier of inference acceleration chips. We estimate Meta’s capital expenditure in 2026 at no less than $100 billion, implying at least $40 billion to $50 billion (roughly 40% to 50% of that total) for inference chip capacity next year. With enterprise customers eager to use TPUs and the Gemini LLM on Google Cloud, Google Cloud’s consumption revenue and backlog may grow faster than those of other hyperscale cloud providers and newer cloud platforms.
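The arithmetic behind that estimate is straightforward, as the short Python sketch below shows (the 40% to 50% inference share is the assumption implied above, not a separately reported figure):

    # Back-of-the-envelope check of the capex figures cited above.
    # The 40-50% inference-chip share is an assumed split, not a reported one.
    capex_2026 = 100e9                      # estimated floor for Meta's 2026 capex, USD
    inference_share = (0.40, 0.50)          # assumed share spent on inference capacity
    low, high = (capex_2026 * s for s in inference_share)
    print(f"${low / 1e9:.0f}B to ${high / 1e9:.0f}B")  # -> $40B to $50B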
Asian stocks tied to Alphabet rallied in early trading on Tuesday. In South Korea, shares of IsuPetasys, which supplies multilayer boards to Alphabet, soared 18% to a record intraday high. In Taiwan, MediaTek Inc. rose nearly 5%.
Striking a deal with Meta, one of the largest spenders on data centers and AI development worldwide, would undoubtedly be a win for Google. The key question is whether its tensor chips can deliver enough energy efficiency and computing power to remain a viable long-term alternative.
TPUs, which Google began developing specifically for AI tasks more than a decade ago, are gaining popularity as an effective way to train and run complex AI models without relying on Nvidia. With enterprises worldwide wary of over-reliance on Nvidia, the chips’ appeal as an alternative grows by the day; notably, even AMD lags far behind Nvidia in this market.
Graphics processing units (GPUs), the segment of the chip market Nvidia dominates, were originally designed to accelerate graphics rendering, mainly for video games and other visual applications, but their ability to process large volumes of data in parallel also makes them well suited to training AI models. TPUs, by contrast, are application-specific integrated circuits (ASICs): chips designed for a narrower, dedicated set of tasks.
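To make that distinction concrete, here is a minimal, hypothetical sketch using JAX, Google’s open-source numerical library (the shapes and names are illustrative, not from the report): the same compiled matrix multiply runs unchanged on an Nvidia GPU or a Google TPU, which is why AI frameworks can treat the two as interchangeable backends.

    # Minimal sketch; assumes JAX is installed with a GPU or TPU backend.
    import jax
    import jax.numpy as jnp

    print(jax.devices())  # e.g. [CudaDevice(id=0)] on a GPU host, [TpuDevice(...)] on a TPU host

    @jax.jit  # compiled via XLA for whichever accelerator JAX detects
    def matmul(a, b):
        return jnp.dot(a, b)

    key = jax.random.PRNGKey(0)
    a = jax.random.normal(key, (1024, 1024))
    b = jax.random.normal(key, (1024, 1024))
    print(matmul(a, b).shape)  # (1024, 1024), computed on the available accelerator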
Google also uses TPUs internally to accelerate its own AI and machine learning workloads. Because Google and its DeepMind division build cutting-edge AI models such as Gemini, the company can feed those teams’ experience back to its chip designers, and the ability to customize the chips in turn benefits its AI teams.