Google’s custom AI chips are emerging as a serious competitor to Nvidia’s market-leading GPUs, with major implications for the artificial intelligence hardware industry. Tensor Processing Units (TPUs), which Google has been developing for over a decade, are now attracting significant interest from major tech companies: Nvidia shares tumbled last month following reports that Meta was exploring a deal to use Google’s AI chips.
Morgan Stanley projects explosive growth for Google’s TPU business, estimating that 5 million TPUs will be purchased in 2027 and approximately 7 million in 2028, a significant increase from its prior projections. The investment bank calculates that every 500,000 TPU chips sold could potentially add around $13 billion to Google’s revenue in 2027.
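As a sanity check on those figures, here is a back-of-envelope sketch (not Morgan Stanley’s actual model; it assumes revenue scales linearly with units sold, and note that the 5 million unit estimate includes Google’s own internal purchases, so not every chip would generate external revenue):

```python
# Implied per-chip revenue from Morgan Stanley's 2027 estimate:
# ~$13B in revenue per 500,000 TPUs sold.
REVENUE_PER_BLOCK = 13e9      # dollars per block of chips
CHIPS_PER_BLOCK = 500_000

implied_price = REVENUE_PER_BLOCK / CHIPS_PER_BLOCK   # → $26,000 per chip

# Upper bound if all 5M projected 2027 units were revenue-generating sales
# (an overstatement, since Google remains its own largest TPU customer).
tpus_2027 = 5_000_000
upper_bound_revenue = tpus_2027 * implied_price       # → $130B

print(f"Implied revenue per chip: ${implied_price:,.0f}")
print(f"2027 upper bound at 5M units: ${upper_bound_revenue / 1e9:,.0f}B")
```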
Unlike Nvidia’s GPUs, which were originally designed for gaming and later adapted for AI workloads, TPUs were purpose-built for artificial intelligence from the ground up. Originally designed by a team led by Jonathan Ross (now CEO of Groq), TPUs are built around a specialized architecture called a systolic array: a grid of multiply-accumulate units through which data streams continuously, cutting down on the memory round-trips that slow general-purpose chips and making TPUs more efficient at certain AI tasks. Google’s latest “Ironwood” TPU reportedly delivers more than four times the performance of its predecessor for both training models and inference.
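The systolic-array idea can be shown with a toy simulation. The sketch below models an output-stationary array for matrix multiplication, where each processing element holds one output value while operands stream past it one step per cycle; this is an illustration of the general technique, not Google’s actual TPU design:

```python
import numpy as np

def systolic_matmul(A, B):
    """Simulate an output-stationary systolic array computing C = A @ B.

    Each processing element (PE) at grid position (i, j) holds one output
    C[i, j]. Rows of A stream in from the left and columns of B from the
    top, each skewed by one cycle per row/column, so the matching operand
    pair arrives at PE(i, j) at cycle t = i + j + k. Every PE then does
    one multiply-accumulate per cycle with no trips back to memory.
    """
    M, K = A.shape
    K2, N = B.shape
    assert K == K2, "inner dimensions must match"
    C = np.zeros((M, N))
    # Total cycles: K partial products plus the skew delays (M-1 and N-1).
    for t in range(K + M + N - 2):
        for i in range(M):
            for j in range(N):
                k = t - i - j          # operand pair reaching PE(i, j) now
                if 0 <= k < K:
                    C[i, j] += A[i, k] * B[k, j]   # one MAC this cycle
    return C

A = np.arange(6.0).reshape(2, 3)
B = np.arange(6.0).reshape(3, 2)
assert np.allclose(systolic_matmul(A, B), A @ B)
```

The payoff of this layout in hardware is that every operand loaded from memory is reused across a whole row or column of PEs as it flows through the grid, which is why systolic designs sustain a steadier stream of useful work per byte fetched than a general-purpose chip.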
Major customers are already adopting TPUs. Apple used TPUs to train its in-house AI model, and Anthropic announced a blockbuster deal in October to use up to 1 million TPUs. Broadcom revealed it has received $21 billion in orders from Anthropic for Google’s Ironwood TPUs. Meta is also in early testing phases, though no long-term deal has been confirmed.
The key advantage of TPUs lies in their cost efficiency at scale. Thousands of TPUs can work in tandem in a single “pod,” and their high throughput can make large-scale operations more economical, particularly for inference tasks. However, Nvidia maintains a significant advantage through its CUDA software ecosystem, which works only with Nvidia chips and creates substantial friction for companies considering a switch. Google is working to address this by improving TPU support for PyTorch, the Meta-developed framework that now sees far more demand than Google’s own TensorFlow.
While Google remains its own biggest TPU customer, using the chips across products like Search, Maps, and its Gemini 3 model, the external market is expanding rapidly. Industry experts suggest that rather than completely replacing Nvidia, the rise of specialized chips will likely lead to market diversification, with companies using multiple chip providers rather than relying on a single vendor.
Key Quotes
We don’t see TPU as a significant threat to Nvidia’s business, but it has been a real player in the market for many years. It is possible that Google sells TPU servers externally in the future, to many more customers. Right now, they are very selective.
Jordan Nanos, member of technical staff at research firm SemiAnalysis, provides a measured perspective on the competitive landscape, suggesting that while TPUs are gaining traction, Nvidia’s dominance isn’t immediately threatened. His observation about Google’s selective approach hints at untapped market potential.
Every 500,000 TPU chips sold could potentially add around $13 billion in revenue to Google’s balance sheet in 2027.
Morgan Stanley’s projection, detailed in their December research note, quantifies the massive revenue opportunity Google could unlock by expanding its TPU business beyond internal use. This represents a potentially transformative new revenue stream for the company.
Our Take
The TPU story illustrates how vertical integration in AI infrastructure is becoming a competitive necessity for tech giants. Google’s decade-long investment in custom chips is now paying dividends, not just through internal cost savings but as a potential multi-billion dollar business line. The timing is particularly strategic—as the AI industry shifts from the training-intensive phase to inference-heavy production deployments, chips optimized for inference efficiency become increasingly valuable.
What’s most intriguing is the feedback loop advantage: Google uses TPUs internally, learns from real-world performance, and iterates on chip design accordingly. This creates a virtuous cycle that pure chip manufacturers can’t easily replicate. However, Nvidia’s CUDA moat remains formidable, and Google’s success will largely depend on how effectively it can reduce switching costs for potential customers. The market is likely heading toward a multi-vendor future where different chips excel at different workloads—a healthy development for innovation and competition.
Why This Matters
This development represents a critical inflection point in the AI hardware market, which has been dominated by Nvidia’s near-monopoly position. The emergence of viable alternatives like Google’s TPUs could fundamentally reshape the economics of AI development and deployment, particularly as companies scale up inference workloads.
The cost implications are enormous. As AI models grow larger and more companies move from experimental phases to production-scale deployments, the efficiency gains from specialized chips could translate to billions in savings. This is especially significant for inference operations, which occur continuously as AI applications serve users, compared to one-time training costs.
Market diversification benefits the entire AI ecosystem by reducing dependency on a single vendor, potentially lowering costs through competition, and spurring innovation as companies compete on chip performance and efficiency. For businesses investing heavily in AI infrastructure, having multiple viable chip options provides strategic flexibility and negotiating leverage.
The $21 billion order from Anthropic alone signals that major AI companies are willing to bet big on alternatives to Nvidia, potentially accelerating a broader industry shift toward specialized, purpose-built AI hardware.
Related Stories
- Nvidia CEO Jensen Huang Reveals Public Speaking Struggles Despite AI Success
- Meta and Nvidia Billionaires’ Wealth Soars $152B in AI Boom
- Nvidia faces US probe over potential export rules violations to China
- Big Tech’s 2025 AI Plans: Meta, Apple, Tesla, Google Unveil Roadmap
- Google’s Sundar Pichai Responds to Microsoft CEO Satya Nadella’s ‘Dance’ Comment on AI in 2024
Source: https://www.businessinsider.com/google-tpu-ai-chip-explained-nvidia-2025-12