AWS Sees Low Demand for AMD AI Chips, Sticks with Nvidia in 2024

Amazon Web Services (AWS) has revealed that insufficient customer demand is preventing the cloud giant from offering AMD’s MI300 series AI chips on its platform, despite considering the option 18 months ago. This revelation came from Gadi Hutt, senior director for customer and product engineering at Amazon’s chip unit, Annapurna Labs, during AWS’s re:Invent conference this week.

The decision highlights the continued dominance of Nvidia in the AI chip market, as AWS prioritizes customer-driven deployment strategies. Hutt emphasized that AWS follows customer demand closely, stating that if customers show strong indications that AMD’s AI chips are needed, there would be no reason not to deploy them. However, AWS is “not yet” seeing that level of demand for AMD’s offerings.

AMD’s stock price dropped approximately 2% following the initial publication of this story, reflecting investor concerns about the company’s ability to compete in the lucrative AI chip market. Despite AMD’s recent optimism—including an increased GPU sales forecast citing robust demand—the company remains significantly behind market leader Nvidia in the AI accelerator space.

At the same re:Invent conference, AWS announced the launch of P6 servers equipped with Nvidia’s latest Blackwell GPUs, further cementing the partnership between the two companies. This announcement underscores AWS’s commitment to providing cutting-edge AI infrastructure to its cloud customers, with Nvidia remaining the preferred supplier.

Despite the setback in AI chips, AWS and AMD maintain a close partnership in other areas. AWS continues to offer cloud access to AMD’s CPU server chips, and according to Hutt, AMD’s AI chip product line remains “always under consideration” for future deployment. This suggests that while current demand doesn’t justify immediate adoption, AWS hasn’t closed the door on AMD’s AI offerings entirely.

The situation reflects the broader competitive dynamics in the AI chip market, where Nvidia has established a commanding lead through its CUDA software ecosystem and early mover advantage in AI-optimized hardware. AMD faces the challenge of not only developing competitive hardware but also convincing major cloud providers and their customers to adopt alternatives to the established Nvidia ecosystem.

Key Quotes

"We follow customer demand. If customers have strong indications that those are needed, then there's no reason not to deploy."

Gadi Hutt, senior director for customer and product engineering at Amazon’s Annapurna Labs, explained AWS’s customer-driven approach to chip offerings, revealing why AMD’s AI chips haven’t been deployed despite previous consideration.

We are "not yet" seeing that high demand for AMD's AI chips.

Hutt’s frank assessment of market demand for AMD’s MI300 series chips on AWS’s platform, directly explaining the absence of AMD AI accelerators from the cloud service’s offerings and highlighting Nvidia’s continued market dominance.

Our Take

This story reveals a critical challenge in the AI chip market: technical capability alone doesn’t guarantee market success. AMD has developed competitive AI hardware with its MI300 series, yet struggles to convert that into cloud platform adoption—the primary distribution channel for AI compute resources. The issue isn’t necessarily AMD’s technology but rather the powerful network effects surrounding Nvidia’s CUDA ecosystem. Developers have years of experience optimizing for Nvidia hardware, creating switching costs that go beyond raw performance metrics. AWS’s demand-driven approach is rational but creates a chicken-and-egg problem: customers won’t demand what they haven’t tried, and cloud providers won’t offer what customers don’t demand. This dynamic could perpetuate Nvidia’s dominance unless AMD or other competitors find alternative go-to-market strategies or offer compelling enough advantages to overcome inertia.

Why This Matters

This development is significant for the AI industry as it reveals the stark reality of Nvidia’s market dominance and the challenges facing competitors trying to break into the AI chip market. AWS, as one of the world’s largest cloud providers, serves as a critical distribution channel for AI hardware. Their decision to hold off on AMD chips due to lack of customer demand signals that enterprises and developers remain heavily invested in Nvidia’s ecosystem.

The story has broader implications for AI infrastructure competition and pricing. With limited competition, Nvidia can maintain premium pricing for its GPUs, which impacts the cost of AI development and deployment across industries. AMD’s struggle to gain traction, despite technical capabilities, suggests that software ecosystems and developer familiarity matter as much as hardware performance in the AI chip market.

For businesses investing in AI, this news reinforces the importance of the Nvidia platform while also highlighting potential future opportunities if AMD can build momentum. The situation also affects AI accessibility and democratization, as more competition could drive down costs and make advanced AI capabilities more widely available.

Source: https://www.businessinsider.com/amazon-not-enough-demand-amd-ai-chips-aws-nvidia-2024-12