‘Keep your friends close, but your enemies closer’: sage advice attributed to everyone from the Godfather to Sun Tzu. Words widely and deeply taken to heart in the Tech and AI industry, whether by the ‘Magnificent 7’ or by aspirants to that club.
The AI infrastructure gold rush can fairly be framed as a ‘Frenemies Fest’. For now, all the major providers of AI data center services, in ranking order Amazon AWS, Microsoft Azure (and its partner OpenAI), Google Cloud and others, have partnered with Nvidia and its founder and CEO Jensen Huang, and have put him on stage at their developer and customer conferences. All the while, each is making long-term infrastructure and chip investments aimed at ultimately replacing that same Nvidia AI chip hardware and software in their customer offerings.
The latest was Amazon AWS, the largest data center compute provider of AI and other services to businesses large and small. The WSJ frames it well in “Why Amazon and Nvidia Need Each Other”:
“Amazon needed to put on a good AI show this week. It got a little help from a surprising friend. The e-commerce titan also happens to run the world’s largest cloud-computing business.”
“In fact, Amazon’s AWS unit now generates significantly more annual revenue than IBM and Oracle and comes second only to Microsoft in the market for business-focused software and related services. But Amazon has also been perceived as lagging behind its largest cloud rival in the field of generative artificial intelligence, given Microsoft’s aggressive push into the technology since the public launch of OpenAI’s ChatGPT almost exactly one year ago.”
“Hence, Amazon used its annual AWS re:Invent conference on Tuesday to lean hard into generative AI. It even announced its own chatbot called Q, which looks like a business-focused version of Microsoft’s Copilot.”
“But most notable was the appearance of Nvidia Chief Executive Officer Jensen Huang, who joined AWS CEO Adam Selipsky on stage at the Las Vegas event to announce an “expanded collaboration” between the two companies. That will include AWS being the first cloud provider to launch services with Nvidia’s new GH200 NVL32 “superchips” that will start shipping next year.”
“Tech executives often cross-pollinate each other’s trade shows, and the occasions are generally not worthy of note. But this was the first time Huang has appeared at the annual confab for AWS, which has been a major customer of Nvidia’s data-center business over the last several years.”
“And it came amid rumors of growing friction between the two companies, as Amazon has gone further than its cloud rivals in designing its own in-house chips, while Nvidia has been pushing into offering cloud-computing services of its own. Amazon even used the same keynote on Tuesday to announce the fourth version of its Graviton processor and the second version of its Trainium accelerator—the latter of which competes with Nvidia’s chips in the training of AI models.”
So it’s clear that for the next few years, Nvidia will remain the largest and closest provider of critical AI GPU chips to the major data center and Foundation LLM AI companies, even as those companies make every effort and investment to reduce their dependence on Nvidia’s AI chips and software over time. Frenemies for the foreseeable future in these early years of the AI Tech Wave. Close friends and closer enemies indeed. Stay tuned.
(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here.)