AI: Amazon and Anthropic Scaling Up. RTZ #534
...new expanding relationship between Amazon AWS & Anthropic AI
Amazon is reportedly exploring a further big investment in Anthropic AI, the #2 LLM AI company after OpenAI. As I’ve discussed before, Amazon already holds a multi-billion dollar early stake, as does Google. A potential deal here is of course driven by the AI Infrastructure scaling I’ve discussed, where having AI data centers of 100,000+ GPUs is the next metric in the AI ‘Table Stakes’ race driving this AI Tech Wave.
As The Information explains in “Amazon Discussing New Multibillion-Dollar Investment in Anthropic”:
“Amazon is discussing making a second multibillion-dollar investment in OpenAI rival Anthropic, according to a person involved in the discussions. The new deal is similar to Amazon’s initial $4 billion investment in the startup, which was struck last year. But this time, Amazon wants Anthropic to make a concession.”
“The cloud giant is asking Anthropic, which uses Amazon’s cloud services to train its AI, to use a large number of servers powered by chips developed by Amazon, this person said. The problem is that Anthropic prefers to use Amazon servers powered by Nvidia-designed AI chips.”
That is a big ‘ask’.
Nvidia of course is the AI infrastructure of preference for most business customers, both for its roadmap of AI data center GPUs and supporting infrastructure pipeline, and for its industry-leading software moat, CUDA, along with dozens of software libraries tied to most vertical industry applications.
“The size of Amazon’s total investment in Anthropic could depend on the outcome of this discussion, specifically on the number of Amazon chips Anthropic agrees to use, this person said. The status of the talks couldn’t be learned.”
“The discussions are an example of the competing priorities of large cloud providers and developers of conversational AI that have formed alliances due to the high cost and complexity of producing the technology. The first such marriage in the industry, between Microsoft and OpenAI, has been remarkably beneficial to both companies but has lately become fraught over OpenAI’s concerns that it isn’t getting enough servers from Microsoft to stay ahead of smaller AI rivals. And while Microsoft is developing its own AI server chip, which it hopes OpenAI will want to use, OpenAI hasn't been interested in it, said a person with direct knowledge of the situation. (OpenAI is also developing a chip to run its AI models.)”
Switching chip and hardware platforms is an excruciatingly complicated and challenging task:
“Shifting to the Amazon server chip could be technically challenging for Anthropic because the Amazon software that developers must use with the Trainium chips isn’t as mature as Nvidia’s Cuda software, which AI developers have become accustomed to. Such a move could also lock Anthropic into using Amazon Trainium servers, making it more difficult for the AI startup to use other cloud providers or to lease its own data centers in the future, as Amazon doesn’t make its hardware available to facilities run by other companies.”
“Amazon, though, has good reason to get Anthropic to use its own chips, known as Trainium: The cloud giant could reduce the number of Nvidia chips it has to buy. If it can get its cloud customers to agree to use Trainium-powered servers, it won’t need as many Nvidia chips. As part of Amazon's initial deal with Anthropic, the startup agreed to use some Trainium servers but mainly relied on Nvidia servers in Amazon data centers, said the person who has been involved in the discussions involving the companies.”
“Amazon has increasingly been selling artificial intelligence services powered by Anthropic, a major OpenAI rival, to Amazon’s cloud customers.”
Financial terms are apparently in flux, and seem to be modeled partly on the evolving OpenAI/Microsoft partnership:
“Anthropic recently sought funding from investment firms at a valuation of $30 billion to $40 billion, and any investment deal with Amazon could come in the form of convertible notes that become equity after Anthropic raises capital from other investors. In addition to the investment, the companies are negotiating a cloud deal in which the companies share revenue from the sale of Anthropic’s model to Amazon cloud customers such as Doordash and Goldman Sachs, and Anthropic agrees to rent out specialized servers from Amazon to develop its technology.”
“Amazon a year ago agreed to invest up to $4 billion into Anthropic in a similar deal that was completed earlier this year. Since then, Anthropic has likely spent hundreds of millions of dollars to rent Amazon servers and shared hundreds of millions of additional dollars with Amazon for reselling its models to cloud customers. Amazon also uses Anthropic to power its Q coding assistant for software developers, which competes with ChatGPT, Microsoft’s GitHub Copilot and coding assistant startup Cursor.”
The relationship between the two companies has been a win/win proposition to date:
“The partnership has helped Amazon’s cloud unit maintain its revenue growth rate of 19% in the third quarter though companies using generative AI have been making spending cuts elsewhere in their IT and cloud budgets, an Amazon Web Services executive told The Information.”
All this is going on, of course, as Anthropic is exploring another funding round at a $40+ billion valuation. It’s all aimed, of course, at ramping up their AI compute infrastructure.
“An Anthropic Supercomputer?”
“A new deal could bring the companies even closer. A senior Amazon official privately said Anthropic CEO Dario Amodei earlier this year discussed his interest in using a large-scale AI data center server cluster to develop technology, similar to the ambitious data center plans of rivals such as Elon Musk’s xAI and OpenAI, according to a person who spoke to the official. It isn’t clear whether Amazon has committed to building a supercomputing cluster for Anthropic.”
A unique element here of course is Anthropic’s financial and equity partnership with Google, whose Google Cloud unit competes with industry leader Amazon AWS (Microsoft Azure being #2 in the cloud data center space):
“Anthropic has a similar but smaller cloud partnership with Google, which has also invested billions of dollars into Anthropic. Google has also been selling its own AI, Gemini, to Google Cloud customers, meaning it hasn’t been as reliant on Anthropic as Amazon has.”
“Amazon also has been developing its own AI that could eventually rival Anthropic’s.”
And then of course there are the rapidly shifting dynamics of the multiple corporate alliances amongst these top LLM AI and cloud data center companies:
“OpenAI is in a much stronger financial position than Anthropic, in terms of losses as a percentage of revenue. Anthropic recently projected it would generate $83 million of revenue a month by the end of this year, with 25% to 50% of that figure being paid out to Anthropic’s cloud partners in the form of revenue sharing. OpenAI generates four to five times more revenue than Anthropic and pays a smaller percentage to Microsoft, its exclusive cloud provider.”
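A quick back-of-envelope sketch on those quoted figures helps put them in perspective (all inputs are The Information’s projections, not mine; the arithmetic is purely illustrative):

```python
# Illustrative math on the revenue figures quoted above.
# Inputs are The Information's reported projections, in $ millions.

monthly_revenue_m = 83  # Anthropic's projected monthly revenue by year-end

# Implied annualized run rate: roughly a $1 billion/year pace
annual_run_rate_m = monthly_revenue_m * 12  # 996, i.e. ~$1B/year

# Revenue shared with cloud partners at the quoted 25%-50% range
share_low_m = monthly_revenue_m * 0.25   # ~$20.75M/month
share_high_m = monthly_revenue_m * 0.50  # ~$41.5M/month

# OpenAI's revenue at the quoted "four to five times" Anthropic's
openai_low_m = monthly_revenue_m * 4   # ~$332M/month
openai_high_m = monthly_revenue_m * 5  # ~$415M/month

print(f"Anthropic annualized run rate: ~${annual_run_rate_m}M")
print(f"Cloud partner share: ${share_low_m:.0f}M-${share_high_m:.0f}M/month")
print(f"Implied OpenAI revenue: ${openai_low_m}M-${openai_high_m}M/month")
```

In other words, Anthropic is approaching a ~$1 billion annual run rate while handing a quarter to half of that back to its cloud partners, which underscores why the revenue-share terms are central to these negotiations.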
“Anthropic’s ability to use servers from two different cloud providers might be an advantage compared to OpenAI’s situation. OpenAI has been frustrated with Microsoft’s ability to provide it with servers, prompting OpenAI to seek an alternative provider—Oracle and Crusoe.”
And all this doesn’t include Meta’s and Elon Musk’s xAI (Grok) AI infrastructure data center builds, both of which are in 100,000+ Nvidia chip configurations.
Anthropic of course is rapidly expanding its relationships with enterprise customers and software companies via Amazon AWS, as today’s deal with Palantir illustrates. This is an active area for LLM AI companies and cloud services companies alike, as I discussed a few days ago.
These evolving alliances and investments in this AI Tech Wave again point to the multi-trillion dollar AI ‘Compute’ builds ahead. Amazon of course is making sure it’s aggressively running in the race. With Anthropic’s help. Stay tuned.
(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)