AI: Nvidia still Reigns. RTZ #365
...momentum continues to build from Hopper to Blackwell AI GPU chips
Yes, Nvidia continues to reign in this AI Tech Wave, as the company announced record quarterly results again tonight. As the WSJ notes in “Nvidia’s Sales Triple, Signaling AI Boom’s Staying Power”:
“CEO Jensen Huang gives better-than-expected outlook, saying demand remains strong for the company’s chips”.
“Nvidia delivered a record quarter and signaled that the AI boom is still going strong, driving its already meteoric stock up Wednesday evening above $1,000 a share.”
“The chips made by Nvidia have powered the rise of artificial intelligence, which is threatening to disrupt virtually every major industry. Chief Executive Jensen Huang declared the beginning of a new industrial revolution where Nvidia was helping turn $1 trillion of data centers into ‘AI factories.’”
“‘AI will bring significant productivity gains to nearly every industry and help companies be more cost- and energy-efficient,’ Huang said Wednesday.”
“Nvidia executives also told analysts that demand remains strong for the company’s current AI chip as well as for its next-generation product expected later this year.”
The momentum continues to accelerate, including revenues from key customers who are also long-term competitors, as I’ve highlighted.
“Revenue in the latest quarter more than tripled from a year earlier to $26 billion, and net profit soared more than sevenfold to $14.88 billion. Both numbers were quarterly records for Nvidia, and beat analysts’ expectations.”
“Nvidia’s chief financial officer, Colette Kress, said Wednesday that large cloud-computing companies such as Alphabet’s Google, Microsoft and Amazon.com accounted for somewhere around 45% of the company’s data-center revenue—more than $10 billion.”
“Nvidia’s stock price has more than tripled in the past 12 months, sending its valuation above $2 trillion. The stock rose 6% in after-hours trading Wednesday following its earnings report, surpassing $1,000 a share—although Nvidia said it would split its stock 10-for-1, effective June 7. It also increased its dividend to 10 cents a share from 4 cents, based on the current share count.”
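The split and dividend arithmetic above can be sketched quickly. This is a purely illustrative back-of-the-envelope calculation using the figures quoted in the WSJ passage (a roughly $1,000 share price and the new 10-cent dividend on the pre-split share count); it is not investment guidance:

```python
# Back-of-the-envelope arithmetic for Nvidia's announced 10-for-1 stock split,
# using the figures quoted above. Illustrative only, not investment advice.

SPLIT_RATIO = 10  # 10-for-1 split, effective June 7

pre_split_price = 1000.00   # roughly where shares traded after hours
pre_split_dividend = 0.10   # new dividend per share, on the pre-split count

# A split multiplies the share count and divides per-share figures;
# total market value and total dividend payout are unchanged.
post_split_price = pre_split_price / SPLIT_RATIO
post_split_dividend = pre_split_dividend / SPLIT_RATIO

print(f"Post-split price per share:    ${post_split_price:,.2f}")    # $100.00
print(f"Post-split dividend per share: ${post_split_dividend:.3f}")  # $0.010
```

In other words, the split changes the per-share optics, not the underlying valuation: holders end up with ten times the shares at one-tenth the price, and the new 10-cent dividend works out to a penny per post-split share.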
“Nvidia’s sales turned sharply upward about a year ago, after OpenAI’s ChatGPT wowed users with its ability to generate humanlike text. OpenAI used thousands of Nvidia’s AI chips to create ChatGPT, analysts say, and there are few alternatives for the computation-intensive job of creating and deploying such systems.”
And the momentum will likely continue with the company’s next-generation Blackwell AI GPU chips taking the mantle from the current Hopper H100 chips:
“The AI boom turned Nvidia’s chips into hotly contested commodities, with tech CEOs jostling over who has more of them. Determined to stay atop the heap, Nvidia plans to launch a new generation of AI chips late this year, following their unveiling in March at a company conference some dubbed the ‘AI Woodstock.’”
“Those chips, code-named Blackwell, are set to cost more than $30,000 each, setting the stage for a further surge in sales if the appetite for AI chips stays strong and Nvidia fends off challenges from competitors and regulators.”
“Huang on Wednesday said the company was producing the chips now and that shipments would start in the second quarter ahead of them being in operation in the fourth quarter. Those chips would bring in “a lot” of revenue this year, he said.”
The company also continues to make headway on the Inference side of LLM AI computing, an area that investors have been focused on, as I’ve highlighted as well.
So far, the quarter keeps the company on the ‘Accelerated Computing’ execution path it highlighted a few weeks ago, along with the roadmap for its AI products and services going into later this year and next.
The Information adds other salient items from the earnings call of longer-term import, including the company’s software ‘moat’ around its proprietary CUDA AI software platform:
“Even if competitors are able to build better chips, Huang reminded investors of the dominance of Cuda, the Nvidia-made software that AI app developers use in conjunction with its chips, essentially locking developers into Nvidia GPUs. CFO Kress and CEO Huang said recent improvements to Cuda improved inference performance of its H100 chips by three times. ‘That kind of tells you something about the richness of our architecture and the richness of our software,’ Huang said.”
That, and app developers don’t want the hassle of switching to new software if they use another chip!
For now, Nvidia and its foundry partner TSMC continue to be in pole position for AI GPU infrastructure for the next couple of years at least. Stay tuned.
(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)