AI: SK Hynix ready for its AI close-up. RTZ #557
...data center-based HBM (high-bandwidth memory) and local RAM (random access memory) are the key Memory hardware inputs for AI
At this point in the AI Tech Wave, we understand that AI GPUs from Nvidia, AI semi fabs from Taiwan Semiconductor (TSMC), trillions in AI data centers, multi-Gigawatts of Power for said data centers, and tons of related infrastructure for all of the above are critical ‘table stakes’ AI INPUTS in short supply for the next few years.
Add to that of course MEMORY chips, both the high-bandwidth (HBM) kind in data centers, and the local RAM on smartphones led by Apple. The leading company for HBM currently is SK Hynix, a South Korean tech juggernaut. It’s another story of a tech company doing the right thing at the right time in AI. And its memory is a key input for Nvidia itself as it rolls out its next-generation Blackwell AI GPUs.
Bloomberg lays it out in “SK Group’s Billionaire Scion Bet on a Failing Chipmaker and Won Big”:
“South Korea’s Jensen”
“Cutting-edge chips for artificial intelligence have multiplied Nvidia Corp.’s share price and made its co-founder and CEO Jensen Huang a global rock star. Not as well known, but equally dramatic, has been the related rise of SK Group Chairman Chey Tae-won (64) in South Korea. His SK Hynix Inc., a memory maker that had long labored in the shadow of Samsung Electronics Co., has become Nvidia’s key partner in providing advanced, high-density memory chips.”
“The money spigot that Nvidia tapped has also made SK Hynix and Taiwan Semiconductor Manufacturing Co. the most prominent AI beneficiaries in Asia.”
“For Chey, it’s been a transformative time. Better known by the Anglicized “Tony” in international circles, the billionaire has been crisscrossing the globe and chatting with industry leaders like Huang, Microsoft Corp. Chief Executive Officer Satya Nadella and OpenAI boss Sam Altman. This month, he spoke at the Asia-Pacific Economic Cooperation CEO Summit in Peru — which he’ll chair at next year’s edition in Seoul — and hosted his own conference dubbed the SK AI Summit.”
And like many good stories, it has elements of turnaround and redemption:
“It’s a huge rebound from his brush with the law a decade ago, when he was found guilty of mishandling funds and served time in prison — sharing what seems an unfortunately common rite of passage among Korean business leaders, including Samsung heir Jay Y. Lee. Chey himself is a nephew of SK Group’s founder.”
And of course personal transformations:
“Chey has thrown himself into the role of AI evangelist, openly discussing its promises as well as risks and dilemmas. He often muses on the question of how much is the right amount to invest in the capital-intensive global contest for AI leadership. His style of speaking off the cuff contrasts markedly with other Korean tycoons, who shun publicity as much as humanly possible.”
And of course the epic narrative of a leader defying the odds with his conviction and execution:
“The chairman’s renewed confidence reflects SK Hynix’s turbulent history. Chey made a highly risky bet to acquire debt-ridden Hynix in 2012. Against the urging of his lieutenants, the head of oil-to-telecom SK Group made an 11th-hour offer to Hynix’s creditors, which had failed in their two previous attempts to find a suitor. Some thought Chey had lost his mind.”
The company itself has quite the turnaround story as well, occupying a key spot in Box 1 of the AI tech stack:
“Hynix traces its roots to Hyundai Group. It got started as Hyundai Electronics Co. in 1983. It changed hands several times since then, including during the height of Korea’s financial crisis in the late 1990s, when the government required large conglomerates to swap assets among themselves. Then, a deep downturn in DRAM prices plunged the company into the red and its ownership came under its creditors.”
And racing to success under the noses of larger competitors:
“Since the acquisition, SK has put billions of dollars into research and development. Its most critical decision was to continue developing high-bandwidth memory (HBM) at a time when senior leaders at Samsung didn’t see the technology as a priority and practically disbanded their HBM team.”
SK Hynix of course works closely with TSMC on its HBM chip generations, including HBM4:
“When OpenAI unveiled ChatGPT in late 2022 and set off a frenzy of demand for Nvidia’s standard-setting AI accelerators, SK Hynix was ready to ride the wave. HBM is essential to helping those AI chips reach their full potential, and SK Hynix was selected as the primary provider by Nvidia. The Korean firm’s value has shot up to 117 trillion won ($84 billion), more than a 110% increase since the start of 2023. It’s become Korea’s second most valuable company.”
And of course now there is the possible narrative of a larger competitor coming from behind with billions in resources:
“Samsung is still three times as large and vastly better resourced, with its HBM technology on a path to approval by Nvidia. In the meantime, Chey and SK Hynix are pressing their advantage.”
But the persistent newcomer is focused on keeping its lead:
“The smaller memory maker is offering more bonuses and perks than Samsung, to motivate its current employees and to attract more talent from the bigger company — which this year has faced worker strikes and a series of tech setbacks. SK Hynix’s engineers say with a sense of pride, to match that of their reenergized chairman, that HBM actually stands for Hynix’s Best Memory.”
The whole piece is worth reading for the story of SK Hynix’s head start in HBM, in the traditional sense of the acronym: high-bandwidth memory.
Besides Samsung and others, Micron in the US is also playing catch-up in HBM for the hundreds of billions of dollars going into AI data centers. But SK Hynix is the one to beat for now, just as Nvidia is in AI GPUs and TSMC is in semi fabs. Stay tuned.
(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)