AI: "Some day all computers would work this way"
...Steve and Bill's journeys through technologies
The Bigger Picture, Sunday January 14, 2024
“Within ten minutes it was obvious to me that all computers would work this way”. According to Silicon Valley mythology, that’s what Steve Jobs came away with in late 1979 after visiting Xerox Palo Alto Research Center (PARC) and seeing its implementation of a Graphical User Interface (GUI) and a prototypical ‘Mouse’, all developed by Alan Kay’s Learning Research Group.
If Steve had lived longer than 20,984 days, he might have had a similar reaction after seeing ChatGPT and the work by OpenAI with its founder/CEO Sam Altman. The average human lifespan is about 30,000 days, 40,000 if we’re lucky.
Both Steve Jobs (56) and Bill Gates (68), his peer, arch-rival, and eventual friend in the 2007 photo together above, dented the world of computers for the rest of us. And Bill had the opportunity to see AI well before his 25 thousandth day. He got an early peek at OpenAI’s GPT LLM AI and ChatGPT in September 2022, a little earlier than most of us, who first saw ChatGPT when it was released in late November 2022.
For most of us, ‘within ten minutes’ of interacting with ChatGPT, it was obvious ‘that all computers would work this way’, especially after experiencing the ‘multimodal’ versions of ChatGPT and other AI chat agents of late. And thus we are on yet another wild tech ride in these early days of the AI Tech Wave, in all its emerging shapes and forms.
As other tech founder/CEOs like Jensen Huang of Nvidia observed in 2017, “AI software was poised to eat traditional software”, empowering most humans to ‘program’ computers directly in English, or the language of their choice.
Gates has been excited about the ‘Age of AI’ for some time now. And he seems increasingly convinced, too, that ‘all computers will work this way soon’.
So it’s notable that this week he did a podcast episode of his series ‘Unconfuse Me with Bill Gates’, sitting down with Sam Altman, the founder/CEO of OpenAI. The 33-minute interview is worth a listen this Sunday for ‘the Bigger Picture’ around AI.
There is also a transcript of the interview here for those who’d rather see it in that form. The whole episode is worth listening to or reading, but here’s a piece of it that illustrates how someday ‘all computers will work this way’, especially once we get the currently imposing ‘Compute’ costs of LLM AI software and hardware down:
“BILL GATES: In terms of equity, technology is often expensive, like a PC or Internet connection, and it takes time to come down in cost. I guess the costs of running these AI systems, it looks pretty good that the cost per evaluation is going to come down a lot?”
“SAM ALTMAN: It’s come down an enormous amount already. GPT-3, which is the model we’ve had out the longest and the most time to optimize, in the three and a little bit years that it has been out, we’ve been able to bring the cost down by a factor of 40. For three years’ time, that’s a pretty good start.”
“For 3.5, we’ve brought it down, I would bet, close to 10 at this point. Four is newer, so we haven’t had as much time to bring the cost down there, but we will continue to bring the cost down. I think we are on the steepest curve of cost reduction ever of any technology I know, way better than Moore’s Law.”
“It’s not only that we figured out how to make the models more efficient, but also, as we understand the research better, we can get more knowledge, we can get more ability into a smaller model. I think we are going to drive the cost of intelligence down to so close to zero that it will be this before-and-after transformation for society.”
“Right now, my basic model of the world is cost of intelligence, cost of energy. [Bill laughs] Those are the two biggest inputs to quality of life, particularly for poor people, but overall. If you can drive both of those way down at the same time, the amount of stuff you can have, the amount of improvement you can deliver for people, it’s quite enormous.”
“We are on a curve, at least for intelligence, we will really, really deliver on that promise. Even at the current cost, which again, this is the highest it will ever be and much more than we want, for 20 bucks a month, you get a lot of GPT-4 access, and way more than 20 bucks’ worth of value. We’ve come down pretty far.”
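To put Sam’s ‘way better than Moore’s Law’ claim in rough perspective, here’s a quick back-of-envelope sketch in Python. The only figure taken from the interview is the 40x GPT-3 cost reduction; the roughly three-year window and the classic two-year doubling cadence for Moore’s Law are my own illustrative assumptions.

```python
# Back-of-envelope comparison (assumptions noted, not figures from the interview
# except the ~40x GPT-3 cost reduction Sam cites):
years = 3.2                 # assumed time GPT-3 has been available ("three and a little bit years")
ai_cost_factor = 40         # total inference cost reduction per Sam's estimate
moore_doubling_years = 2    # classic Moore's Law cadence: ~2x every two years

ai_annual = ai_cost_factor ** (1 / years)          # ~3.2x cheaper per year
moore_annual = 2 ** (1 / moore_doubling_years)     # ~1.4x cheaper per year
moore_total = moore_annual ** years                # ~3x over the same window

print(f"AI cost curve: ~{ai_annual:.1f}x cheaper per year")
print(f"Moore's Law:   ~{moore_annual:.1f}x cheaper per year")
print(f"Over {years} years: ~{moore_total:.0f}x (Moore) vs ~{ai_cost_factor}x (GPT-3 inference)")
```

Under those assumptions, the AI inference cost curve comes out roughly an order of magnitude steeper than Moore’s Law over the same stretch, which is the gap Sam is pointing at.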
I’ve discussed in previous posts the merits of ‘Small AI’ as well as ‘Big AI’, which Sam touches on with Bill above, and how we’ve barely experienced the beginning of what LLM/Generative AI is about.
Consider the half-hour episode a longer glimpse into why this LLM/Generative AI wave could usher in the mainstream reality of ‘all computers’ working this way soon. It’s a useful ‘Bigger Picture’ this Sunday, as we race through the first month of an AI-driven 2024 to come.
And for me it certainly brings to mind the poignant question of what Steve Jobs would think and say, had he had the same time to experience AI as his friend Bill Gates has today. Stay tuned.