The era of ever-larger artificial intelligence models is coming to an end, according to OpenAI CEO Sam Altman, as cost constraints and diminishing returns curb the relentless scaling that has defined progress in the field.
Speaking at an MIT event last week, Altman suggested that further progress would not come from “giant, giant models.” According to a recent Wired report, he said, “I think we’re at the end of the era where it’s going to be these, like, giant, giant models. We’ll make them better in other ways.”
Though Altman did not cite it directly, one major driver of the pivot away from “scaling is all you need” is the exorbitant and unsustainable cost of training and running the powerful graphics processing units (GPUs) needed for large language models (LLMs). ChatGPT, for instance, reportedly required more than 10,000 GPUs to train, and demands even more resources to run continuously.
Nvidia dominates the GPU market, with about 88% market share, according to Jon Peddie Research. Nvidia’s newest H100 GPUs, designed specifically for AI and high-performance computing (HPC), can cost as much as $30,603 per unit, and even more on eBay.
Training a state-of-the-art LLM can require hundreds of millions of dollars’ worth of computing, said Ronen Dar, cofounder and chief technology officer of Run AI, a compute orchestration platform that speeds up data science initiatives by pooling GPUs.
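For a rough sense of scale, the figures cited above can be combined in a back-of-the-envelope estimate. The sketch below is purely illustrative: the GPU count and unit price are the numbers reported in this article, while the cloud rental rate, run length and everything else are assumptions, not reported figures.

# Illustrative back-of-the-envelope estimate of large-scale LLM training cost.
# GPU count and unit price come from the figures cited in this article; the
# cloud rate and training duration are assumptions for illustration only.
NUM_GPUS = 10_000              # reported scale of the ChatGPT training cluster
GPU_UNIT_PRICE = 30_603        # top-end H100 price cited above, in USD
CLOUD_RATE_PER_GPU_HOUR = 4.0  # assumed cloud rental rate, USD per GPU-hour
TRAINING_DAYS = 90             # assumed length of a single training run

# Buying the hardware outright (ignores networking, power and staffing)
capex = NUM_GPUS * GPU_UNIT_PRICE

# Renting equivalent compute from a cloud provider for one run
cloud_cost = NUM_GPUS * TRAINING_DAYS * 24 * CLOUD_RATE_PER_GPU_HOUR

print(f"Hardware purchase: ~${capex / 1e6:.0f}M")
print(f"Cloud rental for one run: ~${cloud_cost / 1e6:.0f}M")

Either way, the total lands in the hundreds of millions of dollars, consistent with Dar’s estimate.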
As costs have skyrocketed while benefits have leveled off, the economics of scale have turned against ever-larger models. Progress will instead come from improving model architectures, enhancing data efficiency, and advancing algorithmic techniques beyond copy-paste scale. The era of limitless data, computing and model size that remade AI over the past decade is finally drawing to a close.
‘Everyone and their dog is buying GPUs’
In a recent Twitter Spaces interview, Elon Musk confirmed that his companies Tesla and Twitter were buying thousands of GPUs to develop a new AI company that’s now officially called X.ai.
“It seems like everyone and their dog is buying GPUs at this point,” Musk said. “Twitter and Tesla are certainly buying GPUs.”
Dar pointed out that those GPUs are not available on demand, however. Even for the hyperscaler cloud providers like Microsoft, Google and Amazon, it can often take months, so companies are actually reserving access to GPUs. “Elon Musk will have to wait to get his 10,000 GPUs,” he said.
VentureBeat reached out to Nvidia for a comment on Elon Musk’s latest GPU purchase, but did not get a reply.
Not just about the GPUs
Not everyone agrees that a GPU crisis is at the heart of Altman’s comments. “I think it’s actually rooted in a technical observation over the past year that we may have made models larger than necessary,” said Aidan Gomez, co-founder and CEO of Cohere, which competes with OpenAI in the LLM space.
A TechCrunch article on the MIT event reported that Altman sees size as a “false measurement of model quality.”
“I think there’s been way too much focus on parameter count, maybe parameter count will trend up for sure. But this reminds me a lot of the gigahertz race in chips in the 1990s and 2000s, where everybody was trying to point to a big number,” Altman said.
Still, the fact that Elon Musk just bought 10,000 data center-grade GPUs indicates that, for now, access to GPUs is everything. And since that access is so expensive and hard to come by, it is certainly a crisis for all but the most deep-pocketed AI-focused companies. Even OpenAI’s pockets only go so deep; even they, it seems, may eventually have to look in a new direction.