Nvidia’s CEO has said that AI scaling laws will increase computing power a million-fold over the next decade, dismissing fears that artificial intelligence has hit a wall. Jensen Huang’s remarks come as Sam Altman, OpenAI’s chief executive, posted on X earlier this month saying “There is no wall.”
The “wall” in this context refers to the idea that the factors driving advancements in generative AI over the past 15 years—commonly known as scaling laws—have reached their limit.
there is no wall
— Sam Altman (@sama) November 14, 2024
Speaking at a special address in November, Huang said that “the industry’s current trajectory scales computing power four-fold annually, projecting a million-fold increase over a decade.”
He added: “For comparison, Moore’s Law achieved a 100-fold increase per decade. These scaling laws apply not only to LLM (large language model) training but, with the advent of OpenAI Strawberry, also to inference.”
Moore’s Law describes the doubling of transistors on integrated circuits every two years, which drove exponential growth in computing power. AI scaling laws are broader in scope: they describe how performance improves as model size, training data, and compute increase, typically following power-law trends, and so they extend beyond transistor density to other aspects of computation.
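For a sense of scale, four-fold annual growth compounds to 4^10, roughly 1.05 million, over ten years, which is where the million-fold projection comes from, while doubling every two years compounds to only 2^5 = 32 over the same span; on Huang’s framing, the new scaling trajectory is therefore far steeper than Moore’s Law.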
To illustrate why inference also demands more compute, Huang pointed to OpenAI’s latest model, o1, which outperforms GPT-4 in reasoning capabilities. It handles advanced math and complex tasks through a step-by-step process its maker describes as “thinking,” and that enhanced inference relies on far more computing power than a typical ChatGPT response.
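A rough rule of thumb illustrates the gap: a dense transformer spends on the order of two floating-point operations per model parameter for every token it generates, so a model that works through thousands of hidden reasoning tokens before answering can consume many times the compute of one that replies directly, even at the same parameter count.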
“We know that we need more compute whatever the approach is,” he told The Economist. As AI adoption grows, inference becomes increasingly critical. While Nvidia’s earlier GPU generations already handle inference, the new Blackwell chips will improve performance dramatically, making it “dozens of times better.” Already, inference accounts for at least half of Nvidia’s infrastructure use.
Nvidia boss optimistic over AI scaling as revenue soars
Nvidia’s results for the quarter ending in October showcased its continued upward momentum. As ReadWrite reported, the company’s revenue surpassed $35 billion, an impressive 94% year-on-year increase, despite a slight deceleration in growth.
While the next-generation chips aren’t widely available just yet, Nvidia’s Blackwell hardware should give the company another major revenue boost as customers upgrade their data centers or offer the new GPUs to their own clients, further fueling Team Green’s data center business.
Huang continued: “Over the next decade, we will accelerate our roadmap, to keep pace with training and inference scaling demands, and to discover the next plateaus of intelligence.”
Featured image: Ideogram