Nvidia wobbles as world's sole AI winner

A dip in Nvidia's share price, after another set of astonishing results, may reflect concern about the lopsided nature of the market.

Iain Morris, International Editor

August 29, 2024

Jensen Huang, Nvidia's boss, saw his compensation rise 60% last year. (Source: Nvidia)

Few sectors are as grossly lopsided as the young market for what people have taken to calling artificial intelligence but which really smacks of advanced pattern recognition and analytics. On one side are the cash-burning "generative AI" companies like OpenAI and their Big Tech sponsors like Microsoft, not to mention the hundreds of other organizations, including telcos, captivated by a technology that makes them no money. On the other is Nvidia.

The giant chipmaker has emerged from relative obscurity as a producer of graphics processing units (GPUs) for games to become AI's serendipitous sole big winner. GPUs, as luck would have it, are ideal for training the large language models that underpin generative AI, mainly because they can perform many calculations at the same time, unlike the general-purpose central processing units (CPUs) used for routine data-center tasks. Nvidia has faced hardly any competition in this GPU market and, with its CUDA software platform, has built a protective moat around its business.
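To make that parallelism point concrete, here is a minimal sketch (assuming PyTorch and, optionally, a CUDA-capable Nvidia GPU; none of this comes from Nvidia's own materials) that times a single large matrix multiplication, the operation that dominates LLM training, first on a CPU and then on a GPU:

```python
# Illustrative sketch only: sizes and setup are arbitrary assumptions.
import time
import torch

def time_matmul(device: str, n: int = 4096) -> float:
    """Time one n-by-n matrix multiplication on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()      # make sure setup work has finished
    start = time.perf_counter()
    _ = a @ b                         # on a GPU, the multiply-adds run in parallel
    if device == "cuda":
        torch.cuda.synchronize()      # wait for the asynchronous GPU kernel
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")  # typically far faster
```

On typical hardware the GPU run finishes far faster, because the thousands of multiply-adds inside the matrix product are executed simultaneously rather than one after another.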

All this has brought Midas-like wealth to Nvidia and its bosses, keeping CEO Jensen Huang, who has the look of an intellectual Hell's Angel with superior grooming, in a lifetime supply of expensive leather and other luxuries. His own compensation rocketed 60% in Nvidia's last fiscal year, to more than $34 million (exceeding the $32.6 million earned by Rory Read during his final year as boss of Vonage, a company that has had markedly different fortunes from Nvidia since it was bought by Ericsson).

Even the relative lackeys earn juicy rewards. In March, when Nvidia advertised for a senior software engineer to work on Aerial, a radio access network product that is relatively inconsequential to Huang, the midpoint of its promised salary range was $260,000.

Unreal performance

Nvidia can afford all this because its own financial performance looks unreal. For the recently ended second quarter (the May-to-July period), it booked about $30 billion in sales, a year-over-year increase of 122%, and a net profit of almost $17 billion, up 152% on the year-earlier quarter. Yet Nvidia's stock-market reward for all this was a 2.1% dip in its share price, which is still worth about 28 times what it was five years ago. In pre-market trading, at the time of writing, shares were down 4.5%.

The surface reason appears to be twofold. First, Nvidia's guidance for revenues in the current quarter wasn't quite as lofty as analysts were hoping. Second, Blackwell, Nvidia's next-generation GPU, has encountered a few small design problems. Fixing them may take several weeks.

But the small wobble despite such massive gains reflects a growing nervousness about that market imbalance. During the regular call about results, Toshiya Hari, an analyst with Goldman Sachs, asked for Huang's view on the "heated debate" about customers' return on investment "and what that means for the sustainability of capex going forward." In other words: If customers don't see the financial benefits of generative AI, won't they eventually stop buying from Nvidia?

Huang's response to this question and another on the same topic was to insist returns are already good and that a step change is happening in the data-center market as GPUs substitute for CPUs that are "running out of steam," like elite soldiers replacing fallen comrades.

"You have $1 trillion worth of general-purpose computing infrastructure. And the question is, do you want to build more of that or not?" he said, according to a Motley Fool transcript.  "And for every $1 billion worth of Juniper CPU-based infrastructure that you stand up, you probably rent it for less than $1 billion." People are "clamoring" for Hopper, the current-generation GPUs, and Blackwell because "they start saving money" with the switch from CPUs, he said.

But extreme lopsidedness is not a healthy condition. Challengers such as AMD and Intel have failed to make much of an impact or slow Nvidia down. A sign of that is Nvidia's outrageous gross margin, up 4.5 percentage points year-over-year, to 75.7%, for the second quarter. For comparison, Intel weighed in with a gross margin of just 38.7% for its own second quarter (down 1.1 percentage points), while AMD managed 53% (a gain of three points). Nvidia, it seems, can effectively charge what it wants.

The telco pitch

For all the excitement about GPUs and Huang's apparent antipathy toward CPUs, Nvidia may interest telcos mainly as a CPU alternative to Intel. Grace, the CPU part of Nvidia's Grace Hopper superchip, could feasibly be combined with radio access network (RAN) software and replace the more customized silicon found in most of today's infrastructure. Intel dominates this small virtual RAN market today, but its current difficulties could focus attention on cultivating rivals that use Arm, an alternative CPU architecture.

Nvidia now leads this pack, according to one source outside the company who, requesting anonymity, shared results of recent trials comparing Nvidia with other chipmakers. A Grace CPU with 72 Arm "cores" (the building blocks of a chip) beat other Arm licensees on various measures. Combined with an accelerator card from Qualcomm, used to process more demanding code, it also did better than an Intel CPU and accelerator, leaving more cores free for other tasks.

This would not be the preference of Nvidia, which is positioning Hopper with Aerial software as the accelerator. But GPUs are notoriously power-hungry, and even Nvidia admits that a telco deployment makes economic sense only if they are multipurpose, used for AI tasks as well as RAN acceleration. The question is whether those AI benefits will outweigh the upfront GPU costs. If the answer for telcos and other companies is no, that bubble will eventually burst.
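As a back-of-the-envelope illustration of that trade-off, the sketch below uses entirely hypothetical per-site figures (none of them come from Nvidia, Light Reading or any operator) to show when a multipurpose GPU deployment recoups its cost and when it never does:

```python
# Every figure below is a hypothetical assumption for illustration only,
# not a number from Nvidia, Light Reading or any operator.
GPU_COST_PER_SITE = 20_000     # upfront GPU hardware premium per site ($, hypothetical)
EXTRA_POWER_PER_YEAR = 3_000   # extra energy cost vs custom RAN silicon ($/year, hypothetical)
LIFETIME_YEARS = 5             # assumed deployment lifetime

def net_benefit(ai_value_per_year: float) -> float:
    """Net gain over the lifetime from also running AI workloads on the GPU."""
    return LIFETIME_YEARS * (ai_value_per_year - EXTRA_POWER_PER_YEAR) - GPU_COST_PER_SITE

print(net_benefit(8_000))   # 5,000: enough AI work makes the multipurpose GPU pay off
print(net_benefit(2_000))   # -25,000: without it, the GPU never recoups its cost
```

The point of the exercise is only that the sign of the result flips once the AI workloads stop covering the extra power bill and the hardware premium.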

In the meantime, Huang continues to preach about artificial general intelligence (AGI), the next frontier. Generative AI turns out to be not very smart at all – incapable of reasoning or understanding causality, key traits of higher-level intelligence. The "general" ought to be redundant, but it provides a convenient new label and abbreviation with which to market the real thing and differentiate it from its pattern-recognition predecessor. Human behavior toward other species shows that higher forms of intelligence exploit lower ones. So if AGI really does surpass our cognitive abilities, as the cheerleaders insist it will, prepare to be enslaved.


About the Author

Iain Morris

International Editor, Light Reading

Iain Morris joined Light Reading as News Editor at the start of 2015 -- and we mean, right at the start. His friends and family were still singing Auld Lang Syne as Iain started sourcing New Year's Eve UK mobile network congestion statistics. Prior to boosting Light Reading's UK-based editorial team numbers (he is based in London, south of the river), Iain was a successful freelance writer and editor who had been covering the telecoms sector for the previous 15 years. His work has appeared in publications including The Economist (classy!) and The Observer, besides a variety of trade and business journals. He was previously the lead telecoms analyst for the Economist Intelligence Unit, and before that worked as a features editor at Telecommunications magazine. Iain started out in telecoms as an editor at consulting and market-research company Analysys (now Analysys Mason).
