This article was originally published on Fool.com. All figures quoted in US dollars unless otherwise stated.
There's no doubt Nvidia (NASDAQ: NVDA) has been the big winner of the first stage of the AI revolution. In the wake of OpenAI's ChatGPT debut in November 2022, virtually all businesses and cloud giants have clamoured for its market-leading GPUs to fuel generative AI.
Of course, as the outright leader, Nvidia was able to charge a pretty penny for its chips, as evidenced by its skyrocketing revenue and margins over the past 18 months.
But in response, most cloud computing giants have begun investing in their own custom accelerators, which are partially co-designed by ASIC specialists Broadcom (NASDAQ: AVGO) and Marvell Technology.
Could the rise of custom ASICs crowd out demand for Nvidia? In its recent earnings release, Broadcom CEO Hock Tan offered words of caution.
'I flipped in my view last quarter'
Last week, Broadcom released its fiscal third-quarter earnings. Broadcom beat analyst expectations, though its guidance came in a little light.
As its name indicates, Broadcom has broad exposure to the chip industry, and many of its non-AI businesses are still mired in a two-year downturn. However, Broadcom's AI businesses across custom ASICs, networking, and optical interconnect are all booming. In terms of custom ASICs, CEO Hock Tan noted on the conference call with analysts that Broadcom's AI ASIC revenue grew 3.5 times over the past year.
Despite running this ASIC business, Tan used to say that the 'merchant' market, meaning neutral third-party chipmakers like Nvidia, would eventually win the AI accelerator market, in keeping with the history of semiconductors.
But when asked by an analyst last Thursday, Tan noted, "I flipped in my view. And I did that, by the way, last quarter, maybe even six months ago."
Tan now believes the cloud giants will increasingly train large language models on their own custom ASICs, both for the cost advantage of designing one's own chips and for the chance to take some control back from Nvidia.
Those GPUs -- or XPUs -- are much more important. And if that's the case, what better thing to do than bringing the control of your own destiny under your own hands by creating your own custom silicon accelerators?
And that's what I'm seeing all of them do. It's just doing it at different rates and they're starting at different times. But they all have started ... they will all go in that direction totally into ASIC or, as we call it, XPUs, custom silicon.
So, what could this mean for Nvidia?
On its call with analysts, Nvidia disclosed that 45% of its revenue currently comes from cloud hyperscalers, and more than 50% from large internet companies -- likely meaning Meta Platforms -- and other enterprises.
Add the 45% of revenue from the cloud operators to the portion from Meta, which is also ramping its own custom accelerators with Broadcom, and a good chunk of Nvidia's revenue comes from customers actively pursuing in-house alternatives.
Now, Nvidia investors shouldn't assume that all of this cloud demand will go to zero. We still appear to be in the early innings of the AI buildout, and many customers of the cloud companies still prefer Nvidia's latest and greatest chips, with developers already familiar with Nvidia's CUDA software.
Some cloud vendors have been developing custom ASICs for years yet are still clearly buying many Nvidia GPUs. Nvidia's new Blackwell chip, set to be released around the end of the year, will offer a step change in performance over the current Hopper line.
So, there will still be demand for Nvidia chips, even from the cloud vendors.
But will Nvidia's margins and multiple suffer?
While demand for Nvidia's offerings will continue, the company likely won't retain the 90%-plus share of the AI chip market it has garnered to date. Cloud companies may also gain bargaining power and demand lower prices as they ramp up their custom ASICs and offer them to customers at much lower cost.
One reason Nvidia stock may have dipped after its recent earnings report is that gross margins ticked down from their highs, perhaps suggesting limits to a pricing power that seemed limitless throughout 2023 and earlier this year.
After this pullback, Nvidia stock trades at 48 times trailing and 36 times forward earnings estimates. So it's not as expensive as it once was and does not have an outlandish valuation for its quality.
But it still isn't a 'cheap' stock, which leaves Nvidia vulnerable to any setback or hang-up. Any deceleration, whether caused by ASIC competition or some other factor affecting the AI market generally, could limit the stock's gains.