This article was originally published on Fool.com. All figures quoted in US dollars unless otherwise stated.
The proliferation of artificial intelligence (AI) has turned out to be a huge catalyst for the semiconductor industry. That's not surprising: training AI models and moving them into production is possible only with chips that are being deployed in huge numbers in data centers.
This is why Nvidia (NASDAQ: NVDA) has seen a massive surge in demand for its graphics processing units (GPUs). The ability of GPUs to process huge amounts of data in parallel has made them the default choice for cloud service providers training large language models (LLMs). It also explains why Nvidia's data center business has been growing at a phenomenal pace over the past few quarters and is expected to get even bigger in the future.
However, GPUs aren't the only chips in terrific demand thanks to the surge in AI adoption. Micron Technology (NASDAQ: MU), a manufacturer of memory chips, has also been winning big from the growing adoption of AI. This article takes a closer look at how AI has been driving Micron's business and examines why this memory specialist looks like a better AI stock to buy right now than Nvidia.
Micron Technology is benefiting from AI adoption in multiple areas
Micron Technology released its fiscal 2024 fourth-quarter results (for the three months ended Aug. 29) on Sept. 25. The company's revenue shot up 93% year over year to $7.75 billion, exceeding the consensus estimate of $7.65 billion. Additionally, Micron swung to a profit of $1.18 per share last quarter, compared with a loss of $1.07 per share in the same period last year. Analysts were expecting the company to deliver $1.11 per share in earnings.
It wasn't surprising to see Micron crushing Wall Street's estimates, as the memory market's dynamics have been improving because of AI. For instance, demand for Micron's data-center memory chips is exceeding supply, which makes sense given that the likes of Nvidia use these chips in manufacturing their GPUs. More specifically, AI-focused GPUs are equipped with high-bandwidth memory (HBM) chips because of HBM's ability to move huge amounts of data quickly.
That's why Micron expects the HBM market to generate annual revenue of $25 billion in 2025, up from just $4 billion last year. The company adds that it sold "several hundred millions of dollars" worth of HBM last year and has already sold out its HBM capacity for next year.
Meanwhile, the adoption of AI is also improving demand for solid-state drives (SSDs) used in data centers. Micron's data center SSD revenue tripled in fiscal 2024. It won't be surprising to see Micron maintain impressive growth in this segment, as the deployment of AI servers is expected to drive average annual growth of 60% in data-center SSD demand in the coming years, according to market research firm TrendForce.
However, this isn't where Micron's AI-related catalysts end. The adoption of AI-enabled PCs should be another solid growth driver for the company, fueling impressive volume growth in both memory and storage chips. On its latest earnings conference call, Micron CEO Sanjay Mehrotra remarked:
AI PCs require a higher capacity of memory and storage. As an example, leading PC OEMs have recently announced AI-enabled PCs with a minimum of 16GB of DRAM for the value segment and between 32GB to 64GB for the mid and premium segments, versus an average content across all PCs of around 12GB last year.
A similar scenario is unfolding in the smartphone market, where Micron points out that Android original equipment manufacturers (OEMs) are launching AI smartphones with 12 gigabytes (GB) to 16GB of dynamic random access memory (DRAM), up from the average capacity of 8GB seen in 2023 flagship smartphones.
IDC estimates that the market for generative AI smartphones could grow at an annual rate of 78% through 2028. Meanwhile, Gartner estimates that AI PC shipments could jump a whopping 165% next year. The stunning growth of these end markets presents a bright multiyear opportunity for Micron to grow its sales and earnings, thanks to the increase in memory consumption by AI-enabled edge devices such as smartphones and PCs.
All this indicates that Micron is a more diversified AI stock than Nvidia, as it stands to gain from the adoption of this technology in several ways beyond data centers alone. Even better, Micron's forecast suggests that it is about to clock faster growth than its illustrious peer.
Micron's guidance and valuation make it a top AI stock to buy right now
Micron is coming off a terrific fiscal Q4 with remarkable growth in its top and bottom lines, and the company expects the momentum to continue in the first quarter of fiscal 2025. At the midpoint of its guidance ranges, it is calling for $8.7 billion in revenue in the current quarter along with adjusted earnings of $1.74 per share.
The top-line guidance points toward an 85% year-over-year increase, while the bottom line would be a big improvement over the year-ago period's non-GAAP loss of $0.95 per share. For comparison, Nvidia expects its top line to grow 80% year over year in the current quarter. That slowdown was one reason investors pressed the panic button after Nvidia's latest results, as the chipmaker had consistently delivered triple-digit growth in the preceding quarters.
What's more, Micron's valuation means that investors looking to add an AI stock to their portfolios right now would do well to choose it over Nvidia. Micron's forward price-to-earnings ratio of 11 is significantly lower than Nvidia's forward-earnings multiple of 44, making the former a no-brainer investment right now, considering its eye-popping growth.