
Memory milestones for Micron: Just one acronym matters to analysts...

"We are doing well with respect to our goals on HBM3E yields..."

Those in the semiconductor sector have to play the long game; it’s a feast-or-famine industry. Micron has played it with aplomb: after a period of belt-tightening, the memory chip maker is now enjoying a bounty, with data centres consuming record volumes of SSDs and AI demand driving revenue.

“Memory is essential to extend the frontier of AI capability.

"Multiple vectors will drive AI memory demand over the coming years. Growing model sizes and input token requirements, multimodality, multi-agent solutions, continuous training, and the proliferation of inference workloads from cloud to the edge” said CEO Sanjay Mehrotra on a Q4 call.

But there was just one acronym on analysts' minds during a Q&A...

High-Bandwidth Memory (HBM)

The company sold over $1 billion of SSDs into the data centre market during the quarter, a first, and Mehrotra said the growth was not just from AI, noting that “data center SSD demand continues to be driven by strong growth in AI as well as a recovery in traditional compute and storage.”

(A strategy focused on “greater levels of vertical integration, including Micron-designed controllers and firmware” is also paying off, he added.)

But if there was one thing on analysts’ minds, it was high-bandwidth memory (HBM); nearly every question on the call focused on it. As Micron executives said on the previous two earnings calls, the company is sold out of its HBM supply through 2025.


Micron earnings by technology, 2024.

"We are extremely focused on delivering our goals of getting to our share in HBM to be in line with DRAM share sometime in 2025, extremely focused on continuing to ramp up our production capacity and yield ramp, which are going well according to our plan," said Mehrotra, adding that "we are doing well with respect to our goals on HBM3E yields with 8-high. And in '25, of course, we will be increased – beginning our output in early 2025 with 12-high."

(He was referring to the different configurations of its latest HBM: its HBM3E 12-high 36GB cube offers more than 1.2 TB/s of memory bandwidth, for example – the kind of firepower those looking to deliver performant AI workloads are desperate for. An OpenAI engineer previously noted that memory is "frequently the bottleneck, not necessarily compute" when training LLMs.)
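That bandwidth point can be made concrete with a rough, memory-bound calculation. The sketch below is illustrative only: the 1.2 TB/s figure is the HBM3E 12-high bandwidth quoted above, while the 70-billion-parameter model and single-stack serving setup are hypothetical assumptions, not Micron figures.

```python
# Back-of-envelope: why memory bandwidth can cap LLM inference speed.
# In a memory-bound regime, each generated token requires streaming
# all model weights from memory once, so bandwidth bounds tokens/s.

def memory_bound_tokens_per_second(params_billions: float,
                                   bytes_per_param: int,
                                   bandwidth_tb_s: float) -> float:
    """Upper bound on tokens/s if every token must read all weights once."""
    weight_bytes = params_billions * 1e9 * bytes_per_param
    bandwidth_bytes_per_s = bandwidth_tb_s * 1e12
    return bandwidth_bytes_per_s / weight_bytes

# Hypothetical 70B-parameter model in 16-bit precision (2 bytes/param)
# served from a single 1.2 TB/s HBM3E stack:
rate = memory_bound_tokens_per_second(70, 2, 1.2)
print(f"{rate:.1f} tokens/s upper bound")  # roughly 8.6 tokens/s
```

However many FLOPS the attached GPU has, this ceiling does not move without more bandwidth – which is why buyers chase higher-bandwidth, higher-capacity HBM stacks.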


"Twelve-high will be going through its own yield ramp and 12-high will be ramping through our calendar year '25 and HBM4 will be a 2026 product. Like any other new product, of course, there are in the early stages, always ramp-up of yield involved. But we are... doing really quite well in terms of continuing to ramp up the yield and the quality of our products" said Micron's CEO.

Micron invested $8.1 billion in CapEx in fiscal 2024. Fiscal 2025 CapEx, it suggested, will be “meaningfully higher” and comfortably over $10 billion.

It did not mention its previous problems in China (where the government had suggested its products represented a security risk) but said it continues work on its facility in Xi'an.
