Micron announces shipment of HBM4 36GB samples, improving AI performance and power efficiency to support growing data center demands.
Quiver AI Summary
Micron Technology, Inc. has announced the shipment of HBM4 36GB 12-high samples to key customers, highlighting its leadership in high-performance memory essential for AI applications. The new HBM4 memory features a 2048-bit interface, delivering speeds over 2.0 TB/s, which is more than 60% faster than its predecessor, HBM3E, and offers over 20% better power efficiency. This advancement is critical for managing inference in generative AI workloads, enabling quicker insights and fostering innovation across various sectors like healthcare and finance. Micron plans to ramp up production of HBM4 in 2026 to align with customer needs for next-gen AI platforms, reinforcing its position as a key player in the AI memory market.
Potential Positives
- Micron Technology announced the successful shipment of HBM4 36GB 12-high samples, reinforcing its leadership in memory performance and power efficiency for AI applications.
- The new HBM4 memory features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and over 60% better performance compared to the previous generation, enhancing AI inference capabilities.
- HBM4 offers over 20% better power efficiency than the previous HBM3E products, maximizing data center efficiency while providing high throughput.
- Micron's role in the AI revolution is solidified as it aligns the ramp of HBM4 production with customers' next-generation AI platform readiness, ensuring seamless integration into the evolving technology landscape.
Potential Negatives
- The announcement focuses heavily on the upcoming HBM4 product but lacks specificity regarding mass production timelines, which may raise concerns about the company's ability to meet customer demands and expectations.
- Although the release emphasizes improvements in performance and power efficiency, it does not provide comparative data on market competition, which may leave doubts about Micron's competitive positioning.
- The press release highlights future plans for the HBM4 ramp-up but does not disclose any risks or challenges associated with producing or integrating the new technology, potentially downplaying setbacks.
FAQ
What is Micron's latest memory product?
Micron has announced the shipment of HBM4 36GB 12-high samples to key customers, enhancing AI capabilities.
How does HBM4 improve AI performance?
HBM4 features a 2048-bit interface achieving speeds over 2.0 TB/s, enhancing inference performance for AI models.
What are the power efficiency improvements of HBM4?
HBM4 offers over 20% better power efficiency than the previous HBM3E generation, maximizing throughput while minimizing power consumption.
When will HBM4 be ramped for production?
Micron plans to ramp HBM4 production in calendar year 2026, aligning with customer AI platform readiness.
What sectors can benefit from HBM4 technology?
HBM4 will drive innovations in healthcare, finance, and transportation, facilitating quicker insights and discoveries.
Disclaimer: This is an AI-generated summary of a press release distributed by GlobeNewswire. The model used to summarize this release may make mistakes. See the full release here.
$MU Congressional Stock Trading
Members of Congress have traded $MU stock 4 times in the past 6 months. Of those trades, 1 has been a purchase and 3 have been sales.
Here’s a breakdown of recent trading of $MU stock by members of Congress over the last 6 months:
- REPRESENTATIVE ROBERT BRESNAHAN sold up to $15,000 on 04/08.
- REPRESENTATIVE JULIE JOHNSON sold up to $15,000 on 04/07.
- REPRESENTATIVE JEFFERSON SHREVE sold up to $50,000 on 04/07.
- SENATOR ASHLEY MOODY purchased up to $100,000 on 01/22.
To track congressional stock trading, check out Quiver Quantitative's congressional trading dashboard.
$MU Insider Trading Activity
$MU insiders have traded $MU stock on the open market 19 times in the past 6 months. Of those trades, 0 have been purchases and 19 have been sales.
Here’s a breakdown of recent trading of $MU stock by insiders over the last 6 months:
- SCOTT J DEBOER (EVP, CTO and Products Officer) sold 23,047 shares for an estimated $2,304,700.
- APRIL S ARNZEN (EVP and Chief People Officer) has made 0 purchases and 2 sales selling 19,890 shares for an estimated $1,859,376.
- MICHAEL CHARLES RAY (SVP, Chief Legal Officer) has made 0 purchases and 2 sales selling 13,050 shares for an estimated $1,305,000.
- SANJAY MEHROTRA (President and CEO) has made 0 purchases and 6 sales selling 7,500 shares for an estimated $784,985.
- SCOTT R. ALLEN (CVP, Chief Accounting Officer) sold 5,000 shares for an estimated $508,800.
- MARY PAT MCCARTHY has made 0 purchases and 7 sales selling 4,808 shares for an estimated $415,712.
To track insider transactions, check out Quiver Quantitative's insider trading dashboard.
$MU Hedge Fund Activity
We have seen 938 institutional investors add shares of $MU stock to their portfolio, and 902 decrease their positions in their most recent quarter.
Here are some of the largest recent moves:
- CAPITAL WORLD INVESTORS added 13,185,983 shares (+32.1%) to their portfolio in Q1 2025, for an estimated $1,145,730,062
- INFINITUM ASSET MANAGEMENT, LLC added 6,871,200 shares (+8589.0%) to their portfolio in Q1 2025, for an estimated $597,038,568
- JPMORGAN CHASE & CO removed 6,543,762 shares (-20.7%) from their portfolio in Q1 2025, for an estimated $568,587,480
- NORGES BANK added 5,799,020 shares (+49.7%) to their portfolio in Q4 2024, for an estimated $488,045,523
- CAPITAL INTERNATIONAL INVESTORS added 4,892,312 shares (+29.8%) to their portfolio in Q1 2025, for an estimated $425,092,989
- NATIONAL BANK OF CANADA /FI/ removed 4,134,305 shares (-50.4%) from their portfolio in Q1 2025, for an estimated $359,229,761
- D. E. SHAW & CO., INC. removed 4,031,603 shares (-96.1%) from their portfolio in Q1 2025, for an estimated $350,305,984
To track hedge funds' stock portfolios, check out Quiver Quantitative's institutional holdings dashboard.
$MU Analyst Ratings
Wall Street analysts have issued reports on $MU in the last several months. We have seen 3 firms issue buy ratings on the stock, and 1 firm issue a sell rating.
Here are some recent analyst ratings:
- Wedbush issued an "Underperform" rating on 04/28/2025
- Citigroup issued a "Buy" rating on 04/01/2025
- Barclays issued an "Overweight" rating on 03/21/2025
- KeyBanc issued an "Overweight" rating on 12/19/2024
To track analyst ratings and price targets for $MU, check out Quiver Quantitative's $MU forecast page.
$MU Price Targets
Multiple analysts have issued price targets for $MU recently. We have seen 5 analysts offer price targets for $MU in the last 6 months, with a median target of $125.0.
Here are some recent targets:
- An analyst from Morgan Stanley set a target price of $112.0 on 03/21/2025
- Matt Bryson from Wedbush set a target price of $125.0 on 12/19/2024
- Brian Chin from Stifel Nicolaus set a target price of $130.0 on 12/19/2024
- Tom O'Malley from Barclays set a target price of $110.0 on 12/19/2024
- Aaron Rakers from Wells Fargo set a target price of $140.0 on 12/19/2024
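For reference, a minimal sketch (using only the five targets listed above) confirming that the $125.0 median follows from those figures:

```python
# Quick check: median of the five price targets listed above.
from statistics import median

# Morgan Stanley, Wedbush, Stifel Nicolaus, Barclays, Wells Fargo
targets = [112.0, 125.0, 130.0, 110.0, 140.0]
print(median(targets))  # -> 125.0
```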
Full Release
BOISE, Idaho, June 10, 2025 (GLOBE NEWSWIRE) -- The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc. (Nasdaq: MU), today announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.
A leap forward
As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation.¹ This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively.
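As a rough sanity check on these figures, the sketch below relates the 2048-bit interface to per-stack bandwidth and reproduces the greater-than-60% generational claim from the footnoted numbers (2.0 TB/s vs. 1.2 TB/s). The per-pin data rate used here is an assumption chosen for illustration; the release does not state one.

```python
# Sketch: relate interface width and an assumed per-pin data rate to per-stack bandwidth.
# The 8 Gb/s per-pin rate below is an assumption (not stated in the release),
# chosen so the arithmetic lands near the announced ">2.0 TB/s" figure.

interface_width_bits = 2048      # HBM4 interface width per stack (from the release)
assumed_pin_rate_gbps = 8.0      # assumed per-pin data rate in Gb/s (illustrative only)

# bandwidth (TB/s) = pins * Gb/s per pin / 8 bits-per-byte / 1000 GB-per-TB
bandwidth_tbps = interface_width_bits * assumed_pin_rate_gbps / 8 / 1000
print(f"Estimated per-stack bandwidth: {bandwidth_tbps:.2f} TB/s")  # ~2.05 TB/s

# Generational uplift using the footnoted figures: HBM4 ~2.0 TB/s vs. HBM3E ~1.2 TB/s
uplift_pct = (2.0 / 1.2 - 1) * 100
print(f"Uplift over HBM3E: {uplift_pct:.0f}%")  # ~67%, i.e. "more than 60%"
```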
Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron’s previous-generation HBM3E products, which first established new, unrivaled benchmarks in HBM power efficiency in the industry.² This improvement provides maximum throughput with the lowest power consumption to maximize data center efficiency.²
Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.
"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."
Intelligence Accelerated: Micron’s role in the AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers’ most demanding solutions.
Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers’ next-generation AI platforms. For more information on Micron HBM4, visit https://www.micron.com/products/memory/hbm.
About Micron Technology, Inc.
Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.
© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.
Micron Product and Technology Communications Contact:
Mengxi Liu Evensen
+1 (408) 444-2276
[email protected]
Micron Investor Relations Contact:
Satya Kumar
+1 (408) 450-6199
[email protected]
¹ Based on internal Micron HBM4 testing and published HBM3E specifications (2.0 TB/s vs. 1.2 TB/s).
² Based on internal Micron simulation projections in comparison to Micron HBM3E 36GB 12-high and similar competitive products.