The artificial intelligence (AI) revolution is no longer a distant promise—it’s a present-day reality. At the forefront of this transformation is Nvidia, whose dominance in AI chips has made it a household name. Yet, beneath the spotlight of its GPUs lies a subtler but equally critical shift: the redefinition of AI infrastructure. By acquiring AI startups, investing in cloud partnerships, and expanding domestic manufacturing, Nvidia is not just building a chip empire—it’s reshaping the entire ecosystem. For investors, this signals an opportunity to look beyond the GPU spotlight and uncover underappreciated players like Micron Technology, whose role in AI infrastructure is poised to deliver outsized returns.
The Nvidia Effect: From Chips to Ecosystems
Nvidia’s 2025 acquisitions of Gretel, Lepton AI, and CentML highlight its strategy to control the full AI stack. Gretel’s synthetic data tools address privacy and data scarcity, while Lepton AI’s GPU rental platform democratizes access to AI compute. CentML’s optimization software ensures these models run efficiently on Nvidia hardware. These moves are not isolated; they reflect a broader trend. In 2024 alone, Nvidia participated in 49 funding rounds for AI startups, including a $100 million stake in OpenAI’s $6.6 billion round and participation in Wayve’s $1.05 billion autonomous-driving raise.
Nvidia’s investments are about ecosystem control as much as financial returns. By backing startups that enhance data quality, cloud accessibility, and model efficiency, Nvidia ensures its GPUs remain the backbone of AI innovation. This strategy is amplified by its domestic manufacturing partnerships with TSMC, Foxconn, and Wistron, which aim to produce up to $500 billion of AI infrastructure in the United States over the next four years. The result? A self-reinforcing cycle in which Nvidia’s hardware, software, and cloud partnerships lock in market share.
The Hidden Catalyst: Micron Technology’s AI Infrastructure Play
While Nvidia grabs headlines, Micron Technology operates in the shadows of the AI boom. As a supplier of high-bandwidth memory (HBM), Micron makes HBM3E chips that are critical for AI training and inference, feeding Nvidia’s Blackwell Ultra B300 GPU and its GB200 and GB300 systems. What sets Micron apart is its technological edge: its 12-high HBM3E design offers 50% more memory capacity than SK Hynix’s 8-high version, a direct consequence of stacking 12 DRAM dies rather than 8 at comparable per-die density, while also reducing power consumption by roughly 30%.
Micron’s financials underscore its strategic importance. In fiscal Q2 2025, the company reported non-GAAP diluted EPS of $1.56, with operating margins at 24.9% and EBITDA margins at 50.7%. HBM gross margins of 50–55%, well above industry averages, reflect its premium positioning. Data center revenue now accounts for over half of Micron’s total revenue, a testament to surging demand for AI memory. Micron also raised its revenue guidance to $11.1–$11.3 billion, with non-GAAP EPS projected to jump 141% year-over-year.
Why Micron Is Undervalued—and Why That’s a Problem
Despite these strengths, Micron trades at a forward P/E of 10, well below the Nasdaq-100’s 30. This undervaluation stems from market myopia: investors fixate on AI chipmakers like Nvidia and AMD while overlooking the memory and storage layer beneath them. Yet the HBM market is projected to grow from $4 billion in 2023 to $130 billion by 2033, with Micron expected to capture roughly 24% of that market by the end of 2025.
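That projection implies a very steep growth curve. As a rough sanity check, the sketch below works out the compound annual growth rate implied by the article’s own figures (the $4 billion 2023 base, the $130 billion 2033 estimate, and the 24% share assumption come from the text, not from company guidance).

```python
# Back-of-the-envelope check on the HBM market projection cited above.
# Inputs are the figures quoted in this article, not fresh estimates.

start_value = 4e9        # HBM market size in 2023 (USD)
end_value = 130e9        # projected HBM market size in 2033 (USD)
years = 2033 - 2023      # number of compounding periods

# Compound annual growth rate implied by the projection
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied HBM market CAGR: {cagr:.1%}")            # ~41.6% per year

# Hypothetical revenue attributable to a 24% share of the 2033 market
micron_share = 0.24
print(f"24% of a $130B market: ${end_value * micron_share / 1e9:.0f}B per year")
```

Even if the 2033 estimate proves optimistic, the arithmetic shows why the memory layer can grow faster than the broader AI chip market it supports.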
Micron’s partnerships with Nvidia and AMD further cement its role. Its HBM3E is already in production for next-generation AI systems, and its $14 billion capital-expenditure plan for 2025, which funds new facilities in Idaho and Singapore, positions it to meet surging demand. Its $150 billion U.S. manufacturing commitment also aligns with global supply-chain resilience trends, making it a strategic partner for hyperscalers like Microsoft and Amazon.
The Investment Case: Balancing Risk and Reward
For investors, the key is to balance Nvidia’s visibility with Micron’s potential. While Nvidia’s multitrillion-dollar market cap and $100 million OpenAI investment signal its dominance, Micron’s roughly $70 billion valuation offers a more compelling risk-reward profile. Its HBM market-share growth, 50%+ gross margins, and alignment with AI’s infrastructure needs make it a high-conviction play.
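To make that risk-reward comparison concrete, the sketch below spells out what the quoted multiples imply. It uses only figures already cited in this article (the roughly $70 billion market cap and the forward P/E ratios of 10 and 30) and is illustrative arithmetic, not a valuation model.

```python
# Illustrative comparison of what the quoted multiples imply.
# Inputs are the figures cited in this article, not fresh estimates.

micron_market_cap = 70e9      # Micron's approximate market cap (USD)
micron_forward_pe = 10        # forward P/E cited for Micron
nasdaq_forward_pe = 30        # forward P/E cited for the Nasdaq-100

# Forward earnings implied by Micron's market cap and multiple
implied_forward_earnings = micron_market_cap / micron_forward_pe
print(f"Implied forward earnings: ${implied_forward_earnings / 1e9:.0f}B")   # ~$7B

# Hypothetical value of those same earnings at the index multiple
rerated_value = implied_forward_earnings * nasdaq_forward_pe
print(f"Value at a Nasdaq-100 multiple: ${rerated_value / 1e9:.0f}B")        # ~$210B

# Earnings yield comparison (the inverse of P/E)
print(f"Micron earnings yield: {1 / micron_forward_pe:.1%} "
      f"vs index: {1 / nasdaq_forward_pe:.1%}")                              # 10.0% vs 3.3%
```

The gap between those two values is the re-rating opportunity; whether it closes depends on the memory cycle as much as on AI demand, which is why the risks below matter.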
However, risks persist. The HBM market is capital-intensive, and Micron’s $14 billion capex could strain short-term margins. Additionally, geopolitical tensions and supply chain disruptions could impact its U.S. manufacturing plans. Yet, these risks are mitigated by Micron’s technological leadership and long-term contracts with AI leaders.
Conclusion: Look Beyond the GPU Spotlight
Nvidia’s strategic acquisitions and investments are more than corporate maneuvering; they are a blueprint for the future of AI. By controlling the ecosystem, from data to cloud to hardware, Nvidia ensures its dominance. But for investors seeking alpha, the real opportunity lies in underappreciated players like Micron. As AI adoption accelerates, demand for HBM and other memory solutions will outpace even the most bullish GPU forecasts. Micron’s undervalued stock, robust financials, and critical role in AI infrastructure make it a compelling addition to any portfolio.
In the AI arms race, the winners won’t just be the chipmakers—they’ll be the enablers. And Micron, with its HBM3E and strategic vision, is ready to lead.