
AI Research

Advanced Micro Devices vs. Micron Technology


  • Shares of AMD and Micron Technology have soared impressively in the past three months.

  • Both are set to benefit from the same AI-driven end markets, but one of them is growing at a much faster pace.

  • A comparison of their valuations makes it clear which of these semiconductor stocks is worth buying right now.


The demand for artificial intelligence (AI) chips has been increasing rapidly in the past few years. Major cloud service providers (CSPs), hyperscalers, and governments have been spending heavily to build out their cloud infrastructure so that they can run AI workloads.

This explains why the businesses of Advanced Micro Devices (NASDAQ: AMD) and Micron Technology (NASDAQ: MU) have gained terrific traction in recent quarters. As a result, shares of both these chipmakers have clocked impressive gains in the past three months. AMD has jumped 32% during this period, and Micron stock is up 36%.

But if you had to put your money into just one of these AI semiconductor stocks right now, which one should it be? Let’s find out.

Image source: Getty Images.

AMD designs chips that go into personal computers (PCs), servers, and gaming consoles, and for other applications such as robotics, automotive, and industrial automation. AI has created impressive demand for the company’s chips in these areas, leading to healthy growth in its top and bottom lines.

The company’s revenue in the first quarter of 2025 was up by 36% from the year-ago period to $7.4 billion, while non-GAAP earnings per share shot up by 55% to $0.96. This solid growth was primarily driven by the data center and PC markets, which accounted for 81% of its top line. AMD’s data center revenue was up by 57% from the year-ago period, while the PC business reported a 68% increase.

In the data center business, AMD sells both central processing units (CPUs) and graphics processing units (GPUs) that are deployed in AI servers. The demand for both these products is strong, which is evident from the terrific growth the company recorded in Q1. Importantly, AMD estimates that the market for AI accelerator chips in data centers could create a $500 billion annual revenue opportunity in 2028.

So, the outstanding growth that AMD clocked in the data center business in Q1 seems sustainable, especially considering that it generated $12.6 billion in revenue from data center chip sales last year — nearly double the 2023 revenue. AMD is pushing the envelope on the product development front with new chips that are expected to pack in a serious performance upgrade and may even help it take market share away from Nvidia.

Meanwhile, AMD’s consistent market share gains in PC CPUs make it a solid bet on the secular growth of the AI PC market, which is expected to clock an annual growth rate of 42% in shipments through 2028. All this indicates that AMD is on track to take advantage of the growing adoption of AI chips in multiple applications, and that’s expected to lead to an acceleration in its bottom-line growth.

Consensus estimates are projecting a 17% jump in AMD’s earnings this year, followed by a bigger jump of 45% in 2026. As such, this semiconductor company is likely to remain a top AI stock in the future as well.

Micron Technology manufactures and sells memory chips that are used for both computing and storage purposes, and the likes of AMD and Nvidia are its customers. In fact, just like AMD, Micron serves multiple AI end markets: its memory chips go into AI accelerators such as GPUs and custom processors, as well as into PCs, smartphones, and automotive applications.

Micron has been witnessing outstanding demand for a type of chip known as high-bandwidth memory (HBM), which is known for its ability to transmit huge amounts of data at high speeds. This is the reason why HBM is being deployed in AI accelerators, and the demand for this memory type is so strong that the likes of Micron have already sold out their capacity for this year.

Not surprisingly, Micron is ramping up its HBM production capacity, and it’s going to increase its capital expenditure to $14 billion in the current fiscal year from $8.1 billion in the previous one. The company’s focus on improving its HBM production capacity is a smart thing to do from a long-term perspective, as this market is expected to grow to $100 billion in annual revenue by 2030, compared to $35 billion this year.
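Those HBM market figures imply a steep growth trajectory. As a rough sanity check (assuming "this year" refers to 2025, so the $100 billion mark lands five years out), the implied compound annual growth rate works out to roughly 23%:

```python
# Implied compound annual growth rate (CAGR) of the HBM market, assuming
# "this year" means 2025 and the $100 billion figure applies to 2030.
start_size = 35e9    # 2025 HBM market, in dollars
end_size = 100e9     # projected 2030 HBM market
years = 5
cagr = (end_size / start_size) ** (1 / years) - 1
print(f"Implied HBM CAGR: {cagr:.1%}")  # roughly 23% per year
```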

Micron’s memory chips are used in PCs and smartphones as well. Apart from the growth in unit volumes that AI-enabled PCs and smartphones are expected to create going forward, the amount of memory going into these devices is also expected to increase. CEO Sanjay Mehrotra remarked on the company’s latest earnings conference call:

AI adoption remains a key driver of DRAM content growth for smartphones, and we expect more smartphone launches featuring 12 gigabytes or more compared to eight gigabytes of capacity in the average smartphone today.

Similarly, AI-enabled PCs are expected to sport at least 16GB of DRAM to run AI workloads, up by a third when compared to the average DRAM content in PCs last year. So, just like AMD, Micron is on its way to capitalizing on multiple AI-focused end markets. However, it is growing at a much faster pace than AMD because of the tight memory supply created by AI, which is leading to a nice increase in memory prices.
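The "up by a third" figure lets us back out last year's average PC DRAM content from the numbers as stated (a quick arithmetic check; the 12GB result is derived here, not quoted from the article):

```python
# Back out last year's average PC DRAM content from the stated figures:
# AI-enabled PCs are expected to carry at least 16GB, described as
# "up by a third" versus last year's average.
ai_pc_dram_gb = 16
last_year_avg_gb = round(ai_pc_dram_gb / (1 + 1/3))
print(last_year_avg_gb)  # 12
```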

The favorable pricing environment is the reason why Micron’s adjusted earnings more than tripled in the previous quarter to $1.91 per share on the back of a 37% increase in its top line. Analysts are forecasting a 6x jump in Micron’s earnings in the current fiscal year, and they have raised their earnings expectations for the next couple of years as well.

MU EPS Estimates for Current Fiscal Year; data by YCharts.

So, Micron stock seems poised to sustain its impressive growth momentum, thanks to the AI-fueled demand for HBM.

Both AMD and Micron are growing at solid rates, with the latter clocking a much faster pace thanks to the favorable demand-supply dynamics in the memory industry. What's more, Micron is trading at a significantly cheaper valuation than AMD, despite its substantially stronger growth.

AMD PE Ratio; data by YCharts.

Investors looking for a mix of value and growth can pick Micron over AMD, considering the former's attractive valuation and the phenomenal earnings growth that it can deliver. However, one can't go wrong with AMD either. The company should be able to justify its valuation in the long run, considering the expected acceleration in its earnings growth.

Before you buy stock in Micron Technology, consider this:

The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Micron Technology wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.

Consider when Netflix made this list on December 17, 2004… if you invested $1,000 at the time of our recommendation, you’d have $699,558!* Or when Nvidia made this list on April 15, 2005… if you invested $1,000 at the time of our recommendation, you’d have $976,677!*

Now, it’s worth noting Stock Advisor’s total average return is 1,060% — a market-crushing outperformance compared to 180% for the S&P 500. Don’t miss out on the latest top 10 list, available when you join Stock Advisor.

See the 10 stocks »

*Stock Advisor returns as of June 30, 2025

Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices and Nvidia. The Motley Fool has a disclosure policy.

Better Artificial Intelligence (AI) Stock: Advanced Micro Devices vs. Micron Technology was originally published by The Motley Fool





OpenAI business to burn $115 billion through 2029: The Information


OpenAI CEO Sam Altman walks on the day of a meeting of the White House Task Force on Artificial Intelligence (AI) Education in the East Room at the White House in Washington, D.C., U.S., September 4, 2025.

Brian Snyder | Reuters

OpenAI has sharply raised its projected cash burn through 2029 to $115 billion as it ramps up spending to power the artificial intelligence behind its popular ChatGPT chatbot, The Information reported on Friday.

The new forecast is $80 billion higher than the company previously expected, the news outlet said, without citing a source for the report.

OpenAI, which has become one of the world’s biggest renters of cloud servers, projects it will burn more than $8 billion this year, some $1.5 billion higher than its projection from earlier this year, the report said.

The company did not immediately respond to Reuters' request for comment.

To control its soaring costs, OpenAI will seek to develop its own data center server chips and facilities to power its technology, The Information said.

OpenAI is set to produce its first artificial intelligence chip next year in partnership with U.S. semiconductor giant Broadcom, the Financial Times reported on Thursday, saying OpenAI plans to use the chip internally rather than make it available to customers.

The company deepened its tie-up with Oracle in July with a planned 4.5 gigawatts of data center capacity, building on its Stargate initiative, a project of up to $500 billion and 10 gigawatts that includes Japanese technology investor SoftBank. OpenAI has also added Alphabet's Google Cloud among its suppliers for computing capacity.

The company’s cash burn will more than double to over $17 billion next year, $10 billion higher than OpenAI’s earlier projection, with a burn of $35 billion in 2027 and $45 billion in 2028, The Information said.
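Taking the yearly figures at face value (treating "more than $8 billion" and "over $17 billion" as roughly $8 billion and $17 billion), a quick sum shows how they relate to the $115 billion total; the 2029 residual is only implied, and the small gap suggests the reported numbers are rounded:

```python
# Yearly cash burn projections as reported, in $ billions.
burn = {2025: 8, 2026: 17, 2027: 35, 2028: 45}
cumulative_target = 115  # reported total through 2029

stated_sum = sum(burn.values())
implied_2029 = cumulative_target - stated_sum
print(stated_sum, implied_2029)  # 105 10
```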







The Energy Monster AI Is Creating


We don’t really know how much energy artificial intelligence is consuming. There aren’t any laws currently on the books requiring AI companies to disclose their energy usage or environmental impact, and most firms therefore opt to keep that controversial information close to the vest. Plus, large language models are evolving all the time, increasing in both complexity and efficiency, complicating outside efforts to quantify the sector’s energy footprint. But while we don’t know exactly how much electricity data centers are eating up to power ever-increasing AI integration, we do know that it’s a whole lot. 

“AI’s integration into almost everything from customer service calls to algorithmic ‘bosses’ to warfare is fueling enormous demand,” the Washington Post recently reported. “Despite dramatic efficiency improvements, pouring those gains back into bigger, hungrier models powered by fossil fuels will create the energy monster we imagine.”

And that energy monster is weighing heavily on the minds of policymakers around the world. Global leaders are busily wringing their hands over the potentially disastrous impact AI could have on energy security, especially in countries like Ireland, Saudi Arabia, and Malaysia, where planned data center development outpaces planned energy capacity. 

Racing to stay ahead of a critical energy shortage, public and private entities on both the tech and energy sides of the issue have been scrambling to increase energy production capacity by any means: building new power plants and keeping existing energy projects online beyond their planned closure dates. Many of these projects are fossil fuel plants, prompting an outcry that indiscriminate integration of artificial intelligence is undermining the decarbonization goals of nations and tech firms the world over.

“From the deserts of the United Arab Emirates to the outskirts of Ireland’s capital, the energy demands of AI applications and training running through these centres are driving the surge of investment into fossil fuels,” reports the Financial Times. Globally, more than 85 gas-powered facilities are currently being built to meet AI’s energy demand, according to figures from Global Energy Monitor.

In the United States, the demand surge is leading to the resurrection of old coal plants. Coal has been in terminal decline for years now in the U.S., and a large number of defunct plants are scattered around the country with valuable infrastructure that could lend itself to a speedy new power plant hookup. Thanks to the AI revolution, many of these plants are now set to come back online as natural gas-fired plants. While gas is cleaner than coal, the coal-to-gas route may come at the expense of clean energy projects that could have otherwise used the infrastructure and coveted grid hookups of defunct coal-fired power plants. 

“Our grid isn’t short on opportunity — it’s short on time,” Carson Kearl, Enverus senior analyst for energy and AI, recently told Fortune. “These grid interconnections are up for grabs for new power projects when these coal plants roll off. The No. 1 priority for Big Tech has changed to [speed] to energy, and this is the fastest way to go in a lot of cases,” Kearl continued.

Last year, Google stated that the company’s carbon emissions had skyrocketed by a whopping 48 percent over the last five years thanks to its AI integration. “AI-powered services involve considerably more computer power – and so electricity – than standard online activity, prompting a series of warnings about the technology’s environmental impact,” the BBC reported last summer. Google had previously pledged to reach net zero greenhouse gas emissions by 2030, but the company now concedes that “as we further integrate AI into our products, reducing emissions may be challenging.”

By Haley Zaremba for Oilprice.com 









Who is Shawn Shen? The Cambridge alumnus and ex-Meta scientist offering $2M to poach AI researchers


Shawn Shen, co-founder and Chief Executive Officer of the artificial intelligence (AI) startup Memories.ai, has made headlines for offering compensation packages worth up to $2 million to attract researchers from top technology companies. In a recent interview with Business Insider, Shen explained that many scientists are leaving Meta, the parent company of Facebook, due to constant reorganisations and shifting priorities. “Meta is constantly doing reorganizations. Your manager and your goals can change every few months. For some researchers, it can be really frustrating and feel like a waste of time,” Shen told Business Insider, adding that this is a key reason why researchers are seeking roles at startups. He also cited Meta Chief Executive Officer Mark Zuckerberg’s philosophy that “the biggest risk is not taking any risks” as a motivation for his own move into entrepreneurship.

With Memories.ai, a company developing AI capable of understanding and remembering visual data, Shen is aiming to build a niche team of elite researchers. His company has already recruited Chi-Hao Wu, a former Meta research scientist, as Chief AI Officer, and is in talks with other researchers from Meta’s Superintelligence Lab as well as Google DeepMind.

From full scholarships to Cambridge classrooms

Shen’s academic journey is rooted in engineering, supported consistently by merit-based scholarships. He studied at Dulwich College from 2013 to 2016 on a full scholarship, completing his A-Level qualifications. He then pursued higher education at the University of Cambridge, where he was awarded full scholarships throughout. Shen earned a Bachelor of Arts (BA) in Engineering (2016–2019), followed by a Master of Engineering (MEng) at Trinity College (2019–2020). He later continued at Cambridge as a Meta PhD Fellow, completing his Doctor of Philosophy (PhD) in Engineering between 2020 and 2023.

Early career: Internships in finance and research

Alongside his academic pursuits, Shen gained early experience through internships and analyst roles in finance. He worked as a Quantitative Research Summer Analyst at Killik & Co in London (2017) and as an Investment Banking Summer Analyst at Morgan Stanley in Shanghai (2018). Shen also interned as a Research Scientist at the Computational and Biological Learning Lab at the University of Cambridge (2019), building the foundations for his transition into advanced AI research.

From Meta’s Reality Labs to academia

After completing his PhD, Shen joined Meta (Reality Labs Research) in Redmond, Washington, as a Research Scientist (2022–2024). His time at Meta exposed him to cutting-edge work in generative AI, but also to the frustrations of frequent corporate restructuring. This experience eventually drove him toward building his own company. In April 2024, Shen began his academic career as an Assistant Professor at the University of Bristol, before launching Memories.ai in October 2024.

Betting on talent with $2M offers

Explaining his company’s aggressive hiring packages, Shen told Business Insider: “It’s because of the talent war that was started by Mark Zuckerberg. I used to work at Meta, and I speak with my former colleagues often about this. When I heard about their compensation packages, I was shocked — it’s really in the tens of millions range. But it shows that in this age, AI researchers who make the best models and stand at the frontier of technology are really worth this amount of money.” Shen noted that Memories.ai is looking to recruit three to five researchers in the next six months, followed by up to ten more within a year. The company is prioritising individuals willing to take a mix of equity and cash, with Shen emphasising that these recruits would be treated as founding members rather than employees. By betting heavily on talent, Shen believes Memories.ai will be in a strong position to secure additional funding and establish itself in the competitive AI landscape. His bold $2 million offers may raise eyebrows, but they also underline a larger truth: in today’s technology race, the fiercest competition is not for customers or capital but for talent.






