
Nebius stock soars on AI infrastructure deal with Microsoft


Shares of Nebius Group soared more than 60% in extended trading on Monday after the provider of Nvidia GPU-based infrastructure for training artificial intelligence models said it has signed an agreement with Microsoft worth up to $19.4 billion over five years.

Nebius, which is based in Amsterdam, will deliver computing resources to Microsoft using a data center in New Jersey, according to a statement. Nebius changed its name from Yandex NV last year after Russian investors bought Yandex’s Russian-language search engine and other assets.

Microsoft is increasingly looking to third parties to cover a shortfall in cloud infrastructure that can handle AI workloads. OpenAI, one of Microsoft’s top Azure cloud customers, has been searching for additional capacity as users ramp up their adoption of ChatGPT. OpenAI’s latest computing supplier is Google.

Microsoft has also turned to CoreWeave for additional AI computing power, and OpenAI has signed a direct agreement with CoreWeave that’s worth billions of dollars.

CoreWeave shares were up about 5% after hours, while Microsoft shares were little changed.

Nebius said it’s looking at financing options to grow faster than it had planned, according to the statement. In a filing with the SEC, the company said that the GPU services will be deployed in multiple tranches this year and next, and that the total contract value through 2031 is $17.4 billion, with Microsoft reserving the option to buy an additional $2 billion worth of services; together, those two figures make up the up-to-$19.4 billion headline value.

Founded in 1989, Nebius said in November that it has opened office space in San Francisco, Dallas and New York as it expands in the U.S. “Growing our presence in the US means we can be closer to our customers and support innovative American AI businesses on their journey into the future,” the company wrote in a blog post at the time.

Prior to the post-market pop, Nebius had already more than doubled in value this year, closing on Monday with a market cap of just over $15 billion.

— CNBC’s Ari Levy contributed to this report.





National Research Platform to Democratize AI Computing for Higher Ed



As higher education adapts to artificial intelligence’s impact, colleges and universities face the challenge of affording the computing power those changes require. The National Research Platform (NRP), a federally funded pilot program, is trying to solve that problem by pooling infrastructure across institutions.

Running large language models or training machine learning systems requires powerful graphics processing units (GPUs) and maintenance by skilled staff, Frank Würthwein, NRP’s executive director and director of the San Diego Supercomputer Center, said. The demand has left institutions either reliant on temporary donations and collaborations with tech companies, or unable to participate at all.

“The moment Google no longer gives it for free, they’re basically stuck,” Würthwein said.


Cloud services like Amazon Web Services and Azure offer these tools, he said, but at a price not every school can afford.

Traditionally, universities have tried to own their own research computing resources, like the supercomputer center at the University of California, San Diego (UCSD). But individual universities rarely operate at a scale that makes acquiring and maintaining those resources cost-effective.

“Almost nobody has the scale to amortize the staff appropriately,” he said.

Even UCSD has struggled to keep its campus cluster affordable. For Würthwein, scaling up is the answer.

“If I serve a million students, I can provide [AI] services for no more than $10 a year per student,” he said. “To me, that’s free, because if you think about in San Diego, $10 is about a beer.”

A NATIONAL APPROACH

NRP adds another option for acquiring AI computing resources through cross-institutional pooling. Built on the earlier Pacific Research Platform, the NRP organizes a distributed computing system called the Nautilus Hypercluster, in which participating institutions contribute access to servers and GPUs they already own.

Würthwein said that while not every college has spare high-end hardware, many research institutions do, and even smaller campuses often have at least a few machines purchased through grants. These can be federated into NRP’s pool, with NRP providing system management, training and support. He said NRP employs a small, skilled staff that automates basic operations, monitors security and provides example curricula to partner institutions so that campuses don’t need local teams for those tasks.

The result is a distributed cloud supercomputer running on community contributions. According to a March 2025 slide presentation by Seungmin Kim, a researcher from the Yonsei University College of Medicine in Korea, the cluster now includes more than 1,400 GPUs, quadruple the initial National Science Foundation-funded purchase, thanks to contributions from participating campuses.
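Nautilus is generally described as a Kubernetes-based cluster, with contributed GPUs exposed to users as schedulable resources. As a rough illustration only, the sketch below uses the Kubernetes Python client to request a single GPU on such a federated cluster; the namespace, container image and job name are hypothetical placeholders, not values provided by NRP.

```python
# Hypothetical sketch: submitting a one-GPU batch job to a federated
# Kubernetes cluster like Nautilus. The namespace, image and names
# are illustrative assumptions, not NRP-issued values.
from kubernetes import client, config

config.load_kube_config()  # uses the kubeconfig credentials issued to the user

container = client.V1Container(
    name="gpu-demo",
    image="nvcr.io/nvidia/pytorch:24.01-py3",  # assumed CUDA-enabled image
    command=["python", "-c",
             "import torch; print(torch.cuda.get_device_name(0))"],
    resources=client.V1ResourceRequirements(
        requests={"nvidia.com/gpu": "1", "cpu": "2", "memory": "8Gi"},
        limits={"nvidia.com/gpu": "1", "cpu": "2", "memory": "8Gi"},
    ),
)

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="gpu-demo"),
    spec=client.V1JobSpec(
        backoff_limit=0,
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(restart_policy="Never",
                                  containers=[container]),
        ),
    ),
)

# The scheduler, not the user, decides which contributed campus GPU runs the job.
client.BatchV1Api().create_namespaced_job(namespace="course-namespace", body=job)
```

In a pooled model like this, where a job lands is the scheduler’s decision, which is what lets idle GPUs at one campus absorb demand from another.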

Since the project’s official launch in March 2023, NRP has onboarded more than 50 colleges and 84 geographic sites, according to Würthwein. NRP’s pilot goal is to reach 100 institutions, but he is already planning for 1,000 colleges after that, which would provide AI access to 1 million students.

To reach these goals, Würthwein said, NRP works with both the IT staff who manage infrastructure and the faculty who shape curriculum. Regional research and education networks, such as California’s CENIC, connect NRP with campus CIOs, while the Academic Data Science Alliance connects it with leaders on the teaching side.

WHAT STUDENTS AND FACULTY SEE

From the user side, the system looks like a one-stop cloud environment. Platforms like JupyterHub and GitLab are preconfigured and ready to use. The platform also hosts collaboration tools for storage, chats and video meetings that are similar to commercial offerings.

Würthwein said the infrastructure is designed so students can log in and run assignments and personalized learning tools that would normally require expensive computing resources.
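To make that concrete, the following is a minimal, hypothetical notebook cell of the kind a student might run in such a preconfigured JupyterHub environment; it assumes PyTorch and an attached Nvidia GPU, neither of which the article specifies.

```python
# Minimal sketch of a student-facing check in a preconfigured JupyterHub
# notebook; assumes PyTorch is installed and a GPU has been allocated.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # A small matrix multiply on the GPU, the core operation behind
    # both model training and LLM inference workloads.
    a = torch.randn(4096, 4096, device="cuda")
    b = torch.randn(4096, 4096, device="cuda")
    print("Result shape:", tuple((a @ b).shape))
```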

“At some point … education will be considered subpar if it doesn’t provide that,” he said. “Institutions who have not transitioned to provide education like this, in this individualized fashion for every student, will fundamentally offer a worse product.”

For faculty, the same infrastructure supports research. Classroom usage tends to leave servers idle outside peak times, freeing capacity for faculty projects. NRP’s model expects institutions to own enough resources to cover classroom needs, but anything unused can be pooled nationally. This could allow even teaching-focused colleges with modest resources to offer AI research experiences that were previously out of reach.

According to Kim’s presentation, researchers have used the platform to predict the efficiency of gene editing without lab experimentation and to map and detect wildfire patterns.

The system has already enabled collaboration beyond the San Diego campus. At Sonoma State University, faculty are working with a local vineyard to pair the system with drones, robotics and AI for vineyard management, Würthwein said. The overall goal is to bring AI into classrooms, enhance research and enable industry collaboration at more higher-education institutions.

“To me, that is the perfect trifecta of positive effects,” he said. “This is ultimately what we’re trying to achieve.”






Lenovo research shows that AI investments in healthcare industry soar by 169%



Research from Lenovo reveals that 96% of retail sector AI deployments are meeting or exceeding expectations – outpacing other industries. While finance and healthcare are investing heavily, their results show mixed returns, highlighting sharp differences in how AI is being applied across sectors.

Lenovo’s research points to a sharp rise in AI investment across the retail, healthcare and financial services sectors.

The CIO Playbook 2025, Lenovo’s study of EMEA IT leaders conducted in partnership with IDC, uncovers sharply different attitudes, investment strategies and outcomes across the Healthcare, Retail, and Banking, Financial Services & Insurance (BFSI) industries.

Caution Pays Off for EMEA BFSI and Retail Sectors

Of all the sectors analysed, BFSI stands out for its caution. Potentially reflecting the highly regulated nature of the industry, only 7% of organisations have adopted AI, and just 38% of AI budgets are allocated to Generative AI (GenAI) in 2025 – the lowest share across all sectors surveyed.

While the industry is taking a necessarily measured approach to innovation, the strategy appears to be paying dividends: BFSI companies reported the highest rate of AI projects exceeding expectations (33%), suggesting that when AI is deployed, it’s well-aligned with specific needs and workloads.

A similar pattern is visible in Retail, where 61% of organisations are still in the pilot phase. Despite below-average projected spending growth (97%), the sector reported a remarkable 96% of AI deployments to date either meeting or exceeding expectations, the highest combined satisfaction score among all industries surveyed.

Healthcare: Rapid Investment, Uneven Results

In contrast, the healthcare sector is moving quickly to catch up, planning a 169% increase in AI spending over 2025, the largest increase of any industry. But spend doesn’t directly translate to success. Healthcare currently has the lowest AI adoption rate and the highest proportion of organisations reporting that AI fell short of expectations.

This disconnect suggests that, while the industry is investing heavily, it may lack the internal expertise or strategy needed to implement AI effectively and may require stronger external support and guidance to ensure success.

One Technology, Many Journeys

“These findings confirm that there’s no one-size-fits-all approach to AI,” said Simone Larsson, Head of Enterprise AI, Lenovo. “Whether businesses are looking to take a bold leap with AI, or a more measured step-by-step approach, every industry faces unique challenges and opportunities. Regardless of these factors, identification of business challenges and opportunity areas followed by the development of a robust plan provides a foundation on which to build a successful AI deployment.”

The CIO Playbook 2025 is designed to help IT leaders benchmark their progress and learn from peers across industries and geographies. The report provides actionable insights on AI strategy, infrastructure, and transformation priorities in 2025 and beyond. The full CIO Playbook 2025 report for EMEA can be downloaded here.

The Europe and Middle East CIO Playbook 2025, “It’s Time for AI-nomics”, features research from IDC, commissioned by Lenovo, which surveyed 620 IT decision-makers in nine markets: Denmark, Eastern Europe, France, Germany, Italy, the Middle East, the Netherlands, Spain and the United Kingdom. Fieldwork was conducted in November 2024.

Explore the full EMEA Lenovo AInomics Report here.

 






Augment Raises $85 Million for AI Teammate for Logistics



Augment raised $85 million in a Series A funding round to accelerate the development of its artificial intelligence teammate for logistics, Augie.

The company will use the new capital to hire more than 50 engineers to “push the frontier of agentic AI” and to expand Augie into more logistics workflows for shippers, brokers, carriers and distributors, according to a Sept. 4 press release.

Augie performs tasks in quoting, dispatch, tracking, appointment scheduling, document collection and billing, the release said. It understands the context of every shipment and acts across email, phone, transportation management systems (TMS), portals and chat.

“Logistics runs on millions of decisions—under pressure, across fragmented systems and with too many tabs open,” Augment co-founder and CEO Harish Abbott said in the release. “Augie doesn’t just assist. It takes ownership.”

Augment launched out of stealth five months ago, and the Series A funding brings its total capital raised to $110 million, according to the release.

When announcing the company’s launch in a March 18 blog post, Abbott said Augie does all the tedious work so that staff can focus on more important tasks.

“What exactly does Augie do?” Abbott said in the post. “Augie can read/write documents, respond to emails, make calls and receive calls, log into systems, do data entry and document uploads.”

Augie is now used by dozens of third-party logistics providers and shippers and supports more than $35 billion in freight under management, per the Sept. 4 press release.

Customers have reported a 40% reduction in invoice delays, an eight-day acceleration in billing cycles, 5% or greater gross margin recovery per load and, across all customers, millions of dollars in track and trace payroll savings, the release said.

Jacob Effron, managing director at Redpoint Ventures, which led the funding round, said in the release that Augment is “creating the system of work the logistics industry has always needed.”

“Customers consistently highlight Augment’s speed, deeply collaborative approach and transformative impact on productivity,” Effron said.

In another development in the space, Authentica said Tuesday (Sept. 9) that it launched an AI platform designed to deliver real-time supply chain visibility and automate compliance.

In May, AI logistics software startup Pallet raised $27 million in a Series B funding round.



