AI Research
Researchers using AI for weather forecasting
Weather forecasting is not easy. The truth is that predicting future weather conditions over broad, or even narrow, swaths of Earth’s surface comes down to complex microphysical processes, and as College of Engineering Associate Professor and UConn Atmospheric and Air Quality Modeling Group Leader Marina Astitha puts it, nature is chaotic.
Astitha and her research group are at the forefront of exploring ways to improve weather prediction, using AI and machine learning to enhance existing physics-based models. They developed new methods for predicting snowfall accumulation and wind gusts associated with extreme weather events in three recent papers: two published in the Journal of Hydrology and one in Artificial Intelligence for the Earth Systems.
Postdoctoral researcher Ummul Khaira Ph.D. ’24 led the snowfall prediction work during her time as a Ph.D. student. Ph.D. candidate Israt Jahan is passionate about building models that improve predictions of damaging wind gusts from storms.
The researchers met with UConn Today to discuss the importance and everyday applications of enhanced forecasting capabilities using these new technologies.
Are there forecasting challenges that are unique to the Northeast?
Astitha: The Northeast has characteristics that make it particularly difficult to forecast for. This is especially true for winter weather because we have Nor’easters that can come from either the center of the country or from the Gulf. Some move slowly and are highly predictable. Others can be what we call a bomb cyclone, where they rush up here and dump a lot of snow in a short amount of time.
For weather forecasting, we traditionally use numerical weather prediction models that are based on physics principles and have seen large improvements over the last 20–30 years. We have been running our own weather forecasting system at UConn since 2014, based on physical models. However, numerical weather prediction comes with its own challenges due to uncertainty in parameterizations that are necessary when no physical laws are known for a specific process.
For windstorms, wind gusts specifically are a complicated variable. It’s wind, but the way we observe it and the way we model it are different.
Can you explain more about the physics used in numerical weather prediction models?
Astitha: Precipitation is a microscale process. As air rises and cools, clouds form, and within those clouds, tiny cloud droplets develop through complex microphysical interactions. Over time, some of these droplets grow large enough to become raindrops or snowflakes. Once they reach a critical size, gravity causes them to fall to the ground as precipitation. This entire process is governed by microphysical processes.
We try to predict such microphysical processes embedded in numerical weather models by solving many equations and parameterizations. These models describe our atmosphere as a 3D grid, dividing it into discrete boxes where we solve equations based on first principles (motion, thermodynamics, and more). This approach poses a major challenge: even with increased resolution, each grid cell represents a large volume of air, with a horizontal footprint of typically one to four square kilometers. Despite efforts to refine the grid, these cells still encompass vast areas, limiting the model’s ability to resolve smaller-scale processes.
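As a toy illustration of the grid-based approach Astitha describes (this is not the researchers’ code, and real models like WRF solve far richer 3D equation sets), one can advance a single field along a one-dimensional row of grid cells with a finite-difference scheme:

```python
# Toy illustration of grid-based numerical prediction: advect a scalar
# field (e.g., a temperature anomaly) along a 1-D row of grid cells
# using a first-order upwind finite-difference scheme. The structure --
# discretize space into boxes, then step the state forward in time --
# mirrors what full 3-D atmospheric models do at much greater complexity.

def advect(field, wind_speed, dx, dt, steps):
    """Advance a 1-D scalar field under a constant positive wind."""
    c = wind_speed * dt / dx          # Courant number; keep <= 1 for stability
    state = list(field)
    for _ in range(steps):
        new = state[:]
        for i in range(1, len(state)):
            # Upwind difference: information flows in from the left.
            new[i] = state[i] - c * (state[i] - state[i - 1])
        state = new
    return state

# A warm anomaly in the middle of the domain drifts downwind over time.
initial = [0.0] * 10
initial[3] = 1.0
result = advect(initial, wind_speed=10.0, dx=1000.0, dt=50.0, steps=4)
```

The coarse grid spacing (`dx`) here is the same limitation Astitha notes: anything happening inside a single box is invisible to the scheme and must be parameterized instead.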
Numerical prediction is what got me here. Twenty years ago, I could run code to numerically solve the physics equations of the atmosphere, and then I could tell approximately what the weather would be like the next day. That, to me, was mind-blowing!
Once you run one deterministic model, you get one answer: the temperature is going to be, say, 75 degrees tomorrow in Storrs. That’s one potential realization of the future. Models like that are not capable of giving us an exact answer, because nature is chaotic. I’ve always had the mindset of looking at multiple models to get an idea of that uncertainty and variability, and if 10 different realizations give you 74, 75, or 76 degrees, you know you’re close.
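The “multiple realizations” idea Astitha describes is the essence of ensemble forecasting. A minimal sketch, using made-up temperatures rather than actual model output:

```python
# Ensemble forecasting in miniature: several model realizations of
# tomorrow's temperature are summarized by their mean (best estimate)
# and spread (a measure of uncertainty). The numbers are illustrative.
from statistics import mean, stdev

realizations = [74.2, 75.0, 75.6, 74.8, 76.1, 75.3, 74.5, 75.9, 75.1, 74.9]

best_estimate = mean(realizations)
spread = stdev(realizations)

# A tight spread means the members agree and confidence is high;
# a wide spread signals a chaotic, hard-to-pin-down situation.
print(f"forecast: {best_estimate:.1f} F +/- {spread:.1f} F")
```

When the realizations cluster within a degree or two, as in Astitha’s 74/75/76 example, the forecaster knows the prediction is robust; a wide spread is itself useful information.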
Khaira: Few things are more humbling than a snowfall that defies prediction. My work lies in embracing that uncertainty in the chaos and building models not to promise perfection, but to offer communities and decision-makers a clearer window into what might lie ahead.
How is your recent research helping with the challenges of numerical weather prediction?
Astitha: Imagine a Nor’easter coming our way during wintertime; they come with a lot of snow and wind. We work with the Eversource Energy Center and we’re interested not only in the scientific advancement, but also the impact and accuracy in predicting when and where that storm is going to happen in Connecticut. Weather prediction accuracy influences the estimation of impacts; for example, power outages. We might underestimate or overestimate the impact by a lot. That makes winter storms of particular interest because of the impact they have on our society, our transportation networks, and electrical power distribution networks.
Five years ago, we decided to test whether a machine learning framework could help with wind gust and snowfall prediction. It comes with its own challenges and uncertainties, but we quickly saw a lot of promise for these tools to correct errors and do better than numerical weather prediction alone, at a fraction of the computation time. Machine learning and AI can help improve the analysis of wind gusts and snowfall, but these systems are not perfect either.
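The error-correction idea can be sketched with a deliberately simple stand-in for the group’s actual approach: learn a statistical mapping from past raw forecasts toward observations, then apply it to new forecasts. The published papers integrate many WRF variables with gradient-boosted trees (XGBoost); the least-squares linear correction and the gust numbers below are illustrative assumptions, not their method or data.

```python
# Sketch of ML post-processing of a physics-based forecast: learn a
# correction from past (raw forecast, observation) pairs, then apply
# it to a new raw forecast. A simple least-squares linear fit stands
# in for the gradient-boosted models used in the actual research.

def fit_linear_correction(raw, observed):
    """Fit observed ~= a * raw + b by ordinary least squares."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(observed) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(raw, observed))
    var = sum((x - mx) ** 2 for x in raw)
    a = cov / var
    b = my - a * mx
    return a, b

# Illustrative data: the raw model systematically over-predicts gusts.
raw_gusts = [20.0, 30.0, 40.0, 50.0, 60.0]   # model output (mph)
obs_gusts = [16.0, 24.0, 32.0, 40.0, 48.0]   # measured gusts (mph)

a, b = fit_linear_correction(raw_gusts, obs_gusts)
corrected = a * 55.0 + b   # correct a new raw forecast of 55 mph
```

Because the correction is learned once and then applied with a handful of arithmetic operations, it runs in a fraction of the time a full physics simulation takes, which is the appeal Astitha describes.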
We want to be able to better predict storms over Connecticut and the Northeast U.S., which is why we started this exploration with ML/AI. Most of the published research on implementing AI in weather prediction is either at the global scale or at much coarser resolution, but we’re getting there.
Can you talk about the everyday impact of the research?
Astitha: An example is when the trees are full of leaves like they are in late spring and summer, and a storm comes in with a lot of rain and intense wind. Whole trees can come down and topple the power lines, which causes many disruptions around the state.
Our close collaboration with the Eversource Energy Center involves our immediate collaborators taking this weather prediction information and operationally predicting power outages for Connecticut and other service territories. That information goes to the utility managers so they can prepare two to three days in advance: a direct link from science and engineering to the application and to the decision-maker.
I understand people’s frustrations and the need for answers about weather forecasts and impacts of storms. You want to know if your family is going to be safe and if you should or should not be out during particular times of the day. We’re doing this research to improve the reliability and accuracy of weather forecasting, so communities and stakeholders are aware of what’s happening when the storm hits their area and can take appropriate actions.
Jahan: It’s incredibly rewarding to know that my work has the potential to improve early warnings and give communities more time to prepare. By combining AI and uncertainty analysis, we’re not just making gust predictions more accurate—we are helping decision-makers plan with greater confidence.
More information:
Ummul Khaira et al, Investigating the role of temporal resolution and multi-model ensemble data on WRF/XGB integrated snowfall prediction for the Northeast United States, Journal of Hydrology (2025). DOI: 10.1016/j.jhydrol.2025.133313
Israt Jahan et al, Storm Gust Prediction with the Integration of Machine Learning Algorithms and WRF Model Variables for the Northeast United States, Artificial Intelligence for the Earth Systems (2024). DOI: 10.1175/AIES-D-23-0047.1
Ummul Khaira et al, Integrating physics-based WRF atmospheric variables and machine learning algorithms to predict snowfall accumulation in Northeast United States, Journal of Hydrology (2024). DOI: 10.1016/j.jhydrol.2024.132113
Provided by
University of Connecticut
Citation:
Researchers using AI for weather forecasting (2025, July 8)
retrieved 8 July 2025
from https://phys.org/news/2025-07-ai-weather.html
AI Research
Accelerating discovery: The NVIDIA H200 and the transformation of university research
The global research landscape is undergoing a seismic shift. Universities worldwide are deploying NVIDIA’s H200 Tensor Core GPUs to power next-generation AI Factories, SuperPODs, and sovereign cloud platforms. This isn’t a theoretical pivot; it’s a real-time transformation redefining what’s possible in scientific discovery, medicine, climate analysis, and advanced education delivery.
The H200 is the most powerful GPU currently available to academia, delivering the performance required to train foundational models, run real-time inference at scale, and enable collaborative AI research across institutions. And with NVIDIA’s Blackwell-based B200 on the horizon, universities investing in H200 infrastructure today are setting themselves up to seamlessly adopt future architectures tomorrow.
Universities powering the AI revolution
This pivotal shift isn’t a future promise but a present reality. Forward-thinking institutions worldwide are already integrating the H200 into their research ecosystems.
Institutions leading the charge include:
- Oregon State University and Georgia Tech in the US, deploying DGX H200 and HGX clusters.
- Taiwan’s NYCU and University of Tokyo, pushing high-performance computing boundaries with DGX and GH200-powered systems.
- Seoul National University, gaining access to a GPU network of over 4,000 H200 units.
- Eindhoven University of Technology in the Netherlands, preparing to adopt DGX B200 infrastructure.
In Taiwan, national programs like NCHC are also investing in HGX H200 supercomputing capacity, making cutting-edge AI infrastructure accessible to researchers at scale.
Closer to home, La Trobe University is the first in Australia to deploy NVIDIA DGX H200 systems. This investment underpins the creation of ACAMI — the Australian Centre for Artificial Intelligence in Medical Innovation — a world-first initiative focused on AI-powered immunotherapies, med-tech, and cancer vaccine development.
It’s a leap that’s not only bolstering research output and commercial partnerships but also positioning La Trobe as a national leader in AI education and responsible deployment.
Universities like La Trobe are establishing themselves as part of a growing global network of AI research precincts, from Princeton’s open generative AI initiative to Denmark’s national AI supercomputer, Gefion. The question for others is no longer “if”, but “how fast?”
Redefining the campus: How H200 AI infrastructure transforms every discipline
The H200 isn’t just for computer science. Its power is unlocking breakthroughs across:
- Climate science: hyper-accurate modelling for mitigation and prediction
- Medical research: from genomics to diagnostics to drug discovery
- Engineering and material sciences: AI-optimised simulations at massive scale
- Law and digital ethics: advancing policy frameworks for responsible AI use
- Indigenous language preservation: advanced linguistic analysis and voice synthesis
- Adaptive education: AI-driven, personalised learning pathways
- Economic modelling: dynamic forecasts and decision support
- Civic AI: real-time, data-informed public service improvements
AI infrastructure is now central to the entire university mission — from discovery and education to innovation and societal impact.
Positioning Australia in the global AI race
La Trobe’s deployment is more than a research milestone — it supports the national imperative to build sovereign AI capability. Australian companies like Sharon AI and ResetData are also deploying sovereign H200 superclusters, now accessible to universities via cloud or direct partnerships.
Universities that move early unlock more than infrastructure. They strengthen research impact, gain eligibility for key AI grants, and help shape Australia’s leadership on the global AI stage.
NEXTDC’s indispensable role: The foundation for AI innovation
Behind many of these deployments is NEXTDC, Australia’s data centre leader and enabler of sovereign, scalable, and sustainable AI infrastructure.
NEXTDC is already:
- Hosting Sharon AI’s H200 supercluster in Melbourne in a high-density, DGX-certified, liquid-cooled facility
- Delivering ultra-low latency connectivity via the AXON fabric — essential for orchestrating federated learning, distributed training, and multi-institutional research
- Offering rack-ready infrastructure for up to 600kW+, with liquid and immersion cooling on the roadmap
- Enabling cross-border collaboration with facilities across every Australian capital and proximity to international subsea cable landings
The cost of inaction: Why delay is not an option in the AI race
The global AI race is accelerating fast, and for university leaders, the risk of falling behind is real and immediate. Hesitation in deploying advanced AI infrastructure could lead to lasting disadvantages across five critical areas:
- Grant competitiveness: Top-tier research funding increasingly requires access to state-of-the-art AI compute platforms.
- Research rankings: Leading publication output and global standing rely on infrastructure that enables high-throughput, data-intensive AI research.
- Talent attraction: Students want practical experience with cutting-edge tools. Institutions that can’t provide this will struggle to attract top talent.
- Faculty recruitment: The best AI researchers will favour universities with robust infrastructure that supports their work.
- Innovation and commercialisation: Without high-performance GPUs, universities risk slowing their ability to generate start-ups, patents, and economic returns.
Global counterparts are already deploying H100/H200 infrastructure and launching sovereign AI programs. The infrastructure gap is widening fast.
Now is the time to act—lead, don’t lag.
The universities that invest today won’t just stay competitive. They’ll define the future of AI research and discovery.
What this means for your institution
For Chancellors, Deans, CTOs and CDOs, the message is clear: the global AI race is accelerating. Delay means risking:
- Lower grant competitiveness
- Declining global research rankings
- Talent loss among students and faculty
- Missed innovation and commercialisation opportunities
The infrastructure gap is widening — and it won’t wait.
Ready to lead?
The universities that act now will shape the future. Whether it’s training trillion-parameter LLMs, powering breakthrough medical research, or leading sovereign AI initiatives, H200-grade infrastructure is the foundation.
NEXTDC is here to help you build it.
AI Research
Avalara unveils AI assistant Avi to simplify complex tax research
Avalara has announced the launch of Avi for Tax Research, a generative AI assistant embedded within Avalara Tax Research (ATR), aimed at supporting tax and trade professionals with immediate, reliable responses to complex tax law queries.
Avi for Tax Research draws on Avalara’s extensive library of tax content to provide users with rapid, comprehensive answers regarding the tax status of products, audit risk, and precise sales tax rates for specific addresses.
Capabilities outlined
The AI assistant offers several features to advance the workflow of tax and trade professionals.
Among its core capabilities, Avi for Tax Research allows users to instantly verify the taxability of products and services through straightforward queries. The tool delivers responses referencing Avalara’s comprehensive tax database, aiming to ensure both speed and reliability in answering enquiries.
Additional support includes access to up-to-date official guidance to help mitigate audit risks and reinforce defensible tax positions. By providing real-time insights, professionals can proactively adapt to changes in tax regulations without needing to perform extensive manual research.
For businesses operating across multiple locations, Avi for Tax Research enables the generation of precise, rooftop-level sales tax rates tailored to individual street addresses, which can improve compliance accuracy to the level of local jurisdiction requirements.
Designed for ease of use
The assistant is built with an intuitive conversational interface intended to be accessible to professionals across departments, including those lacking a formal tax background.
According to Avalara, this functionality should help improve operational efficiency and collaboration by reducing the skills barrier usually associated with tax research.
Avalara’s EVP and Chief Technology Officer, Danny Fields, described the new capabilities in the context of broader industry trends.
“The tax compliance industry is at the dawn of unprecedented innovation driven by rapid advancements in AI,” said Danny Fields, EVP and Chief Technology Officer of Avalara. “Avalara’s technology mission is to equip customers with reliable, intuitive tools that simplify their work and accelerate business outcomes.”
The company attributes Avi’s capabilities to its two decades of tax and compliance experience, which inform the AI’s underlying content and context-specific decision making. By making use of Avalara’s metadata, the solution is intended to shorten the time spent on manual analysis, offering instant and trusted answers to user questions and potentially allowing compliance teams to allocate more time to business priorities.
Deployment and access
The tool is available immediately to existing ATR customers without additional setup.
New customers have the opportunity to explore Avi for Tax Research through a free trial, which Avalara states is designed to reduce manual effort and deliver actionable information for tax research. Customers can use the AI assistant to submit tax compliance research questions and receive instant responses tailored to their requirements.
Avalara delivers technology aimed at supporting over 43,000 business and government customers across more than 75 countries, providing tax compliance solutions that integrate with leading eCommerce, ERP, and billing systems.
The release of Avi for Tax Research follows continued developments in AI applications for business compliance functions, reflecting the increasing demand for automation and accuracy in global tax and trade environments.
AI Research
Tenable Research Warns of Critical AI Tool Vulnerability That Requires Immediate Attention [CVE-2025-49596]
GUEST RESEARCH: Tenable Research has identified a critical remote code execution vulnerability (CVE-2025-49596) in Anthropic’s widely adopted MCP Inspector, an open-source tool crucial for AI development. With a CVSS score of 9.4, this flaw leverages default, insecure configurations, leaving organisations exposed by design. MCP Inspector is a popular tool with over 38,000 weekly downloads on npmjs and more than 4,000 stars on GitHub.
Exploitation is alarmingly simple. A visit to a malicious website can fully compromise a workstation, requiring no further user interaction. Attackers can gain persistent access, steal sensitive data, including credentials and intellectual property, and enable lateral movement or deploy malware.
“Immediate action is non-negotiable”, says Rémy Marot, Staff Research Engineer at Tenable. “Security teams and developers should upgrade MCP Inspector to version 0.14.1 or later. This update enforces authentication, binds services to localhost, and restricts trusted origins, closing critical attack vectors. Prioritise robust security policies before deploying AI tools to mitigate these inherent risks.”
For in-depth information about this research, please refer to the detailed blog post published by Tenable’s Research Team.