AI Research

Wolters Kluwer Achieves Top Five Global Ranking for Excellence in Artificial Intelligence by Chartis Research – Business Wire


AI Research

Accelerating discovery: The NVIDIA H200 and the transformation of university research


The global research landscape is undergoing a seismic shift. Universities worldwide are deploying NVIDIA’s H200 Tensor Core GPUs to power next-generation AI Factories, SuperPODs, and sovereign cloud platforms. This isn’t a theoretical pivot; it’s a real-time transformation redefining what’s possible in scientific discovery, medicine, climate analysis, and advanced education delivery.

The H200 is the most powerful GPU currently available to academia, delivering the performance required to train foundational models, run real-time inference at scale, and enable collaborative AI research across institutions. And with NVIDIA’s Blackwell-based B200 on the horizon, universities investing in H200 infrastructure today are setting themselves up to seamlessly adopt future architectures tomorrow.

Universities powering the AI revolution

This pivotal shift isn’t a future promise but a present reality. Forward-thinking institutions worldwide are already integrating the H200 into their research ecosystems.

Institutions leading the charge include:

  • Oregon State University and Georgia Tech in the US, deploying DGX H200 and HGX clusters.
  • Taiwan’s NYCU and University of Tokyo, pushing high-performance computing boundaries with DGX and GH200-powered systems.
  • Seoul National University, gaining access to a GPU network of over 4,000 H200 units.
  • Eindhoven University of Technology in the Netherlands, preparing to adopt DGX B200 infrastructure.

In Taiwan, the National Center for High-performance Computing (NCHC) is also investing in HGX H200 supercomputing capacity, making cutting-edge AI infrastructure accessible to researchers at scale.

Closer to home, La Trobe University is the first in Australia to deploy NVIDIA DGX H200 systems. This investment underpins the creation of ACAMI — the Australian Centre for Artificial Intelligence in Medical Innovation — a world-first initiative focused on AI-powered immunotherapies, med-tech, and cancer vaccine development.

It’s a leap that’s not only bolstering research output and commercial partnerships but also positioning La Trobe as a national leader in AI education and responsible deployment.

Universities like La Trobe are establishing themselves as part of a growing global network of AI research precincts, from Princeton’s open generative AI initiative to Denmark’s national AI supercomputer, Gefion. The question for others is no longer “if”, but “how fast?”

Redefining the campus: How H200 AI infrastructure transforms every discipline

The H200 isn’t just for computer science. Its power is unlocking breakthroughs across:

  • Climate science: hyper-accurate modelling for mitigation and prediction
  • Medical research: from genomics to diagnostics to drug discovery
  • Engineering and material sciences: AI-optimised simulations at massive scale
  • Law and digital ethics: advancing policy frameworks for responsible AI use
  • Indigenous language preservation: advanced linguistic analysis and voice synthesis
  • Adaptive education: AI-driven, personalised learning pathways
  • Economic modelling: dynamic forecasts and decision support
  • Civic AI: real-time, data-informed public service improvements

AI infrastructure is now central to the entire university mission — from discovery and education to innovation and societal impact.

Positioning Australia in the global AI race

La Trobe’s deployment is more than a research milestone — it supports the national imperative to build sovereign AI capability. Australian companies like Sharon AI and ResetData are also deploying sovereign H200 superclusters, now accessible to universities via cloud or direct partnerships.

Universities that move early unlock more than infrastructure. They strengthen research impact, gain eligibility for key AI grants, and help shape Australia’s leadership on the global AI stage.

NEXTDC’s indispensable role: The foundation for AI innovation

Behind many of these deployments is NEXTDC, Australia’s data centre leader and enabler of sovereign, scalable, and sustainable AI infrastructure.

NEXTDC is already:

  • Hosting Sharon AI’s H200 supercluster in Melbourne in a high-density, DGX-certified, liquid-cooled facility
  • Delivering ultra-low latency connectivity via the AXON fabric — essential for orchestrating federated learning, distributed training, and multi-institutional research
  • Offering rack-ready infrastructure supporting densities of 600kW and beyond, with liquid and immersion cooling on the roadmap
  • Enabling cross-border collaboration with facilities across every Australian capital and proximity to international subsea cable landings
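Federated learning, mentioned above, is what lets institutions train a shared model without moving sensitive data between campuses: each site trains locally and only model weights are exchanged. As a minimal illustrative sketch (not NEXTDC’s or any university’s actual stack), the core aggregation step of FedAvg is just a weighted average of client weights:

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of client model weights (the FedAvg aggregation step)."""
    total = sum(client_sizes)
    stacked = np.stack(client_weights)
    coeffs = np.array(client_sizes, dtype=float) / total
    return np.tensordot(coeffs, stacked, axes=1)

# Three institutions train locally on datasets of different sizes;
# only the weights (never the raw data) leave each site.
w_a = np.array([1.0, 2.0])
w_b = np.array([3.0, 4.0])
w_c = np.array([5.0, 6.0])
global_w = fed_avg([w_a, w_b, w_c], client_sizes=[100, 100, 200])
print(global_w)  # [3.5 4.5]
```

The low-latency fabric matters because this exchange happens every training round, across every participating institution.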

The cost of inaction: Why delay is not an option in the AI race

The global AI race is accelerating fast, and for university leaders, the risk of falling behind is real and immediate. Hesitation in deploying advanced AI infrastructure could lead to lasting disadvantages across five critical areas:

  • Grant competitiveness: Top-tier research funding increasingly requires access to state-of-the-art AI compute platforms.
  • Research rankings: Leading publication output and global standing rely on infrastructure that enables high-throughput, data-intensive AI research.
  • Talent attraction: Students want practical experience with cutting-edge tools. Institutions that can’t provide this will struggle to attract top talent.
  • Faculty recruitment: The best AI researchers will favour universities with robust infrastructure that supports their work.
  • Innovation and commercialisation: Without high-performance GPUs, universities risk slowing their ability to generate start-ups, patents, and economic returns.

Global counterparts are already deploying H100/H200 infrastructure and launching sovereign AI programs. The infrastructure gap is widening fast.

Now is the time to act: lead, don’t lag. The universities that invest today won’t just stay competitive. They’ll define the future of AI research and discovery.


What this means for your institution

For Chancellors, Deans, CTOs and CDOs, the message is clear: the global AI race is accelerating. Delay means risking:

  • Lower grant competitiveness
  • Declining global research rankings
  • Talent loss among students and faculty
  • Missed innovation and commercialisation opportunities

The infrastructure gap is widening — and it won’t wait.

Ready to lead?

The universities that act now will shape the future. Whether it’s training trillion-parameter LLMs, powering breakthrough medical research, or leading sovereign AI initiatives, H200-grade infrastructure is the foundation.

NEXTDC is here to help you build it.






AI Research

Defining the AI research powerhouse: A strategic imperative for universities



In today’s AI-driven world, universities must decide whether to passively observe or actively lead. The ambition to develop large language models, conduct data-intensive research, and accelerate innovation depends on one thing: advanced AI infrastructure.

This infrastructure isn’t just a technical asset. It’s your university’s AI research powerhouse: an environment built to process trillions of tokens, power cutting-edge simulations, and unlock next-generation breakthroughs.

What defines a university’s AI research powerhouse?

A true powerhouse supports high-density GPU clusters built on accelerators such as NVIDIA’s H100 and H200, and scales to support AI Factory and SuperPOD deployments. Key characteristics include:

  • Compute performance: H200 GPUs deliver up to double the inference speed of the H100 on large LLMs, with 141GB of HBM3e memory and 4.8 TB/s of bandwidth.
  • Data movement: High-bandwidth, low-latency networking via NVLink and InfiniBand eliminates bottlenecks.
  • Massive storage: Petabyte-scale access to training datasets.
  • AI-optimised software: Tools and frameworks to accelerate time-to-discovery.
  • Scalability: Flexible environments that grow with research needs.
  • Specialist support: Operational excellence from dedicated AI infrastructure experts.

This isn’t just a lab upgrade. It’s the engine room for academic leadership in the AI era.
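To see why the 141GB of HBM3e cited above matters, a back-of-the-envelope calculation shows which model sizes fit on a single GPU. This is an illustrative sketch only: it counts model weights at FP16/BF16 precision and ignores activations, optimizer state, and KV cache, which add substantial overhead in practice.

```python
def weights_memory_gb(n_params_billions, bytes_per_param=2):
    """Memory for model weights alone, in GB (FP16/BF16 = 2 bytes per parameter)."""
    return n_params_billions * bytes_per_param  # 1e9 params * bytes / 1e9 = GB

H200_HBM_GB = 141  # per-GPU HBM3e capacity

for params_b in (7, 70, 175):
    need = weights_memory_gb(params_b)
    verdict = "weights fit on one GPU" if need <= H200_HBM_GB else "needs multiple GPUs"
    print(f"{params_b}B params -> {need:.0f} GB of weights: {verdict}")
```

The arithmetic explains why a 70B-parameter model is roughly the single-GPU ceiling for inference, and why frontier-scale training is inherently a multi-GPU, high-bandwidth-interconnect problem.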

The peril of doing it alone: Why building on-premises is often a roadblock

While on-premises builds may seem ideal for control, they pose serious barriers:

  • CAPEX burden: SuperPOD-scale builds can cost tens to hundreds of millions.
  • Extreme power requirements: Each rack can demand over 50kW.
  • Cooling complexity: Traditional air cooling often fails at scale.
  • Staffing shortages: HPC-specialist recruitment is highly competitive.
  • Sustainability pressure: Dense clusters challenge campus energy goals.

Delays in hardware delivery due to global supply constraints only amplify the risks. Ultimately, building alone can stall discovery, limit talent attraction, and divert institutional focus from mission-critical outcomes.


In an AI-led era, infrastructure is no longer just technical: it’s a strategic enabler. Universities aiming to lead in breakthrough research, attract global talent, and secure major grants must prioritise AI infrastructure now. As La Trobe University has shown, combining NVIDIA DGX H200 systems with purpose-built environments unlocks new levels of discovery.

However, building this alone is costly and complex. That’s why strategic colocation with trusted partners is the smart path forward.

Ready to accelerate your university’s AI leadership?

Partnering with a specialist like NEXTDC for your H100/H200 deployments lets your university scale research faster without infrastructure burdens. Key benefits include:

  • NVIDIA DGX Certification for optimal performance and reliability of DGX SuperPODs and AI Factories
  • CapEx to OpEx shift, reducing upfront costs and long-term total cost of ownership
  • AI-optimised facilities built for high-density H100/H200 workloads with immediate access to power and cooling
  • Scalable, on-demand compute aligned with your research growth
  • Specialist operational support, relieving university IT teams and transferring infrastructure risk
  • Global interconnection via subsea cables and direct links to networks like AARNet
  • Sustainability alignment through energy-efficient, renewably powered facilities
  • Campus space gains by moving backend IT off-site, freeing room for academic priorities

NEXTDC’s DGX-certified data centres are designed for the most intensive AI tasks — from model training to real-time inference. Our infrastructure supports every stage of the AI lifecycle with GPU-optimised power, cooling, and design.

Strategically located near major subsea cable hubs, we offer ultra-low-latency access to Asia-Pacific markets, enabling federated learning and multi-region AI deployment. For Australian universities scaling global AI platforms, this infrastructure is your unseen advantage.

Whether advancing medical research, attracting top AI talent, or building an innovation precinct, your infrastructure must match your ambition. The institutions that invest wisely today will lead the AI breakthroughs of tomorrow.

The future is built now

AI-powered discovery is not a future state — it’s happening now. Institutions that act today will lead tomorrow.

Whether you’re building an AI innovation precinct, training large-scale models, or empowering faculty with sovereign research environments, the infrastructure must match your ambition.





AI Research

Avalara unveils AI assistant Avi to simplify complex tax research



Avalara has announced the launch of Avi for Tax Research, a generative AI assistant embedded within Avalara Tax Research (ATR), aimed at supporting tax and trade professionals with immediate, reliable responses to complex tax law queries.

Avi for Tax Research draws on Avalara’s extensive library of tax content to provide users with rapid, comprehensive answers regarding the tax status of products, audit risk, and precise sales tax rates for specific addresses.

Capabilities outlined

The AI assistant offers several features designed to streamline the workflows of tax and trade professionals.

Among its core capabilities, Avi for Tax Research allows users to instantly verify the taxability of products and services through straightforward queries. The tool delivers responses referencing Avalara’s comprehensive tax database, aiming to ensure both speed and reliability in answering enquiries.

Additional support includes access to up-to-date official guidance to help mitigate audit risks and reinforce defensible tax positions. By providing real-time insights, professionals can proactively adapt to changes in tax regulations without needing to perform extensive manual research.

For businesses operating across multiple locations, Avi for Tax Research enables the generation of precise, rooftop-level sales tax rates tailored to individual street addresses, which can improve compliance accuracy to the level of local jurisdiction requirements.
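Avalara has not published Avi’s programming interface in this article, so purely as an illustrative sketch of why rooftop-level precision matters: a combined sales tax rate is the sum of several jurisdiction-level rates, and neighbouring street addresses can fall into different special districts. All names and rates below are hypothetical.

```python
def total_sales_tax(amount, rates):
    """Sum jurisdiction-level rates (state + county + city + district)
    into the combined rate for one specific address, then apply it."""
    combined = sum(rates.values())
    return round(amount * combined, 2)

# Hypothetical rate breakdown for a single street address; a rooftop-level
# lookup resolves which special districts this exact address falls within.
address_rates = {"state": 0.0625, "county": 0.010, "city": 0.0125, "district": 0.005}
print(total_sales_tax(100.00, address_rates))  # 9.0
```

ZIP-code-level rates can miss the district component entirely, which is the compliance gap address-level resolution closes.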

Designed for ease of use

The assistant is built with an intuitive conversational interface intended to be accessible to professionals across departments, including those lacking a formal tax background.

According to Avalara, this functionality should help improve operational efficiency and collaboration by reducing the skills barrier usually associated with tax research.

Avalara’s EVP and Chief Technology Officer, Danny Fields, described the new capabilities in the context of broader industry trends.

“The tax compliance industry is at the dawn of unprecedented innovation driven by rapid advancements in AI,” said Danny Fields, EVP and Chief Technology Officer of Avalara. “Avalara’s technology mission is to equip customers with reliable, intuitive tools that simplify their work and accelerate business outcomes.”

The company attributes Avi’s capabilities to its two decades of tax and compliance experience, which inform the AI’s underlying content and context-specific decision making. By making use of Avalara’s metadata, the solution is intended to shorten the time spent on manual analysis, offering instant and trusted answers to user questions and potentially allowing compliance teams to allocate more time to business priorities.

Deployment and access

The tool is available immediately to existing ATR customers without additional setup.

New customers have the opportunity to explore Avi for Tax Research through a free trial, which Avalara states is designed to reduce manual effort and deliver actionable information for tax research. Customers can use the AI assistant to submit tax compliance research questions and receive instant responses tailored to their requirements.

Avalara delivers technology aimed at supporting over 43,000 business and government customers across more than 75 countries, providing tax compliance solutions that integrate with leading eCommerce, ERP, and billing systems.

The release of Avi for Tax Research follows continued developments in AI applications for business compliance functions, reflecting the increasing demand for automation and accuracy in global tax and trade environments.


