

Energy-Efficient NPU Technology Cuts AI Power Use by 44%



Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed energy-efficient NPU technology that demonstrates substantial performance improvements in laboratory testing. 

Their specialised AI chip ran AI models 60% faster while using 44% less electricity than the graphics cards currently powering most AI systems, based on results from controlled experiments. 

The research, led by Professor Jongse Park from KAIST’s School of Computing in collaboration with HyperAccel Inc., addresses one of the most pressing challenges in modern AI infrastructure: the enormous energy and hardware requirements of large-scale generative AI models.

Current systems such as OpenAI’s GPT-4 and Google’s Gemini 2.5 demand not only high memory bandwidth but also substantial memory capacity, driving companies like Microsoft and Google to purchase hundreds of thousands of NVIDIA GPUs.

The memory bottleneck challenge

The core innovation lies in the team’s approach to solving the memory bottleneck issues that plague existing AI infrastructure. Their energy-efficient NPU technology focuses on making the inference process lightweight while minimising accuracy loss—a critical balance that has proven challenging for previous solutions.

PhD student Minsu Kim and Dr Seongmin Hong from HyperAccel Inc., serving as co-first authors, presented their findings at the 2025 International Symposium on Computer Architecture (ISCA 2025) in Tokyo. The research paper, titled “Oaken: Fast and Efficient LLM Serving with Online-Offline Hybrid KV Cache Quantization,” details their comprehensive approach to the problem.

The technology centres on KV cache quantisation, which the researchers identify as accounting for most memory usage in generative AI systems. By optimising this component, the team enables the same level of AI infrastructure performance using fewer NPU devices compared to traditional GPU-based systems.
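To make the idea concrete, here is a minimal sketch of KV cache quantisation in Python. It is purely illustrative and not the Oaken implementation: the bit-width, scaling scheme, and NumPy-based layout are assumptions chosen for readability. The point is that storing keys and values as low-bit integers plus a scale factor shrinks the cache roughly fourfold versus float16 at 4-bit precision.

```python
# Illustrative KV cache quantisation sketch (assumed parameters; not Oaken's design).
import numpy as np

def quantise_kv(tensor: np.ndarray, bits: int = 4):
    """Symmetric uniform quantisation of a K or V tensor to `bits`-wide integers."""
    qmax = 2 ** (bits - 1) - 1                  # e.g. 7 for signed 4-bit
    scale = np.abs(tensor).max() / qmax + 1e-8  # per-tensor scale factor
    q = np.clip(np.round(tensor / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantise_kv(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float tensor for use in attention."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    kv = np.random.randn(32, 128).astype(np.float32)   # toy K (or V) block
    q, scale = quantise_kv(kv)
    err = np.abs(kv - dequantise_kv(q, scale)).mean()
    print(f"mean abs reconstruction error: {err:.4f}")
```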

Technical innovation and architecture

The KAIST team’s energy-efficient NPU technology employs a three-pronged quantisation algorithm: threshold-based online-offline hybrid quantisation, group-shift quantisation, and fused dense-and-sparse encoding. This approach allows the system to integrate with existing memory interfaces without requiring changes to operational logic in current NPU architectures.
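The paper’s exact algorithms are not reproduced in this article, but the general idea behind a threshold-based dense-and-sparse split can be sketched as follows. In this hedged example, values below a threshold are quantised densely at low precision while the rare outliers above it are kept in a sparse, higher-precision side structure; the threshold, bit-widths, and data layout are assumptions for illustration only.

```python
# Hedged sketch of a threshold-based dense/sparse split (illustrative assumptions only).
import numpy as np

def split_dense_sparse(tensor: np.ndarray, threshold: float, bits: int = 4):
    outlier_mask = np.abs(tensor) > threshold
    # Dense part: inliers quantised to low-bit integers with a shared scale.
    inliers = np.where(outlier_mask, 0.0, tensor)
    qmax = 2 ** (bits - 1) - 1
    scale = threshold / qmax
    dense_q = np.clip(np.round(inliers / scale), -qmax - 1, qmax).astype(np.int8)
    # Sparse part: (index, value) pairs for outliers, kept at higher precision.
    sparse_idx = np.flatnonzero(outlier_mask)
    sparse_val = tensor.flat[sparse_idx].astype(np.float16)
    return dense_q, scale, sparse_idx, sparse_val

def reconstruct(dense_q, scale, sparse_idx, sparse_val, shape):
    out = dense_q.astype(np.float32) * scale
    out.flat[sparse_idx] = sparse_val.astype(np.float32)  # restore outliers
    return out.reshape(shape)

if __name__ == "__main__":
    kv = np.random.randn(64, 128).astype(np.float32)
    parts = split_dense_sparse(kv, threshold=2.5)
    rec = reconstruct(*parts, shape=kv.shape)
    print("outliers kept sparse:", parts[2].size,
          "| mean abs error:", float(np.abs(kv - rec).mean()))
```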

The hardware architecture incorporates page-level memory management techniques for efficient utilisation of limited memory bandwidth and capacity. Additionally, the team introduced new encoding techniques specifically optimised for quantised KV cache, addressing the unique requirements of their approach.
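Page-level KV cache management can be illustrated with a simple allocator sketch. The page size, class names, and pool-based design below are assumptions made for clarity rather than details of the KAIST hardware; the idea is that each sequence’s cache grows in fixed-size pages drawn from a shared pool instead of reserving its worst-case capacity up front.

```python
# Illustrative page-level KV cache allocator (assumed page size and structure).
from collections import defaultdict

PAGE_TOKENS = 16  # tokens stored per page (assumed for illustration)

class PagedKVCache:
    def __init__(self, total_pages: int):
        self.free_pages = list(range(total_pages))
        self.page_table = defaultdict(list)   # sequence id -> list of page ids
        self.tokens = defaultdict(int)        # sequence id -> tokens stored

    def append_token(self, seq_id: int) -> int:
        """Record one more token for `seq_id`, allocating a new page when needed."""
        if self.tokens[seq_id] % PAGE_TOKENS == 0:
            if not self.free_pages:
                raise MemoryError("KV cache pool exhausted")
            self.page_table[seq_id].append(self.free_pages.pop())
        self.tokens[seq_id] += 1
        return self.page_table[seq_id][-1]    # page that holds this token

    def free_sequence(self, seq_id: int) -> None:
        """Return a finished sequence's pages to the shared pool."""
        self.free_pages.extend(self.page_table.pop(seq_id, []))
        self.tokens.pop(seq_id, None)

if __name__ == "__main__":
    cache = PagedKVCache(total_pages=8)
    for _ in range(40):                       # decode 40 tokens for sequence 0
        cache.append_token(0)
    print("pages used:", len(cache.page_table[0]))  # -> 3
```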

“This research, through joint work with HyperAccel Inc., found a solution in generative AI inference light-weighting algorithms and succeeded in developing a core NPU technology that can solve the memory problem,” Professor Park explained. 

“Through this technology, we implemented an NPU with over 60% improved performance compared to the latest GPUs by combining quantisation techniques that reduce memory requirements while maintaining inference accuracy.”

Sustainability implications

The environmental impact of AI infrastructure has become a growing concern as generative AI adoption accelerates. The energy-efficient NPU technology developed by KAIST offers a potential path toward more sustainable AI operations. 

Because the technology consumes 44% less power than current GPU solutions, widespread adoption could significantly reduce the carbon footprint of AI cloud services. However, the technology’s real-world impact will depend on several factors, including manufacturing scalability, cost-effectiveness, and industry adoption rates.

The researchers acknowledge that their solution represents a significant step forward, but widespread implementation will require continued development and industry collaboration.

Industry context and future outlook

The timing of this energy-efficient NPU technology breakthrough is particularly relevant as AI companies face increasing pressure to balance performance with sustainability. The current GPU-dominated market has created supply chain constraints and elevated costs, making alternative solutions increasingly attractive.

Professor Park noted that the technology “has demonstrated the possibility of implementing high-performance, low-power infrastructure specialised for generative AI, and is expected to play a key role not only in AI cloud data centres but also in the AI transformation (AX) environment represented by dynamic, executable AI such as agentic AI.”

The research represents a significant step toward more sustainable AI infrastructure, but its ultimate impact will be determined by how effectively it can be scaled and deployed in commercial environments. As the AI industry continues to grapple with energy consumption concerns, innovations like KAIST’s energy-efficient NPU technology offer hope for a more sustainable future in artificial intelligence computing.

(Photo by Korea Advanced Institute of Science and Technology)

See also: The 6 practices that ensure more sustainable data centre operations



RRC getting real with artificial intelligence – Winnipeg Free Press


Red River College Polytechnic is offering crash courses in generative artificial intelligence to help classroom teachers get more comfortable with the technology.

Foundations of Generative AI in Education, a microcredential that takes 15 hours to complete, gives participants guidance to explore AI tools and encourage ethical and effective use of them in schools.

Tyler Steiner was tasked with creating the program in 2023, shortly after the release of ChatGPT — a chatbot that generates human-like replies to prompts within seconds — and numerous copycat programs that have come online since.



(Photo by Mike Deal / Free Press)

Lauren Phillips, an RRC Polytech associate dean, said it’s important students know when they can use AI.

“There’s no putting that genie back in the bottle,” said Steiner, a curriculum developer at the post-secondary institute in Winnipeg.

While noting teachers can “lock and block” via pen-and-paper tests and essays, the reality is students are using GenAI outside school and authentic experiential learning should reflect the real world, he said.

Steiner’s advice?

Introduce it with the caveat students should withhold personal information from prompts to protect their privacy, analyze answers for bias and “hallucinations” (false or misleading information) and be wary of over-reliance on technology.

RRC Polytech piloted its first GenAI microcredential little more than a year ago. A total of 109 completion badges have been issued to date.

The majority of early participants in the training program are faculty members at RRC Polytech. The Winnipeg School Division has also covered the tab for about 20 teachers who’ve expressed interest in upskilling.

“There was a lot of fear when GenAI first launched, but we also saw that it had a ton of power and possibility in education,” said Lauren Phillips, associate dean of RRC Polytech’s school of education, arts and sciences.

Phillips called a microcredential “the perfect tool” to familiarize teachers with GenAI in short order, as it is already rapidly changing the kindergarten to Grade 12 and post-secondary education sectors.

Manitoba teachers have told the Free Press they are using chatbots to plan lessons and brainstorm report card comments, among other tasks.

Students are using them to help with everything from breaking down a complex math equation to creating schedules to manage their time. Others have been caught cutting corners.

Submitted assignments should always disclose when an author has used ChatGPT, Copilot or another tool “as a partner,” Phillips said.

She and Steiner said in separate interviews the key to success is providing students with clear instructions about when they can and cannot use this type of technology.

Business administration instructor Nora Sobel plans to spend much of the summer refreshing course content to incorporate their tips; Sobel recently completed all three GenAI microcredentials available on her campus.

Two new ones — Application of Generative AI in Education and Integration of Generative AI in Education — were added to the roster this spring.

Sobel said it is “overwhelming” to navigate this transformative technology, but it’s important to do so because employers will expect graduates to have the know-how to use these tools properly.

It’s often obvious when a student has used GenAI because their answers are abstract and generic, she said, adding her goal is to release rubrics in 2025-26 with explicit direction surrounding the active rather than passive use of these tools.