
AI Research

Pittsburgh Deploys the Future of AI


Carnegie Mellon University — the birthplace of AI — unites research and education to develop AI strategies focused on pioneering safe AI technologies, integrating human-centered design into autonomous systems, and harnessing AI’s potential for societal good.

Carnegie Mellon leaders and faculty joined AI innovators, researchers and industry experts at the 2025 AI Horizons event at Bakery Square in Pittsburgh, Pennsylvania. The event focused on how to move AI from concept to real-world impact in critical fields like health care, manufacturing, defense, finance, robotics and more.


Opening Keynote

The AI Horizons opening keynote, titled “Igniting the AI Deployment Era,” included contributions from Joanna Doven, AI Strike Team executive director and Carnegie Mellon alumna; Zico Kolter, head of CMU’s Machine Learning Department; and Devlin Robinson, a Pennsylvania state senator.

“We’re telling a story here,” Doven said during the day’s introduction. “Pittsburgh is now the nation’s most concentrated AI hub outside of Silicon Valley.”

Kolter then took the stage to speak about the new age of rapid development and deployment of AI, and how research turns into real-world impact. He cited examples of CMU-based research, such as the autonomous vehicle designs that came out of the 2007 DARPA Urban Challenge, whose team members went on to develop Waymo.


“This is not the future. It’s happening right now,” he said.

Kolter then joined Andrew Moore, Lovelace CEO and former dean of CMU’s School of Computer Science, and Alison Snyder, a senior writer for Google DeepMind, for a fireside panel on the deployment of AI tools across society.


Art and AI Intersect: “The Endless Mile”


“The Endless Mile,” an animation project by Johannes DeYoung in collaboration with Annie Hui-Hsin Hsieh, plays on a wall during the summit.

Carnegie Mellon University’s College of Fine Arts also had an established presence at AI Horizons.

CFA faculty members Johannes DeYoung and Annie Hui-Hsin Hsieh showcased their multimedia art project, “The Endless Mile,” an experimental animation projected onto a wall that responds to the audio of a musical composition by Hsieh. The piece was created to complement the day’s themes of exploring human-centered approaches to artificial intelligence, and was previously shown at the LUMA Festival in Binghamton, New York.


Public-Private Powerhouse: Owning Pennsylvania’s AI Moment


From left, Dan Sumner, Robin Vince, Josh Shapiro and Farnam Jahanian during the summit.

Thursday’s final event focused on Pennsylvania’s role as a leader in AI deployment and as a key energy source for future data centers. It featured a conversation among Pennsylvania Gov. Josh Shapiro, BNY CEO Robin Vince and Westinghouse Interim CEO Dan Sumner, moderated by CMU President Farnam Jahanian.

The panel unpacked the role of AI in government efficiency, its impacts on regional workforces and the importance of public-private partnerships working in tandem with higher education to further innovation. 


AI’s Future is Physical

The second day of AI Horizons kicked off with a presentation by Martial Hebert, dean of Carnegie Mellon’s School of Computer Science, on the history of robotics and computer vision. It was followed by a conversation with Skild AI co-founders and CMU faculty members Abhinav Gupta and Deepak Pathak about how AI is making its way into the physical world through robotics and mobility systems.

“We tend to lose track of the fact that the advances that we seek, and the unicorns (startups with valuations of over $1 billion) are the results of very long-term research and very long-term projects,” Hebert said. “Skild AI is the shining example of that, the culmination of that history.”

Pathak shared Skild AI’s efforts to develop a general-use brain for humanoid robots. 

“This model works on any robot, any task, one brain,” Pathak said.

He presented videos of robots putting dishes away, walking up steep hills, and walking up and down stairs holding boxes. “We can also do fun stuff,” he said, showing a video of a robot performing a front flip over a low obstacle.

Gupta said Skild AI’s work reflects what Pittsburgh offers to AI innovation and deployment.

“In Pittsburgh, we don’t just make fancy stuff, we make things that work,” he said. “I feel Pittsburgh brings a huge amount of dense talent — especially CMU — into one place. We have people doing robotics, we have people doing machine learning. It makes you think much bigger. This is tangible.”


Discovery to Defense: AI and Biomanufacturing at the Frontline

As biotechnology converges with AI, researchers can accelerate the pace of discovery and create lifesaving therapies, such as precise medical treatments that can be tailored to specific individuals. 

Barbara Shinn-Cunningham, the Glen de Vries Dean of CMU’s Mellon College of Science, discussed these advances in a conversation alongside ElevateBio Chief Technology Officer Michael Paglia, BioForge Vice President of Engineering Brennan Sellner, National Security Commission on Emerging Biotechnology Senior Policy Adviser Steven Moss, and Pittsburgh Life Sciences Alliance President and CEO Megan Shaw.

“We’re generating so much data that we can’t analyze it by hand the way we used to,” said Shinn-Cunningham. “AI is being used in all sorts of ways. We’re using AI in chemistry to find molecules that are good candidates for drugs. It’s used in every kind of science.”


Health Care Reimagined: Addressing Data Challenges with AI

Later Friday morning, Chenyan Xiong, an associate professor in the Language Technologies Institute, joined several health care technology and business experts to provide a technical perspective on challenges and opportunities for AI in the health industry.

Xiong spoke alongside Reuben Daniel, president of AI at UPMC Health Plan; Deepan Kamaraj, director of analytics and informatics at UPMC Enterprises; and Avi Goldberg, a partner at GreatPoint Ventures. The discussion covered a wide range of topics, from the possible challenges and benefits of regulation to the potential of AI to bolster effectiveness in drug discovery, telehealth and patient care.

“There are a bunch of unique challenges,” Xiong said during the discussion. “A lot of AI research on health is unfortunately constrained or driven by data.”

Among the use cases of AI, Xiong noted, is its potential to utilize all of the structured information created throughout the medical process as doctors work with patients. “It can make the process of putting all of the data together less painful,” he said.




AI Research

100x Faster Brain-Inspired AI Model


In the rapidly evolving field of artificial intelligence, a new contender has emerged from China’s research labs, promising to reshape how we think about energy-efficient computing. The SpikingBrain-7B model, developed by the Brain-Inspired Computing Lab (BICLab) at the Chinese Academy of Sciences, represents a bold departure from traditional large language models. Drawing inspiration from the human brain’s neural firing patterns, this system employs spiking neural networks to achieve remarkable efficiency gains. Unlike conventional transformers that guzzle power, SpikingBrain-7B mimics biological neurons, firing only when necessary, which slashes energy consumption dramatically.
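The energy argument rests on event-driven firing: a spiking neuron produces output only when its accumulated input crosses a threshold, staying silent otherwise. SpikingBrain-7B's actual neuron model is not detailed here, but the principle can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron, a standard textbook model; all parameter values below are illustrative, not taken from the paper.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Simulate a leaky integrate-and-fire neuron over a sequence of input currents.

    The membrane potential accumulates input and decays by a leak factor each
    step; the neuron emits a spike (1) only when the potential crosses the
    threshold, then resets. Sub-threshold input produces no output at all,
    which is the source of the claimed energy savings.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

# Weak inputs leave the neuron silent; only the strong pulse triggers a spike.
print(lif_neuron([0.3, 0.3, 0.3, 0.9, 0.1]))  # → [0, 0, 0, 1, 0]
```

In hardware terms, each `0` in the output is a step where downstream computation can simply be skipped, in contrast to a dense transformer layer that multiplies every activation regardless of magnitude.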

At its core, the model integrates hybrid-linear attention mechanisms and conversion-based training techniques, allowing it to run on domestic MetaX chips without relying on NVIDIA hardware. This innovation addresses a critical bottleneck in AI deployment: the high energy demands of training and inference. According to a technical report published on arXiv, the SpikingBrain series, including the 7B and 76B variants, demonstrates over 100 times faster first-token generation at long sequence lengths, making it ideal for edge devices in industrial control and mobile applications.
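The report does not spell out the hybrid-linear attention design, but the speedup at long sequence lengths is characteristic of linear attention in general: replacing the softmax with a positive feature map lets the key-value product be collapsed into a small fixed-size summary, so cost grows linearly rather than quadratically in sequence length. The sketch below is a generic linear-attention layer (in the style of Katharopoulos et al.'s "transformers are RNNs" formulation), not SpikingBrain's implementation; the feature map and dimensions are illustrative assumptions.

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Linear attention: phi(Q) @ (phi(K)^T @ V), costing O(n * d^2)
    instead of the O(n^2 * d) of softmax attention.

    phi is a simple positive feature map, elu(x) + 1. The (d, d) summary
    phi(K)^T @ V can be accumulated incrementally token by token, which is
    why first-token latency on long prompts stays flat instead of growing
    with the square of sequence length.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qf, Kf = phi(Q), phi(K)
    kv = Kf.T @ V                # (d, d) summary of all keys and values
    z = Kf.sum(axis=0)           # (d,) normalizer
    return (Qf @ kv) / (Qf @ z + eps)[:, None]

n, d = 8, 4                      # toy sequence length and head dimension
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)                 # (8, 4): one output vector per position
```

A "hybrid" design, as the name suggests, would presumably interleave layers like this with other attention variants; the report is the authoritative source for that split.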

Breaking Away from Transformer Dominance

The genesis of SpikingBrain-7B can be traced to BICLab’s GitHub repository, where the open-source code reveals a sophisticated architecture blending spiking neurons with large-scale model training. Researchers at the lab, led by figures like Guoqi Li and Bo Xu, have optimized for non-NVIDIA clusters, overcoming challenges in parallel training and communication overhead. This approach not only enhances stability but also paves the way for neuromorphic hardware that prioritizes energy optimization over raw compute power.

Recent coverage in Xinhua News highlights how SpikingBrain-1.0, the foundational system, breaks from mainstream models like ChatGPT by using spiking networks instead of dense computations. This brain-inspired paradigm allows the model to train on just a fraction of the data typically required—reports suggest as little as 2%—while matching or exceeding transformer performance in benchmarks.

Efficiency Gains and Real-World Applications

Delving deeper, the model’s spiking mechanism enables asynchronous processing, akin to how the brain handles information dynamically. This is detailed in the arXiv report, which outlines a roadmap for next-generation hardware that could integrate seamlessly into sectors like healthcare and transportation. For instance, in robotics, SpikingBrain’s low-power profile supports real-time decision-making without the need for massive data centers.

Posts on X (formerly Twitter) from AI enthusiasts, such as those praising its 100x speedups, reflect growing excitement. Users have noted how the model’s hierarchical processing mirrors neuroscience findings, with emergent brain-like patterns in its structure. This sentiment aligns with broader neuromorphic computing trends, as seen in a Nature Communications Engineering article on advances in robotic vision, where spiking networks enable efficient AI in constrained environments.

Challenges and Future Prospects

Despite its promise, deploying SpikingBrain-7B isn’t without hurdles. The arXiv paper candidly discusses adaptations needed for CUDA and Triton operators in hybrid attention setups, underscoring the technical feats involved. Moreover, training on MetaX clusters required custom optimizations to handle long-sequence topologies, a feat that positions China at the forefront of independent AI innovation amid global chip restrictions.

In industry circles, this development is seen as a catalyst for shifting AI paradigms. A NotebookCheck report emphasizes its potential for up to 100x performance boosts over conventional systems, fueling discussions on sustainable AI. As neuromorphic computing gains traction, SpikingBrain-7B could inspire a wave of brain-mimicking models, reducing the environmental footprint of AI while expanding its reach to everyday devices.

Implications for Global AI Research

Beyond technical specs, the open-sourcing of SpikingBrain-7B via GitHub invites global collaboration, with the repository already garnering attention for its spike-driven transformer implementations. This mirrors earlier BICLab projects like Spike-Driven-Transformer-V2, building a continuum of research toward energy-efficient intelligence.

Looking ahead, experts anticipate integrations with emerging hardware, as outlined in PMC’s coverage of spike-based dynamic computing. With SpikingBrain’s bilingual capabilities and industry validations, it stands as a testament to how bio-inspired designs can democratize AI, challenging Western dominance and fostering a more inclusive technological future.




AI Research

Exclusive | Cyberport may use Chinese GPUs at Hong Kong supercomputing hub to cut reliance on Nvidia



Cyberport may add some graphics processing units (GPUs) made in China to its Artificial Intelligence Supercomputing Centre in Hong Kong, as the government-run incubator seeks to reduce its reliance on Nvidia chips amid worsening China-US relations, its chief executive said.

Cyberport has bought four GPUs made by four different mainland Chinese chipmakers and has been testing them at its AI lab to gauge which ones to adopt in the expanding facilities, Rocky Cheng Chung-ngam said in an interview with the Post on Friday. The park has been weighing the use of Chinese GPUs since it first began installing Nvidia chips last year, he said.

“At that time, China-US relations were already quite strained, so relying solely on [Nvidia] was no longer an option,” Cheng said. “That is why we felt that for any new procurement, we should in any case include some from the mainland.”

Cyberport’s AI supercomputing centre, established in December with its first phase offering 1,300 petaflops of computing power, will deliver another 1,700 petaflops by the end of this year, with all 3,000 petaflops currently relying on Nvidia’s H800 chips, he added.

Cyberport CEO Rocky Cheng Chung-ngam on September 12, 2025. Photo: Jonathan Wong

As all four Chinese solutions offer similar performance, Cyberport would take cost into account when deciding which ones to order, according to Cheng, who declined to name the suppliers.




AI Research

Why do AI chatbots use so much energy?



In recent years, ChatGPT has exploded in popularity, with nearly 200 million users submitting a total of more than a billion prompts to the app every day. These prompts may seem to be answered out of thin air.

But behind the scenes, artificial intelligence (AI) chatbots are using a massive amount of energy. In 2023, data centers, which are used to train and process AI, were responsible for 4.4% of electricity use in the United States. Across the world, these centers make up around 1.5% of global energy consumption. These numbers are expected to skyrocket, at least doubling by 2030 as the demand for AI grows.


