

Data centers for AI could require power equivalent to five Hoover Dams



Across the country, Americans are using the internet at every hour of every day. According to a 2024 Pew Research Center poll, 96% of adults reported using the internet at least occasionally on a mobile device. That number has risen gradually since May 2000, when just 48% reported occasional use. With more people online, energy providers have begun preparing for higher demand for electricity.

“The internet use was overstated, as it turns out, at least in the early going. And then it caught up, and we saw the consumptive use later,” Constellation President and CEO Joseph Dominguez said.

But when it comes to new artificial intelligence, Dominguez says widespread usage happened almost immediately and has expanded faster than the internet boom. White House A.I. and Crypto Czar David Sacks agrees.

“The adoption is faster than any previous technology. It’s faster than the internet, it’s faster than the iPhone. So, it’s being adopted very quickly,” Sacks said. “Still, roughly half the public hasn’t tried it yet.”

Fox News polling shows 57% of registered voters rarely or never use artificial intelligence, while 27% said they use the technology daily. Usage appears to track opinion of the technology: those who saw A.I. as bad for society were less familiar with it, with 77% saying they use it rarely, while those who consider it a good thing use it more regularly (47%). Experts believe A.I. use will only increase.

“OpenAI’s ChatGPT, when they launched, was the fastest-growing adoption of any consumer technology product ever back in November 2022, but that’s a drop in the bucket as to what they have now,” said senior advisor Gregory Allen with the Wadhwani A.I. center at the Center for Strategic and International Studies.

To meet the increasing demand and continue advancing A.I. technology, data centers provide a round-the-clock connection.


A graph of annual energy consumption. (Fox News)

“Running all of these computational resources that modern A.I. needs requires an awful lot of electricity,” Allen said.

A.I. models must be retrained frequently to remain relevant, software requires regular updates, and new data centers need large cooling systems to keep everything running. Allen says the largest A.I. algorithms will require between 1 and 5 gigawatts of electricity to operate.

“One gigawatt is about one Hoover Dam’s worth of electricity. So, imagine five Hoover Dams being used to just power one data center full of one company’s A.I.,” Allen said.

The growing complexity and the need for updated infrastructure have put a strain on available resources.

“Data centers have become very large. So when you think about it, we need land that needs to be zoned. We need to get permits so that we can build these facilities, and we need to bring more electricity,” Microsoft President and Vice Chair Brad Smith said.

Data centers are often clustered in certain areas. According to the Northern Virginia Regional Commission, the area’s 250 facilities handle around 70% of global internet traffic. In areas with high concentrations of data centers, tech companies can face delays in connecting to the grid. Overseas, some countries and localities have placed restrictions on how many data centers can be built. Stateside, Dominguez says, President Donald Trump has taken actions to help speed up some of the permitting processes.

“The executive orders are now cutting through a lot of the red tape, and effectively we’re not required to do things that we were required to in the past,” Dominguez said.


Facebook parent Meta Platforms will invest $800 million in a nearly 1-million-square-foot hyperscale data center in Kansas City, Missouri. (Meta/Kansas City Area Development Council)

Before a nuclear site is built, producers are required to obtain an early site permit that assesses geology and other site conditions to determine whether a new facility can be built there.

“It makes sense if you’ve never built a nuclear reactor in that place before. But in our case, we have existing reactors that have operated in these communities for decades,” Dominguez said. “Currently the NRC regulations require us to go through a laborious exercise that costs about $35 million a pop to verify what we already know and that is that nuclear could go there. As a result of the president’s executive orders, that’s no longer gonna be required.”

Once a nuclear site is up and running, future data centers could also plug directly into the site, with electricity in constant supply.

“It runs like a freight train day or night, winter or summer, regardless of weather condition,” Dominguez said.

Nuclear plants operate at full capacity more of the time than any other energy source, making them a reliable choice for tech companies.


Steam coming out of the Susquehanna Nuclear Power Plant in Salem Township, Pennsylvania. (Fox News)

“Nuclear power is a good source of electricity for A.I. and many other things as well,” Smith said. “In the United States, we’ve gone many decades without adding new sources of nuclear power.”

U.S. reactors supply nearly 20% of the nation’s power. The country’s 93 nuclear generators produce more electricity annually than its more than 8,000 wind, solar and geothermal power plants combined. Dominguez said that a 24/7 energy supply may never be necessary and that having a mix of sources is important. Constellation develops solar energy along with nuclear.

“We have to develop 20 times as much solar to get the same impact as one megawatt of nuclear energy,” Dominguez said.
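The arithmetic behind such comparisons rests on capacity factor: a plant’s annual output is roughly its nameplate capacity multiplied by its capacity factor and the 8,760 hours in a year. The short Python sketch below illustrates the idea using commonly cited round figures (about 93% for U.S. nuclear, about 25% for utility-scale solar) as assumptions rather than numbers from this article; on energy output alone the ratio works out to roughly 4 to 1, and Dominguez’s higher figure presumably folds in considerations beyond raw annual energy.

```python
# Back-of-the-envelope comparison of annual output per megawatt of
# nameplate capacity. The capacity factors are assumed round figures,
# not numbers taken from the article.
HOURS_PER_YEAR = 8_760

def annual_mwh(nameplate_mw: float, capacity_factor: float) -> float:
    """Approximate annual energy output in megawatt-hours."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR

nuclear_mwh = annual_mwh(1.0, 0.93)  # ~93% capacity factor (assumed)
solar_mwh = annual_mwh(1.0, 0.25)    # ~25% capacity factor (assumed)

print(f"1 MW nuclear ≈ {nuclear_mwh:,.0f} MWh/year")
print(f"1 MW solar   ≈ {solar_mwh:,.0f} MWh/year")
print(f"Solar nameplate needed to match 1 MW of nuclear: "
      f"{nuclear_mwh / solar_mwh:.1f} MW")
```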





AI slows down some experienced software developers, study finds



By Anna Tong

SAN FRANCISCO (Reuters) - Contrary to popular belief, using cutting-edge artificial intelligence tools slowed down experienced software developers when they were working in codebases familiar to them, rather than supercharging their work, a new study found.

AI research nonprofit METR conducted the in-depth study on a group of seasoned developers earlier this year while they used Cursor, a popular AI coding assistant, to help them complete tasks in open-source projects they were familiar with.

Before the study, the open-source developers believed using AI would speed them up, estimating it would decrease task completion time by 24%. Even after completing the tasks with AI, the developers believed that they had decreased task times by 20%. But the study found that using AI did the opposite: it increased task completion time by 19%.

The study’s lead authors, Joel Becker and Nate Rush, said they were shocked by the results: prior to the study, Rush had written down that he expected “a 2x speed up, somewhat obviously.”

The findings challenge the belief that AI always makes expensive human engineers much more productive, a factor that has attracted substantial investment into companies selling AI products to aid software development.

AI is also expected to replace entry-level coding positions. Dario Amodei, CEO of Anthropic, recently told Axios that AI could wipe out half of all entry-level white collar jobs in the next one to five years.

Prior literature on productivity improvements has found significant gains: one study found that using AI sped up coders by 56%, while another found that developers were able to complete 26% more tasks in a given time.

But the new METR study shows that those gains don’t apply to all software development scenarios. In particular, this study showed that experienced developers intimately familiar with the quirks and requirements of large, established open source codebases experienced a slowdown.

Other studies often rely on software development benchmarks for AI, which sometimes misrepresent real-world tasks, the study’s authors said.

The slowdown stemmed from developers needing to spend time going over and correcting what the AI models suggested.

“When we watched the videos, we found that the AIs made some suggestions about their work, and the suggestions were often directionally correct, but not exactly what’s needed,” Becker said.

The authors cautioned that they do not expect the slowdown to apply in other scenarios, such as for junior engineers or engineers working in codebases they aren’t familiar with.

Still, the majority of the study’s participants, as well as the study’s authors, continue to use Cursor today. The authors believe it is because AI makes the development experience easier, and in turn, more pleasant, akin to editing an essay instead of staring at a blank page.

“Developers have goals other than completing the task as soon as possible,” Becker said. “So they’re going with this less effortful route.”

(Reporting by Anna Tong in San Francisco; Editing by Sonali Paul)





China’s cloud services spending hits US$11.6 billion in first quarter on AI-related demand



Alibaba Group Holding’s cloud computing unit continued to lead the industry in the March quarter, with a commanding 33 per cent market share and 15 per cent year-on-year revenue growth, Canalys data showed. Hangzhou-based Alibaba owns the South China Morning Post.

Second-ranked Huawei Technologies’ cloud business expanded its market share to 18 per cent, while posting an 18 per cent revenue increase in the same period.

Tencent Holdings’ cloud unit, meanwhile, held a 10 per cent share, but recorded limited revenue growth in the quarter owing to graphics processing unit (GPU) supply constraints and the prioritised use of these AI chips for the firm’s internal operations.

These results reflect robust domestic demand for cloud infrastructure amid a surge in AI-related activities this year, even as service providers contend with US export restrictions that limit China’s access to advanced chips used in data centres.

“Leading cloud providers are actively exploring pathways for AI adoption, unlocking capabilities and building ecosystems through model open-sourcing, while accelerating task execution and scenario delivery via AI agent platforms,” Canalys senior analyst Yi Zhang said in a report on Thursday.

Alibaba Cloud remains the leading provider of cloud infrastructure services in mainland China. Photo: Shutterstock





Artificial intelligence used to improve speed and accuracy of autism and ADHD diagnoses: IU News



A test subject completes a task by pressing a dot when it appears on a computer screen. Photo by James Brosher, Indiana University

It can take as long as 18 months for children with suspected autism spectrum or attention-deficit-hyperactivity disorders to get a diagnostic appointment with a psychiatrist in Indiana. But an interdisciplinary team led by an Indiana University researcher has developed a new diagnostic approach using artificial intelligence that could speed up and improve the detection of neurodivergent disorders.

Psychiatrists, who currently use a variety of tests and patient surveys to analyze symptoms such as communication impairments, hyperactivity or repetitive behaviors, have no widely available quantitative or biological tests to diagnose autism, ADHD or related disorders.

“The symptoms of neurodivergent disorders are very heterogeneous; psychiatrists call them ‘spectrum disorders’ because there’s no one observable thing that tells them if a person is neurotypical or not,” said Jorge José, the James H. Rudy Distinguished Professor of Physics in the College of Arts and Sciences at IU Bloomington and member of the Stark Neuroscience Research Institute at the IU School of Medicine in Indianapolis.

That’s why José — in collaboration with an interdisciplinary team of scholars, including IU School of Medicine Distinguished Professor Emeritus John I. Nurnberger and associate professor of psychiatry Martin Plawecki — dedicated his recent research to improving diagnostic tools for children with these symptoms.

A new study on the use of artificial intelligence to quickly diagnose autism and ADHD, published July 8 in Nature’s Scientific Reports, details the latest step in his team’s development of a data-driven approach to rapidly and accurately assess neurodivergent disorders using quantitative biomarkers and biometrics.

Their method — which has the potential to diagnose autism or ADHD in as little as 15 minutes — could be used in schools to triage students who might need further care, said Khoshrav Doctor, a Ph.D. student at the University of Massachusetts Amherst and former visiting research scholar at IU who has been a member of José’s team since 2016.

Both he and José said their approach is not meant to replace the role of psychiatrists in the diagnosis and treatment of neurodivergent disorders.

“It could help as an additional tool in the clinician’s toolbelt,” Doctor said. “It also gives us the ability to see who might need the quickest intervention and direct them to providers earlier.”

Finding the biomarkers

Jorge José, the James H. Rudy Distinguished Professor of Physics at Indiana University Bloomington. Photo by James Brosher, Indiana University

In 2018, José published an autism study in collaboration with Rutgers University, revealing that there are “movement biomarkers” that, while imperceptible to the naked eye, can be identified with sensors and measured for severity.

José and his team instructed a group of participants to reach for a target when it appeared on a computer touch screen in front of them. Using sensors attached to participants’ hands, researchers recorded hundreds of images of micromovements per second.

The images showed that neurotypical participants moved in a measurably different way than participants with autism. The researchers correlated increased randomness in movement with the participants who had previously been diagnosed with autism.
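The idea of measuring “randomness” in a movement trace can be pictured with a toy metric. The sketch below, in Python with synthetic traces, is an illustration of the concept only and not the biomarker computation used in the study: it splits a speed trace into a smooth trend plus fast fluctuations and reports how much of the trace’s variance the fluctuations carry.

```python
# Illustration only: a toy way to quantify how "sporadic" a movement
# trace is, not the biomarker used in the 2018 study. We split a speed
# trace into a smooth trend plus fast fluctuations and report the share
# of total variance carried by the fluctuations.
import numpy as np

def fluctuation_share(speed: np.ndarray, window: int = 11) -> float:
    """Fraction of the trace's variance due to rapid fluctuations."""
    kernel = np.ones(window) / window
    trend = np.convolve(speed, kernel, mode="same")  # moving-average trend
    resid = speed - trend                            # fast fluctuations
    return float(resid.var() / speed.var())

# Synthetic example: a smooth, bell-shaped reach versus a jerkier one.
t = np.linspace(0.0, 1.0, 500)
smooth_reach = np.sin(np.pi * t)
jerky_reach = smooth_reach + 0.2 * np.random.default_rng(0).standard_normal(t.size)

print(f"smooth reach: {fluctuation_share(smooth_reach):.3f}")  # close to 0
print(f"jerky reach:  {fluctuation_share(jerky_reach):.3f}")   # clearly larger
```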

Improving treatment

In the years since their landmark 2018 study, José and his team have taken advantage of new high-definition kinematic Bluetooth sensors to collect information not just on the velocity of study participants’ movements but also on their acceleration, rotation and many other variables.

“We’re taking a physicist’s approach to looking at the brain and analyzing movement specifically,” said IU physics graduate student Chaundy McKeever, who recently joined José’s group. “We’re looking at how sporadic the movement of a patient is. We’ve found that, typically, the more sporadic their movement, the more severe a disorder is.”

The team also introduced deep learning, a specialized area of artificial intelligence, to analyze the new measurements. Using a supervised deep-learning technique, the team studied raw movement data from participants with autism spectrum disorder, ADHD, comorbid autism and ADHD, and neurotypical development.
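The paper’s model is not reproduced here, but a supervised classifier over kinematic time series might look roughly like the following sketch. It assumes PyTorch; the channel count, window length and small 1D convolutional architecture are illustrative placeholders, and the four output classes mirror the groups named above.

```python
# Illustrative sketch of a supervised deep-learning classifier for
# movement time series; architecture and hyperparameters are assumptions,
# not the authors' implementation.
import torch
import torch.nn as nn

N_CHANNELS = 6    # e.g. 3-axis velocity + 3-axis acceleration (assumed)
WINDOW_LEN = 512  # samples per movement segment (assumed)
N_CLASSES = 4     # autism, ADHD, comorbid autism and ADHD, neurotypical

class MovementClassifier(nn.Module):
    """Small 1D CNN mapping a movement segment to one of four labels."""
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global average over time
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = MovementClassifier()
    # Random stand-ins for sensor data; real inputs would be recorded
    # velocity/acceleration/rotation traces labeled by diagnosis.
    segments = torch.randn(8, N_CHANNELS, WINDOW_LEN)
    labels = torch.randint(0, N_CLASSES, (8,))
    loss = nn.CrossEntropyLoss()(model(segments), labels)
    loss.backward()  # one supervised gradient step (optimizer omitted)
    print(f"toy training loss: {loss.item():.3f}")
```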

This enhanced method, detailed in their July 8 Scientific Reports paper, introduced an ability to better analyze a patient’s neurodivergent disorder.

“By studying the statistics of the motion fluctuations, invisible to the naked eye, we can assess the severity of a disorder in terms of a new set of biometrics,” José said. “No psychiatrist can currently tell you how serious a condition is.”

With the added ability to assess a neurodivergent disorder’s severity, health care providers can better set up and monitor the impact of their treatments.

“Some patients will need a significant number of services and specialized treatments,” José said. “If, however, the severity of a patient’s disorder is in the middle of the spectrum, their treatments can be more minutely adjusted, will be less demanding and often can be carried out at home, making their care more affordable and easier to carry out.”


