AI Insights
Google’s Nuclear Energy Bet Aims to Fuel Gemini and Hit Net-Zero Emissions by 2030

The rapid advancement of artificial intelligence (AI) is reshaping industries and society as a whole. While this technology promises to enhance productivity and innovation, it also demands significant energy resources. As AI models become more sophisticated, the energy consumption of data centers housing these models has skyrocketed. In response, tech giants are exploring alternative energy sources, including nuclear power, to meet these demands. Among these companies is Google, whose CEO, Sundar Pichai, has expressed interest in nuclear energy as a means to power their expansive data operations.
Exploring New Investments in Energy Sources
Google’s commitment to achieving net-zero emissions across its operations and value chain by 2030 underscores its dedication to sustainability. However, the company’s significant investments in AI present additional challenges in meeting this goal. Sundar Pichai acknowledges the ambitious nature of this target and highlights the necessity of exploring diverse energy options. In a recent interview, Pichai mentioned considering investments in solar energy and evaluating new technologies like small modular nuclear reactors. These efforts reflect Google’s proactive approach to balancing technological advancement with environmental responsibility.
Leading the AI Race
Google stands as a formidable player in the AI landscape, competing directly with entities like OpenAI, the creator of ChatGPT. Google’s offering, Gemini, is a testament to its capabilities in AI development. The company continues to integrate Gemini into its product ecosystem, enhancing its offerings with cutting-edge AI technologies. Sundar Pichai emphasizes the importance of substantial early-stage investments during major platform shifts, which is evident in Google’s ongoing commitment to AI innovation. This strategic approach enables Google to refine efficiency and maintain its leadership in the AI sector.
The Energy Demands of AI
The increasing energy requirements of AI are a significant concern for tech companies. As AI models grow more complex, the demand for computational power—and consequently, energy—intensifies. This has led companies like Google to consider unconventional energy sources, such as nuclear power, to ensure sustainable growth. By exploring nuclear options, Google aims to secure a reliable energy supply for its data centers, which are crucial for AI training and deployment. This shift could set a precedent for other tech companies seeking sustainable solutions to power their AI ambitions.
Balancing Innovation and Sustainability
Google’s exploration of nuclear energy highlights the broader challenge of balancing technological innovation with environmental sustainability. As AI continues to evolve, the demand for energy-efficient solutions becomes increasingly critical. By investing in nuclear and other alternative energy sources, Google demonstrates its commitment to addressing this challenge head-on. The company’s efforts to reduce its carbon footprint while advancing AI technologies serve as a model for the industry. Ultimately, Google’s journey reflects the broader imperative for tech companies to innovate sustainably and responsibly.
As Google pioneers new energy solutions to support its AI endeavors, the tech world watches closely. The integration of nuclear power into its energy strategy could revolutionize how data centers are powered, setting a new standard for the industry. Will other tech giants follow suit and explore nuclear energy to fuel their AI advancements, or will they find alternative paths to sustainability?
‘It is a war of drones now’: the ever-evolving tech dominating the frontline in Ukraine

“It’s more exhausting,” says Afer, a deputy commander of the “Da Vinci Wolves”, describing how one of the best-known battalions in Ukraine has to defend against constant Russian attacks. Where once the invaders might have tried small group assaults with armoured vehicles, now the tactic is to try and sneak through on foot one by one, evading frontline Ukrainian drones, and find somewhere to hide.
Under what little cover remains, survivors then try to gather a group of 10 or so and attack Ukrainian positions. It is costly – “in the last 24 hours we killed 11,” Afer says – but the assaults that previously might have happened once or twice a day are now relentless. To the Da Vinci commander it seems that the Russians are terrified of their own officers, which is why they follow near suicidal orders.
Reconnaissance drones monitor a burnt-out tree line west of Pokrovsk; the images come through to Da Vinci’s command centre at one end of a 130-metre-long underground bunker. “It’s very dangerous to have even a small break on watching,” Afer says, and the team works round the clock. The bunker, built in four or five weeks, contains multiple rooms, including a barracks for sleep. Another is an army mess with children’s drawings, reminders of family. The menu for the week is listed on the wall.
It is three and a half years into the Ukraine war and Donald Trump’s August peace initiative has made no progress. Meanwhile the conflict evolves. Afer explains that such is the development of FPV (first person view) drones, remotely piloted using an onboard camera, that the so-called kill zone now extends “12 to 14 kilometres” behind the front – the range at which a $500 drone, flying at up to 60mph, can strike. It means, Afer adds, that “all the logistics [food, ammunition and medical supplies] we are doing is either on foot or with the help of ground drones”.
Further in the rear, at a rural dacha now used by Da Vinci’s soldiers, several types of ground drones are parked. The idea has moved rapidly from concept to trial to reality. They include remotely controlled machine guns, and flat bed robot vehicles. One, the $12,000 Termit, has tracks for rough terrain and can carry 300kg over 12 miles with a top speed of 7 miles an hour.
Land drones save lives too. “Last night we evacuated a wounded man with two broken legs and a hole in his chest,” Afer continues. The whole process took “almost 20 hours” and involved two soldiers lifting the wounded man more than a mile to a land drone, which was able to cart the victim to a safe village. The soldier survived.
While Da Vinci reports its position is stable, endless Russian attempts at infiltration have been effective at revealing where the line is thinly held or poorly coordinated between neighbouring units. Russian troops last month penetrated Ukraine’s lines north-east of Pokrovsk near Dobropillya by as much as 12 miles – a dangerous moment in a critical sector, just ahead of Trump’s summit with Vladimir Putin in Alaska.
At first it was said a few dozen had broken through, but the final tally appears to have been much greater. Ukrainian military sources estimate that 2,000 Russians got through and that 1,100 infiltrators were killed in a fightback led by the 14th Chervona Kalyna brigade from Ukraine’s newly created Azov Corps – a rare setback for an otherwise slow but remorseless Russian advance.
That evening at another dacha used by Da Vinci, people linger in the yard while moths target the light bulbs. Inside sits the unit’s jamming station: seven screens arranged in a fan around a gaming chair.
It is too sensitive to photograph, but the team leader Oleksandr, whose call sign is Shoni, describes the task of the drone-jamming specialist, who sits in a gaming chair surrounded by seven screens arranged in a fan and supported by some complex carpentry. Both sides can intercept each other’s feeds from FPV drones, and three screens are dedicated to capturing footage that can then help to locate them. Once discovered, the operator’s task is to find the radio frequency the drone is using and immobilise it with jammers hidden in the ground (unless, that is, they are fibre optic drones that use a fixed cable up to 12 miles long instead of a radio connection).
“We are jamming around 70%,” Shoni says, though he acknowledges that the Russians achieve a similar success rate. In their sector, this amounts to 30 to 35 enemy drones a day. At times, the proportion downed is higher. “During the last month, we closed the sky. We intercepted their pilots saying on the radio they could not fly,” he continues, but that changed after Russian artillery destroyed jamming gear on the ground. The battle, Shoni observes, ebbs and flows: “It is a war of drones now and there is a shield and there is a sword. We are the shield.”
A single drone pilot can fly 20 missions in 24 hours, says Sean, who pilots FPVs for Da Vinci for several days at a stretch in a crew of two or three, hidden a few miles behind the frontline. Because the Russians are on the attack, the main target is their infantry. Sean frankly acknowledges he is “killing at least three Russian soldiers” during that time, in the deadly struggle between ground and air. Does killing from a distance make it easier? “How can we tell, we only know this,” says Dubok, another FPV pilot, sitting alongside Sean.
Other anti-drone defences are more sophisticated. Ukraine’s third brigade holds the northern Kharkiv sector, east of the Oskil River, but to the west are longer-range defence positions. Inside, a team member watches over a radar, mostly looking for signs of Russian Supercam, Orlan and Zala reconnaissance drones. If they see a target, two of the crew dash out into fields ripe with sunflowers to launch an Arbalet interceptor: a small delta-wing drone made of black polystyrene, which costs $500 and can be held in one hand.
The Arbalet’s top speed is a remarkable 110 miles an hour, though its battery life is a shortish 40 minutes. It is flown by a pilot hidden in the bunker via its camera using a sensitive hobbyists’ controller. The aim is to get it close enough to explode the grenade it carries and destroy the Russian drone. Buhan, one of the pilots, says “it is easier to learn how to fly it if you have never flown an FPV drone”.
It is an unusually wet and cloudy August day, which means a rare break from drone activity as the Russians will not be flying in the challenging conditions. The crew don’t want to launch the Arbalet in case they lose it, so there is time to talk. Buhan says he was a trading manager before the war, while Daos worked in investments. “I would have had a completely different life if it had not been for the war,” Daos continues, “but we all need to gather to fight to be free.”
So do the pilots feel motivated to carry on fighting when there appears to be no end? The two men look in my direction, and nod with a resolution not expressed in words.
Tech giants pay talent millions of dollars

Meta CEO Mark Zuckerberg offered $100 million signing bonuses to top OpenAI employees.
The artificial intelligence arms race is heating up, and as tech giants scramble to come out on top, they’re dangling millions of dollars in front of a small pool of specialists in what’s become known as the AI talent war.
Big Tech firms like Meta, Microsoft, and Google are competing for top AI researchers in an effort to bolster their artificial intelligence divisions and dominate the multibillion-dollar market.
Meta CEO Mark Zuckerberg recently embarked on an expensive hiring spree to beef up the company’s new AI Superintelligence Labs. This included poaching Scale AI co-founder Alexandr Wang as part of a $14 billion investment into the startup.
OpenAI’s Chief Executive Sam Altman, meanwhile, recently said the Meta CEO had tried to tempt top OpenAI talent with $100 million signing bonuses and even higher compensation packages.
Google is also a player in the talent war, tempting Varun Mohan, co-founder and CEO of artificial intelligence coding startup Windsurf, to join Google DeepMind in a $2.4 billion deal. Microsoft AI, meanwhile, has quietly hired two dozen Google DeepMind employees.
“In the software engineering space, there was an intense competition for talent even 15 years ago, but as artificial intelligence became more and more capable, the [number of] researchers and engineers that are specialized in this area has stayed relatively stable,” Alexandru Voica, head of corporate affairs and policy at AI video platform Synthesia, told CNBC Make It.
“You have this supply and demand situation where the demand now has skyrocketed, but the supply has been relatively constant, and as a result, there’s the [wage] inflation,” Voica, a former Meta employee and currently a consultant at the Mohamed bin Zayed University of Artificial Intelligence, added.
Voica said the multi-million dollar compensation packages are a phenomenon the industry has “never seen before.”
Here’s what’s behind the AI talent war:
Building AI models costs billions
The inflated salaries for specialists come hand-in-hand with the billion-dollar price tags of building AI models — the technology behind your favorite AI products like ChatGPT.
There are different types of AI companies. Some, like Synthesia, Cohere, Replika, and Lovable, build products; others, including OpenAI, Anthropic, Google, and Meta, build and train large language models.
“There’s only a handful of companies that can afford to build those types of models,” Voica said. “It’s very capital-intensive. You need to spend billions of dollars, and not a lot of companies have billions of dollars to spend on building a model. And as a result, those companies, the way they approach this is: ‘If I’m going to spend a billion dollars to build a model, $10 million for an engineer is a relatively low investment.'”
Anthropic’s CEO Dario Amodei told Time Magazine in 2024 that he expected the cost of training frontier AI models to be $1 billion that year.
Stanford University’s AI Institute recently produced a report that showed the estimated cost of building select AI models between 2019 and 2024. OpenAI’s GPT-4 cost $79 million to build in 2023, for example, while Google’s Gemini 1.0 Ultra was $192 million. Meta’s Llama 3.1-405B cost $170 million to build in 2024.
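The logic behind Voica’s “$10 million for an engineer is a relatively low investment” line can be checked against the training-cost estimates above. A back-of-the-envelope sketch (using only the figures cited in this article; the $10 million package is the illustrative number from Voica’s quote, not a reported offer):

```python
# Reported model training cost estimates cited above (Stanford report figures),
# compared against an illustrative $10M engineer compensation package.
training_costs = {
    "GPT-4 (2023)": 79_000_000,
    "Gemini 1.0 Ultra": 192_000_000,
    "Llama 3.1-405B (2024)": 170_000_000,
}
engineer_package = 10_000_000  # the "$10 million for an engineer" figure from the quote

for model, cost in training_costs.items():
    share = engineer_package / cost * 100
    print(f"{model}: one $10M hire is {share:.1f}% of the training bill")
```

Even against the cheapest model on the list, a single eight-figure hire amounts to roughly an eighth of the training budget, and for the larger models only around 5%, which is the proportion Voica’s argument rests on.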
“Companies that build products pay to use these existing models and build on top of them, so the capital expenditure is lower and there isn’t as much pressure to burn money,” Voica said. “The space where things are very hot in terms of salaries are the companies that are building models.”
AI specialists are in demand
The average salary for a machine learning engineer in the U.S. is $175,000 in 2025, per Indeed data.
Machine learning engineers are the AI professionals who can build and train these large language models — and demand for them is high on both sides of the Atlantic, Ben Litvinoff, associate director at technology recruitment company Robert Walters, said.
“There’s definitely a heavy increase in demand with regards to both AI-focused analytics and machine learning in particular, so people working with large language models and people deploying more advanced either GPT-backed or more advanced AI-driven technologies or solutions,” Litvinoff explained.
This includes a “slim talent pool” of experienced specialists who have worked in the industry for years, he said, as well as AI research scientists who have completed PhDs at the top five or six universities in the world and are being snapped up by tech giants upon graduating.
It’s leading to mega pay packets, with Zuckerberg reportedly offering $250 million to Matt Deitke, a 24-year-old AI researcher who dropped out of a computer science doctoral program at the University of Washington.
Meta directed CNBC to Zuckerberg’s comments to The Information, where the Facebook founder said there’s an “absolute premium” for top talent.
“A lot of the specifics that have been reported aren’t accurate by themselves. But it is a very hot market. I mean, as you know, and there’s a small number of researchers, which are the best, who are in demand by all of the different labs,” Zuckerberg told the tech publication.
“The amount that is being spent to recruit the people is actually still quite small compared to the overall investment and all when you talk about super intelligence.”
Litvinoff estimated that, in the London market, machine learning engineers and principal engineers are currently earning six-figure salaries ranging from £140,000 to £300,000 for more senior roles, on average.
In the U.S., the average salary for a machine learning engineer is $175,000, reaching nearly $300,000 at the higher end, according to Indeed.
Startups and traditional industries get left behind
As tech giants continue to guzzle up the best minds in AI with the lure of mammoth salaries, there’s a risk that startups get left behind.
“Some of these startups that are trying to compete in this space of building models, it’s hard to see a way forward for them, because they’re stuck in the space of: the models are very expensive to build, but the companies that are buying those models, I don’t know if they can afford to pay the prices that cover the cost of building the model,” Voica noted.
Mark Miller, founder and CEO of Insurevision.ai, recently told Startups Magazine that this talent war was also creating a “massive opportunity gap” in traditional industries.
“Entire industries like insurance, healthcare, and logistics can’t compete on salary. They need innovation but can’t access the talent,” Miller said. “The current situation is absolutely unsustainable. You can’t have one industry hoarding all the talent while others wither.”
Voica said AI professionals will have to make a choice. While some will take Big Tech’s higher salaries and bureaucracy, others will lean towards startups, where salaries are lower, but staff have more ownership and impact.
“In a large company, you’re essentially a cog in a machine, whereas in a startup, you can have a lot of influence. You can have a lot of impact through your work, and you feel that impact,” Voica said.
Until the price of building AI models comes down, however, the high salaries for AI talent are likely to remain.
“As long as companies will have to spend billions of dollars to build the model, they will spend tens of millions, or hundreds of millions, to hire engineers to build those models,” Voica added.
“If all of a sudden tomorrow, the cost to build those models decreases by 10 times, the salaries I would expect would come down as well.”