AI Insights
How to avoid a puncture on the Moon
Technology Reporter
Going back to the Moon after half a century, and then to Mars, literally means reinventing the wheel.
After all, Mars is a long way to come back from if you get a flat.
“One thing you cannot have is a puncture,” says Florent Menegaux, chief executive of the French tyre-maker Michelin.
The tough conditions on Mars have been underlined by the experience of the unmanned Curiosity rover.
Just a year after landing in 2012, its six rigid aluminium wheels were visibly ripped through with punctures and tears.
As for the Moon, the US Artemis missions aim to return astronauts there, perhaps by 2027.
Later Artemis missions plan to use a lunar rover to explore the Moon’s south pole starting with Artemis V, currently scheduled for 2030.
The Artemis astronauts will be driving much further than their Apollo forebears, who in six landings between 1969 and 1972 never ventured more than 25 miles (40km) across the Moon’s surface.
“The target is to cover 10,000 kilometres in 10 years,” says Sylvain Barthet, who runs Michelin’s lunar airless wheel programme in the central French town of Clermont-Ferrand.
“We’re not talking about short, week-long durations, we’re talking about decades of utilisation,” says Dr Santo Padula, who has a PhD in materials science and works for Nasa as an engineer at the John Glenn Research Centre in Cleveland, Ohio.
One big challenge for anyone developing technology for the Moon is the huge temperature range.
At the lunar poles, temperatures can plunge below -230C. That’s not far off absolute zero, where atoms stop moving.
And that’s a problem for tyres.
“Without atom motion you have a hard time having the material be able to deform and return,” says Dr Padula.
The tyres need to be able to deform as they go over rocks and then ping back to their original shape.
“If we permanently deform a tyre, it doesn’t roll efficiently, and we have issues with power loss,” says Dr Padula.
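To put rough numbers on that power-loss point (the figures below are illustrative assumptions, not Nasa or Michelin data): the power consumed by rolling resistance scales with the rolling-resistance coefficient, so a permanently deformed tyre that rolls less efficiently drains the rover’s limited power budget with every metre travelled.

```latex
% Illustrative only: C_rr, mass and speed are assumed values, not mission figures.
% Power lost to rolling resistance at constant speed:
%   for an assumed 800 kg rover at 2.8 m/s (10 km/h) under lunar gravity (1.62 m/s^2):
%   C_rr = 0.05 (healthy tyre)   ->  P = 0.05 * 800 * 1.62 * 2.8 ~ 180 W
%   C_rr = 0.15 (deformed tyre)  ->  P ~ 545 W, roughly three times the drain
\[
  P_{\text{loss}} = C_{rr}\, m\, g_{\text{moon}}\, v
\]
```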
The new wheels will also carry much bigger loads than the lightweight rovers Apollo astronauts cruised around in.
The next space missions will need to drive round “bigger science platforms and mobile habitats that get larger and larger”, he says.
And that will be an even heftier problem on Mars, where gravity is more than double that on the Moon.
Apollo’s lunar rovers used tyres made from zinc-coated piano wire in a woven mesh, with a range of around 21 miles.
Since extreme temperatures and cosmic rays break down rubber or turn it to a brittle glass, metal alloys and high-performance plastic are chief contenders for airless space tyres.
“In general, metallic or carbon fibre-based materials are used for these wheels,” says Pietro Baglion, team leader of the European Space Agency’s (ESA) Rosalind Franklin Mission, which aims to send its own rover to Mars by 2028.
One promising material is nitinol, an alloy of nickel and titanium.
“Fuse these and it makes a rubber-acting metal that can bend all these different ways, and it will always stretch back to its original shape,” says Earl Patrick Cole, chief executive of The Smart Tire Company.
He calls nitinol’s flexible properties “one of the craziest things you will ever see”.
Nitinol is a potentially “revolutionary” material, says Dr Padula, because the alloy also absorbs and releases energy as it changes state. It may even offer solutions for heating and refrigeration, he says.
However, Mr Barthet at Michelin thinks that a material closer to a high-performance plastic will be more suitable for tyres that need to cover long distances on the Moon.
Bridgestone has meanwhile taken a bio-mimicry approach, modelling its design on the footpads of camels.
Camels have soft, fatty footpads that disperse their weight on to a wider surface area, keeping their feet from sinking into loose sandy soil.
Inspired by that, Bridgestone is using a felt-like material for its tread, while the wheel comprises thin metal spokes that can flex.
The flexing spreads the vehicle’s weight over a larger contact area, so the rover can drive without getting stuck in the fragments of rock and dust on the Moon’s surface.
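The physics behind the camel-footpad idea is simple ground-pressure arithmetic; the numbers below are illustrative assumptions rather than Bridgestone figures.

```latex
% Ground pressure is weight over contact area: p = W / A.
% Assumed 800 kg rover under lunar gravity (1.62 m/s^2): W ~ 1300 N.
% Four stiff contact patches of 0.05 m^2 each (A = 0.2 m^2): p ~ 6.5 kPa.
% Flexible spokes doubling each patch (A = 0.4 m^2):          p ~ 3.2 kPa,
% halving the pressure pushing the wheel into the loose regolith.
\[
  p = \frac{W}{A} = \frac{m\, g_{\text{moon}}}{A}
\]
```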
Michelin and Bridgestone are each part of different consortiums that, along with California’s Venturi Astrolab, are presenting their proposed tyre tech to Nasa at the John Glenn Centre this month (May).
Nasa is expected to make a decision later this year – it might choose one proposal or adopt elements of several of them.
Meanwhile, Michelin is testing its tyres by driving a sample rover around on a volcano near Clermont, whose powdery terrain resembles the Moon’s surface.
Bridgestone is doing the same on western Japan’s Tottori Sand Dunes.
ESA is also exploring whether Europe might build a rover of its own for other missions, says Mr Barthet.
The work might have some useful applications here on Earth.
While working on his doctorate at the University of Southern California, Dr Cole joined a Nasa entrepreneurial programme to work on commercialising some of the technology from the Mars super-elastic rover tyre.
One of its first products, due this year, will be nickel-titanium bicycle tyres.
Priced around $150 (£120) each, the tyres are much more expensive than regular ones, but would be extremely durable.
He also plans to work this year on durable tyres for motorbikes, aimed at areas with rough roads.
For all this, his “dream” remains to play a part in humanity’s return to the Moon.
“So, I can tell my kids, look up there on the Moon,” he says. “Daddy’s tyres are up there.”
AI Insights
Real or AI: Band confirms use of artificial intelligence for its music on Spotify
The Velvet Sundown, seemingly a four-person band, has garnered a lot of attention on Spotify. It started posting music on the platform in early June and has since released two full albums, with a few more singles and another album coming soon. Naturally, listeners began to accuse the band of being an AI-generated project, which, as it now turns out, is true.
The band, or music project, called The Velvet Sundown has over a million monthly listeners on Spotify. That’s an impressive debut considering its first album, “Floating on Echoes,” hit the music streaming platform on June 4. Then, on June 19, a second album, “Dust and Silence,” was added to the library. Next week, July 14, will mark the release of a third album, “Paper Sun Rebellion.” Since the debut, listeners have accused the band of being an AI-generated project, and the owners of the project have now updated its Spotify bio to call it a “synthetic music project guided by human creative direction, and composed, voiced, and visualized with the support of artificial intelligence.”
It goes on to state that this project challenges the boundaries of “authorship, identity, and the future of music itself in the age of AI.” The owners claim that the characters, stories, music, voices, and lyrics are “original creations generated with the assistance of artificial intelligence tools,” but it is unclear to what extent AI was involved in the development process.
The band art shows four individuals, suggesting a human line-up, but the images are likely AI-generated as well. Interestingly, Andrew Frelon (a pseudonym) initially claimed to be the owner of the AI band, but later confirmed that was untrue and that he had pretended to run its Twitter account because he wanted to insert an “extra layer of weird into this story.”
As it stands now, The Velvet Sundown’s music is available on Spotify, with the new album releasing next week. Whether this unveiling causes a spike or a decline in monthly listeners remains to be seen.
AI Insights
How to Choose Between Deploying an AI Chatbot or Agent
In artificial intelligence, the trend du jour is AI agents, or algorithmic bots that can autonomously retrieve data and act on it.
AI Insights
Do AI systems socially interact the same way as living beings?
Key takeaways
- A new study comparing biological brains with artificial intelligence systems analyzed the neural network patterns that emerged during social and non-social tasks in mice and in programmed artificial intelligence agents.
- UCLA researchers identified high-dimensional “shared” and “unique” neural subspaces both when mice interacted socially and when AI agents engaged in social behaviors.
- Findings could help advance understanding of human social disorders and develop AI that can understand and engage in social interactions.
As AI systems are increasingly integrated into everyday life, from virtual assistants and customer service agents to counseling tools and AI companions, an understanding of social neural dynamics is essential for both scientific and technological progress. A new study from UCLA researchers shows that biological brains and AI systems develop remarkably similar neural patterns during social interaction.
The study, recently published in the journal Nature, reveals that when mice interact socially, specific brain cell types synchronize in “shared neural spaces,” and that artificial intelligence agents develop analogous patterns when engaging in social behaviors.
The new research represents a striking convergence of neuroscience and artificial intelligence, two of today’s most rapidly advancing fields. By directly comparing how biological brains and AI systems process social information, scientists can now better understand fundamental principles that govern social cognition across different types of intelligent systems. The findings could advance understanding of social disorders like autism while simultaneously informing the development of more sophisticated, socially aware AI systems.
This work was supported in part by the National Science Foundation, the Packard Foundation, the Vallee Foundation, the Mallinckrodt Foundation and the Brain and Behavior Research Foundation.
Examining AI agents’ social behavior
A multidisciplinary team from UCLA’s departments of neurobiology, biological chemistry, bioengineering, electrical and computer engineering, and computer science across the David Geffen School of Medicine and UCLA Samueli School of Engineering used advanced brain imaging techniques to record activity from molecularly defined neurons in the dorsomedial prefrontal cortex of mice during social interactions. The researchers developed a novel computational framework to identify high-dimensional “shared” and “unique” neural subspaces across interacting individuals. The team then trained artificial intelligence agents to interact socially and applied the same analytical framework to examine neural network patterns in AI systems that emerged during social versus non-social tasks.
The research revealed striking parallels between biological and artificial systems during social interaction. In both mice and AI systems, neural activity could be partitioned into two distinct components: a “shared neural subspace” containing synchronized patterns between interacting entities, and a “unique neural subspace” containing activity specific to each individual.
Remarkably, GABAergic neurons, the inhibitory brain cells that regulate neural activity, showed significantly larger shared neural spaces compared with glutamatergic neurons, which are the brain’s primary excitatory cells. This represents the first investigation of inter-brain neural dynamics in molecularly defined cell types, revealing previously unknown differences in how specific neuron types contribute to social synchronization.
When the same analytical framework was applied to AI agents, shared neural dynamics emerged as the artificial systems developed social interaction capabilities. Most importantly, when researchers selectively disrupted these shared neural components in artificial systems, social behaviors were substantially reduced, providing direct evidence that synchronized neural patterns causally drive social interactions.
The study also revealed that shared neural dynamics don’t simply reflect coordinated behaviors between individuals, but emerge from representations of each other’s unique behavioral actions during social interaction.
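The release does not spell out the computational framework, but the shared/unique split can be illustrated with a minimal Python sketch on simulated data. Here canonical correlation analysis (CCA) stands in for whatever method the authors actually used, and every variable name and number below is an assumption, not taken from the paper.

```python
# Illustrative sketch only: approximates the "shared vs unique subspace" idea
# with canonical correlation analysis between two agents' neural activity.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Simulated activity: 1000 time points x 50 neurons per animal/agent.
# A low-dimensional latent drives both (the "shared" part),
# plus private noise (the "unique" part).
T, N, D_SHARED = 1000, 50, 3
latent = rng.standard_normal((T, D_SHARED))
brain_a = latent @ rng.standard_normal((D_SHARED, N)) + 0.5 * rng.standard_normal((T, N))
brain_b = latent @ rng.standard_normal((D_SHARED, N)) + 0.5 * rng.standard_normal((T, N))

# Find the maximally correlated (shared) dimensions across the two brains.
cca = CCA(n_components=D_SHARED)
shared_a, shared_b = cca.fit_transform(brain_a, brain_b)

# The "unique" subspace is what remains after regressing out the shared part.
coef_a = np.linalg.lstsq(shared_a, brain_a, rcond=None)[0]
unique_a = brain_a - shared_a @ coef_a

corrs = [np.corrcoef(shared_a[:, i], shared_b[:, i])[0, 1] for i in range(D_SHARED)]
print("cross-brain correlation of shared dimensions:", np.round(corrs, 2))
```

In this toy setting, shuffling `shared_a` in time would collapse the cross-brain correlation, loosely mirroring the perturbation experiment in which disrupting shared components reduced social behavior.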
“This discovery fundamentally changes how we think about social behavior across all intelligent systems,” said Weizhe Hong, professor of neurobiology, biological chemistry and bioengineering at UCLA and lead author of the new work. “We’ve shown for the first time that the neural mechanisms driving social interaction are remarkably similar between biological brains and artificial intelligence systems. This suggests we’ve identified a fundamental principle of how any intelligent system — whether biological or artificial — processes social information. The implications are significant for both understanding human social disorders and developing AI that can truly understand and engage in social interactions.”
Continuing research for treating social disorders and training AI
The research team plans to further investigate shared neural dynamics in different, and potentially more complex, social interactions. They also aim to explore how disruptions in shared neural space might contribute to social disorders, and whether therapeutic interventions could restore healthy patterns of inter-brain synchronization. The artificial intelligence framework may serve as a platform for testing hypotheses about social neural mechanisms that are difficult to examine directly in biological systems, and as a basis for developing methods to train socially intelligent AI.
The study was led by UCLA’s Hong and Jonathan Kao, associate professor of electrical and computer engineering. Co-first authors Xingjian Zhang and Nguyen Phi, along with collaborators Qin Li, Ryan Gorzek, Niklas Zwingenberger, Shan Huang, John Zhou, Lyle Kingsbury, Tara Raam, Ye Emily Wu and Don Wei contributed to the research.