Education
Cost of children’s homes doubles but care can be poor

Alison Holt, Social affairs editor, and
James Melley and Judith Burns

The cost of residential care for vulnerable children in England has nearly doubled in five years but many children still do not receive appropriate care, says a report from the independent public spending watchdog.
The National Audit Office (NAO) says councils on average spent £318,400 on each child placed in a children’s home in the year ending March 2024.
But these huge sums do not represent value for money, the report concludes.
“I do not know where the money is being spent,” says Ezra Quinton, now 20, who recalls smashed windows and broken glass in the showers of one of the care homes he was placed in.
Ezra, who now works for Become, a care leavers’ charity, first went into care aged nine.
Originally from Greater Manchester, he remembers being moved to a different home every few months, often many miles from where he originally lived.
He thinks he had up to 60 different placements and although he has spent most of his life in Salford and Stockport, he has lived in Wales, Liverpool, Crewe and Leeds.
His education was considerably disrupted but he did achieve C grades in all of his GCSEs.
At one home the windows were boarded up after being smashed.
“We were told to wear shoes if we wanted to shower because they didn’t clean up the glass properly,” he told BBC News.

The NAO report found rising costs were driven by a record number of children in care, the increasing complexity of their needs – and a profit-driven market.
In 2023-24 councils spent £3.1bn on residential placements, in a market the report describes as “dysfunctional”.
It says councils are struggling to find enough appropriate placements, arguing that this allows many private care providers to cherry-pick the children they take, based on how much support they need and how much profit that allows.
The report draws on previous research which showed the 15 largest providers of children’s homes making average profits of more than 22%.
Report author Emma Wilson says several factors contribute to rising costs, but with the overwhelming majority (84%) of children’s homes run for profit: “It’s really important to get right that balance between supply of available care home places and demand.”
She wants the Department for Education to do more to oversee a market which she says is failing children in residential care.
“The NAO report concludes that the system of residential care for looked after children is not delivering value for money. On the one hand, costs have doubled to over three billion in the last five years, whilst many children are not in appropriate settings,” Ms Wilson told BBC News.
The report highlights how in March 2024 two thirds of children in residential care were in homes outside their local authority and almost half (49%) were more than 20 miles from home.
The Department for Education said in a statement: “Vulnerable children across the country have long been let down by years of drift and neglect in children’s social care, which this report lays bare.”
It added that it was “driving the largest ever reform of children’s social care” to “break the cycle of crisis for children” – pointing to its planned recruitment of more family help workers and new legislation aimed at ending profiteering in care homes.

Claire Bracey, interim chief executive of Become, says the report “is once again lifting the lid on the extortionate profits that are being made from providing homes for our most vulnerable children”.
“This market failure is leading to the most unforgivable failure [for] the futures of the children in our care…
“Children in care can’t wait. Urgent steps must be taken now,” she argues.
But some small, privately run, children’s homes insist they don’t make excessive profits.
Sara Milner, who set up Cherry Wood children’s home in Surrey four years ago, after a career in local authority care, says staffing accounts for 80% of costs.
“The fees we charge the local authority are reflective of our direct costs and we make moderate margins… but obviously we have to be able to make profits to be a viable business and to offer security for the young people’s future which is obviously really important when you’re doing this type of work,” she told BBC News.
With demand for places high, she had also hoped to invest in a second children’s home, but says current pressures, including rising costs and difficulties recruiting staff, mean that has been delayed.

The government has already said it plans to limit the profits private companies can make. However, the Children’s Homes Association, which represents providers paying tax in the UK, argues that council-run homes can in fact be more expensive.
“We know that official data shows that local authority costs are higher,” said the association’s chief executive Mark Kerr.
“So if there’s a value for money question then the independent sector arguably demonstrates more value for money than local authorities,” he added.
Education
OpenAI inks deal with Greece for AI innovation in schools

Greece and OpenAI have signed a Memorandum of Understanding (MoU) to expand the integration of artificial intelligence (AI) in the country, with a focus on education and utility for small and medium-sized enterprises (SMEs).
Dubbed “OpenAI for Greece,” the collaboration between Greece and OpenAI was signed at the Hellenic Expo event, with key government officials in attendance. Prime Minister Kyriakos Mitsotakis, Onassis Foundation President Anthony Papadimitriou, and OpenAI Chief Global Affairs Officer Chris Lehane inked their signatures on the document.
Greece will pioneer the mainstream use of ChatGPT Edu, a tailor-made version of the AI chatbot for educational institutions. The MoU will back a phased pilot, starting in the next academic session, to integrate ChatGPT Edu into Greece’s educational system.
Under the first phase, authorities will focus on improving AI literacy among students and teachers in select institutions, with the second phase featuring a nationwide rollout. The Onassis Foundation will lead the implementation of ChatGPT Edu with The Tipping Point, which will be handling teacher onboarding.
“From Plato’s Academy to Aristotle’s Lyceum—Greece is the historical birthplace of western education,” said Lehane. “Today, with millions of Greeks using ChatGPT on a regular basis, the country is once again showing its dedication to learning and ideas.”
Going forward, a joint task force comprising representatives from the Prime Minister’s Office and the Ministry of Education will supervise the pilot project. Meanwhile, OpenAI will provide technical support and co-design a teacher training manual focused on safety and productivity.
Launched in 2024, ChatGPT Edu has been adopted at leading universities, including Harvard and Oxford. OpenAI executives are confident of wider adoption in Europe, given the product’s GDPR compliance, while it offers students and teachers access to OpenAI’s latest models.
MoU poised to support the local startup ecosystem
Besides pushing to improve the local educational landscape, the MoU will launch the Greek AI Accelerator Program. The program will support Greek firms that are starting to build AI products and emerging technologies in partnership with Endeavor Greece.
Successful firms will have access to OpenAI technology and credits while receiving technical mentorship from OpenAI engineers. Furthermore, the program will offer tailored workshops on regulatory compliance and global safety standards while providing international exposure to local firms.
Before the MoU with OpenAI, Greece had taken early steps with AI to future-proof key industries. The country rolled out a national blueprint for AI while backing an AI-based platform to stifle the spread of fake news in Greek cyberspace.
Indonesia unveils evaluation mechanism for responsible AI development
Months after launching a Center of Excellence for Artificial Intelligence (AI), Indonesian authorities have developed an evaluation mechanism to guide service providers toward safe AI innovation.
According to a report by local news outlet Antara, the evaluation mechanism is the brainchild of the Ministry of Communications and Digital Affairs. The newly minted mechanism is an attempt by the Ministry to ensure that AI innovation remains aligned with ethics and international best practices.
At the moment, Indonesia’s AI Ethics Guideline is still under development, but self-assessments by local AI developers, submitted via an incident reporting system, provided the Ministry with data for the evaluation mechanism.
The Ministry’s Director of AI and New Technology Ecosystems, Aju Widya Sari, disclosed that the ethical guidelines will promote inclusiveness, safety, transparency, and accessibility, and that the evaluation mechanism will reflect those guidelines, allowing AI developers to operate under the highest global standards.
“The evaluation of the ethical guidelines will be carried out gradually to ensure ethical and responsible AI governance,” said Sari.
Though not expressly stated, the evaluation mechanism is expected to involve a multi-layered process with rigorous checks from pre-deployment to post-deployment. Furthermore, pundits opine that the mechanism will feature metrics to measure fairness, transparency, privacy, and safety.
Sari disclosed that the evaluation mechanism will bring Indonesia economic, social, and environmental sustainability benefits. Indonesia has already launched an AI Center of Excellence to boost adoption while maintaining a keen focus on public safety.
Furthermore, the Southeast Asian country has made a significant play to deepen its talent pool for emerging technologies via a raft of initiatives.
AI regulation sweeps through the ecosystem
Amid the global push for AI innovation, attempts at regulation have gathered significant steam over the last year. While the EU has surged ahead with a regulatory playbook, other regions are adopting a cautious stance on AI rules for service providers.
Japan has adopted a friendly stance, while Switzerland is opting to remain neutral toward regulations, allowing the ecosystem to develop at its own pace. Meanwhile, UNESCO is riding a wave of international collaborations to promote ethical AI development, signing MoUs with Jamaica, Bangladesh, and the Netherlands.
Watch | Treechat AI: Empowering Super Creators
Education
Medical education needs rigorous trials to validate AI’s role

The use of artificial intelligence in medical education is raising urgent questions about quality, ethics, and oversight. A new study explores how AI is being deployed in training programs and the risks of implementing poorly validated systems.
The research, titled “Artificial Intelligence in Medical Education: A Narrative Review on Implementation, Evaluation, and Methodological Challenges” and published in AI in 2025, presents a sweeping review of the ways AI is already embedded in undergraduate and postgraduate medical education. It highlights both the opportunities offered by AI-driven tutoring, simulations, diagnostics, and assessments, and the methodological and ethical shortcomings that could undermine its long-term effectiveness.
How AI is being implemented in medical education
The study found four major areas where AI is reshaping medical training: tutoring and content generation, simulation and practice, diagnostic skill-building, and competency assessment.
AI-driven tutoring is already gaining ground through large language models such as ChatGPT, which generate quizzes, exam preparation tools, and academic writing support. These tools have been shown to improve student engagement and test performance. However, the research underscores that such systems require constant human supervision to prevent factual errors and discourage students from outsourcing critical thinking.
Simulation and practice environments are another area of rapid development. Machine learning and virtual reality platforms are being deployed to train students in surgery, anesthesia, and emergency medicine. These systems deliver real-time performance feedback and can differentiate between novice and expert performance. Yet challenges persist, including scalability issues, lack of interpretability, and concerns that students may lose self-confidence if they rely too heavily on automated guidance.
Diagnostic training has also been revolutionized by AI. In specialties such as radiology, pathology, dermatology, and ultrasound, AI systems often outperform students in visual recognition tasks. While this demonstrates significant potential, the study warns that biased datasets and privacy concerns linked to biometric data collection could reinforce inequities. Over-reliance on automated diagnosis also risks weakening clinical judgment.
Competency assessment is the fourth area of innovation. Deep learning and computer vision tools now enable objective and continuous evaluation of motor, cognitive, and linguistic skills. They can identify expertise levels, track errors, and deliver adaptive feedback. Still, most of these tools suffer from limited validation, lack of generalizability across contexts, and weak clinical integration.
What risks and challenges are emerging
Enthusiasm for AI must be tempered by a recognition of its limitations, the study asserts. Methodologically, fewer than one-third of published studies rely on randomized controlled trials. Many evaluations are exploratory, small-scale, or short-term, limiting the evidence base for AI’s real impact on education.
There are also risks of passive learning. When students turn to AI systems for ready-made solutions, they may bypass the critical reasoning that medical training is designed to foster. This dynamic raises concerns about the erosion of clinical decision-making skills and the creation of over-dependent learners.
Ethical challenges are equally pressing. Training data for AI systems is often incomplete, unrepresentative, or biased, leading to disparities in how well these tools perform across different populations. Compliance with privacy frameworks such as GDPR remains inconsistent, especially when biometric or sensitive patient data is used in educational platforms. Unequal access to AI resources also risks widening the gap between well-resourced and low-resource institutions, exacerbating inequalities in global medical training.
The study also highlights gaps in faculty preparedness. Many educators lack sufficient AI literacy, leaving them unable to properly supervise or critically evaluate AI-assisted teaching. This threatens to create an uneven landscape in which some institutions adopt AI thoughtfully while others deploy it without adequate safeguards.
What must be done to ensure responsible adoption
The study provides a clear roadmap for addressing these challenges. At its core is the principle of human-in-the-loop supervision. AI should complement but never replace instructors, ensuring that students continue to develop critical reasoning alongside digital support.
The authors call for more rigorous research designs. Longitudinal, multicenter studies and randomized controlled trials are needed to generate evidence that is both reliable and generalizable. Without such studies, AI’s promise in medical education remains speculative.
Curriculum reform is another priority. AI literacy, ethics, and critical appraisal must become standard components of medical training so that students can understand not only how to use AI but also how to question and evaluate it. Educators, too, require training to guide responsible use and prevent misuse.
Finally, the study presses for inclusivity. Access to AI-driven tools must be extended to low-resource settings, ensuring that medical education worldwide benefits from innovation rather than reinforcing divides. Regulatory frameworks should also evolve to cover privacy, fairness, and accountability in AI-assisted learning.
Education
Children and teenagers share impact of pandemic in new report

Branwen JeffreysEducation Editor and
Erica Witherington

When lockdown started, college student Sam was living with his mum because his parents were separated.
Then his dad died unexpectedly, leaving him feeling that “something had been stolen” from him.
His experience is one of many being highlighted as the Covid-19 public inquiry prepares to look at the pandemic’s impact on children and young people.
A new report – seen exclusively by the BBC – includes individual accounts of 600 people who were under 18 during the pandemic.
They include happy memories of time spent with family, as well as the disruption caused by schooling moving online, social isolation and the loss of relatives.
The inquiry will start hearing evidence on these issues from Monday 29 September.
‘I lost a relationship’

Wigan resident Sam was 12 during the first lockdowns and says he found it hard to understand the rules that prevented him spending more time with his dad.
His dad’s death left him struggling with regrets that he had “lost a relationship” because of the isolation before his father’s death.
“I do feel deep down that something has been stolen from me,” he says.
“But I do know that the procedures that we had to go through were right. It was a bad situation.”
Sam, now 17, has sadly had his resilience tested further by the loss of his mum, who recently died from cancer.
But Sam says that strength he built up during Covid has helped give him “the tools to deal with grief alone”.
‘Trying to catch up on the lost moments’
Kate Eisenstein, who is part of the team leading the inquiry, says the pandemic was a “life-changing set of circumstances” for the children and teenagers who lived through it.
The impact of the pandemic set out in the testimony is hugely varied and includes happier memories from those who flourished in secure homes, enjoying online learning.
Other accounts capture the fears of children in fragile families with no escape from mental health issues or domestic violence.
Some describe the devastating sudden loss of parents or grandparents, followed by online or physically distanced funerals.
Grief for family members lost during the pandemic is an experience shared with some of Sam’s college classmates.
Student Ella told the BBC that losing her granddad during Covid had made her value spending more time with her grandma.
It is one of the ways in which Ella says she is trying to “catch up on the lost moments” she missed during Covid.
Living life online
One almost universal experience for children living through the pandemic was much of life shifting to online platforms.
While this allowed family connections and friendships to be maintained, Ms Eisenstein said some children had darker experiences, spending up to 19 hours a day online, leaving them “really anxious”.
“Some told us how they started comparing their body image to people online, how video games and social media distracted from their learning,” she said.
Most worrying, she said, were the accounts revealing an increased risk of adults seeking to exploit young children online, including sending nude images and inappropriate messages.
The remarkable variety of experiences, both positive and stressful, adds up to what she describes as “an unprecedented insight into children’s inner world”.
Aaliyah, a student at Winstanley College near Wigan, says the social isolation she experienced aged 11 led to her spending hours looking at social media, which began eroding her self-confidence.
“With the content I was seeing online, I’d start to look in the mirror and go, ‘I could change that about myself,’ or ‘I don’t really like that about myself,'” she says.
Lasting effects

The inquiry is also expected to hear about the experiences of children still living with long Covid, like Avalyn, now 16, who became ill with the virus in October 2021.
While schools were beginning to return to normal, Avalyn was struggling with a deep and debilitating fatigue, and eventually left school for home education.
It took a year to get a formal diagnosis of long Covid and specialist advice.
“I enjoyed being in school, I enjoyed being social and seeing people, and then suddenly that was taken away from me very quickly,” Avalyn says.
Before long Covid, Avalyn says she was sporty at primary school and enjoyed acrobatics.
Like lots of other children her age, Avalyn has shown determination and resilience to achieve the things that might not have been so difficult in other circumstances, and she has now passed four GCSEs.
“I knew I wanted to do GCSEs to prove to myself especially that I still had the ability to do what everyone else was doing,” she says.
She still goes to a performing arts group, which allows her to join in as much or as little as she can manage.
Avalyn admits “it’s weird to say”, but in some ways she is “grateful” to have had long Covid, because of the things she has achieved during her long spells at home.
She has written, illustrated and self-published two children’s books and spent more time on her art.
While the path ahead is not straightforward, she says she is optimistic of finding a way to study and get into work.
The inquiry plans to hear evidence on the pandemic’s impact on children and young people over four weeks, from 29 September to 23 October.