Education
Why I Believe We Need to Redesign Schools Around Decision-Making
School shouldn’t just be a place to learn academic skills, but a place for students to practice making meaningful decisions about their learning and lives. I personally never faced a weighty decision about my learning until I had to declare a college major. In grade school, I was a confident student who knew how to ace tests and please my teachers. Once I got to college, however, my A-student record failed me. I had no idea what major I was passionate about, nor any of the steps to figure it out. I considered majoring in English since I loved reading, or maybe pre-med for financial stability. My practical immigrant parents talked me out of the first, and a terrible grade on a chemistry midterm out of the second.
It seems like luck that I eventually found my way to a lifelong profession as a K-8 educator, one that has kept me eager to keep growing, unlike the spin-the-wheel decision-making of my college years. But it didn’t have to be this way. What if grade school were designed to teach students how to make decisions and know themselves deeply as much as it taught them math and literacy? What if school had helped me figure out early, often and intentionally what I wanted to learn or accomplish, and how I would do it?
My experiences as a student and later as a teacher in traditional K-8 schools convinced me that the entire purpose of school needed to be different. So, in 2019, when I found out that a former manager of mine was starting a school that answered the same questions that plagued me, I knew I wanted in. She brought me on board to help launch the school, and Red Bridge, a private, K-8 school, opened its doors in September 2020.
As a founding school leader, I’ve helped design systems and a student-initiated promotion process that gives students a voice in their education. While students don’t make every decision and still participate in teacher-driven parts of the day, what’s different in our design is that the school curriculum pushes them to explore three questions: “What do I want to learn?”; “When and how will I learn it?”; and “Is my learning the right level of challenge?” By asking these questions, we instill the importance of decision-making skills in students and a sense of responsibility for their learning that traditional school models otherwise lack.
What Do You Want to Learn?
Asking students what they want to learn shows them that their questions about the world are valuable, and hopefully gets them fired up to learn.
When I taught fifth grade at a school in Nevada, I had to follow the curriculum in the provided textbooks, and there was no room to deviate. One time, I planned a novel study around a book my students selected, but I was forced by an administrator to trade it in for standardized test prep. In contrast, at Red Bridge, we spend two weeks of each term immersed in a “deep dive”: a project-based learning unit designed around a question of students’ interest instead of regular instruction.
Two years ago, as we approached our last deep dive of the year, my team noticed students launching entrepreneurial endeavors during recess. Some were crafting bracelets and setting up bartering systems with them, and others expressed an interest in bake sales. To harness that curiosity, we designed the deep dive around the question, “How do you build a small business?” In week one, we created lessons for students on everything from organizational structure and ethical decision-making to budgeting; we then took students to visit local businesses to interview the owners. In week two, students collaborated with peers to pitch their own small business ideas; once their pitches were approved, they wrote business plans.
Walking around the culminating marketplace experience, I could see students brimming with pride as they presented their inventory, budgets and logos. We had taken their interests seriously and made room in the school experience to study a topic of their choice. The results were joyful, a little messy, but entirely theirs. If I’d had experiences in grade school that supported me in pursuing topics of my own interest, I would have known how to navigate the sudden responsibility I had over my learning when I got to college.
When and How Will You Learn It?
As a classroom teacher in traditional schools that focused heavily on compliance, I frequently wondered if my students could succeed in the future without me telling them what to do constantly. At one school, I was trained to have all 33 of my students place their pencils on their desks in the same spot at the same time, drill sergeant-style. I couldn’t foster ownership if the system itself required passivity, and I was convinced there had to be another way.
When it came time to design Red Bridge, our founder told me we would balance teacher-led time with student-led time by implementing a self-directed learning block. We designed the block so that, for an hour each day, students make their own learning plans, keep track of time, mark what they accomplish and transition between activities with relative independence. Our teachers explicitly teach students how to make time- and goal-management decisions during daily morning meetings.
A few years ago, three second graders at my school approached me after school, excited to show me their plan to launch an environmental club. The paper had a list of tasks: make signs, start a protest, pick up trash and write a book about the environment. They labeled each task “done,” “in progress,” or “not yet” — similar to the type of learning plan they made in self-directed time during the day. These young students took what they learned about setting goals, worked toward them and applied that sense of ownership to their personal lives. Their initiative gave me confidence they could navigate future goals, and that our school’s design was actually working.
Is Your Learning the Right Level of Challenge?
Perhaps the most powerful decision-making opportunity we’ve created at my school is a space for students to assess whether their learning is appropriately challenging and if they’re ready for the next step. Students’ primary cohorts are determined by their level of independence and self-directedness. When a student believes they are ready to move up, they complete a series of tasks and gather evidence of their readiness for greater responsibility.
Repeatedly, I’ve seen previously unmotivated students rise to the challenge. A parent once shared with me, “I was so worried the first few times about how disappointed he’d be if he failed. But when he finally succeeded, his pride in accomplishing something himself was amazing.”
Recently, a teacher reflected on a student who went through the process successfully and said, “Her whole attitude changed when she realized that her goals were in her own hands. She just started showing up differently for her learning.” Tackling this big decision lets students experience success and failure in a safe environment and develops self-reliant individuals who can handle any obstacle — whether it be academic, professional or personal — that comes their way in the future.
Building Student Confidence in Their Lives
Being a founding leader of this school has given me the opportunity to build the school of my dreams. These moments of student growth, fueled by ownership over their learning, are the reason I believe this kind of educational design matters for students of all backgrounds. School shouldn’t be a place where students listen passively to adults for the majority of their days. Schools should be designed to give students meaningful opportunities to make big decisions — that is how we set kids up for lifelong success. By building the what, the how and the right level of challenge into our school’s design, we give students space to determine the pathway of their education.
I hope the students I once taught don’t have to stumble into their passions like I did, and I sincerely hope school helps them know themselves sooner and trust themselves more.
OpenAI, Microsoft, and Anthropic pledge $23 million to help train American teachers on AI
Teachers are pulling up a chair to implement AI in the classroom.
The American Federation of Teachers (AFT) announced on Tuesday that it will open a training center in New York City devoted to teaching educators how to responsibly use AI systems in their work.
Also: Can AI save teachers from a crushing workload? There’s new evidence it might
Dubbed the National Academy for AI Instruction, the training center will open this fall and kick off with a series of workshops on practical uses of AI for K-12 teachers. Representing close to two million members, the AFT is the second-largest teachers’ union in the United States. The effort is being launched in partnership with OpenAI, Microsoft, and Anthropic, which have pledged a cumulative $23 million for the hub.
“Now is the time to ensure AI empowers educators, students, and schools,” OpenAI wrote in a company blog post published Tuesday, announcing its plan to invest $10 million in the Academy over the next five years. “For this to happen, teachers must lead the conversation around how to best harness its potential.”
Backlash and acceptance
The rise of generative AI chatbots like ChatGPT in recent years has sparked widespread concern among educators. These systems can write essays and responses to homework questions in seconds, suddenly making it difficult to determine if assignments have been completed by hand or by machine.
Also: The best free AI courses and certificates in 2025 – and I’ve tried many
At the same time, however, many teachers have actively embraced the technology: a recent Gallup poll found that six in ten teachers used AI at work in the most recent school year, helping them save time on tasks like preparing lesson plans and providing feedback on student assignments. To make educators feel more comfortable about using AI, companies including Anthropic and OpenAI have launched education-specific versions of their chatbots: Claude for Education and ChatGPT Edu, respectively.
Like many other industries that have suddenly had to contend with the ubiquity of powerful AI systems, the US education system has struggled to achieve a healthy balance with the technology. Some school systems, like New York City’s public schools, initially opted to ban employees and students from using ChatGPT.
But over time, it has become clear that AI isn’t going away, and that there is no long-term benefit in ignoring it. The NYC public school system later changed its no-ChatGPT policy, and some universities, like Duke University and the schools belonging to the California State University system, have begun providing premium ChatGPT services for free to students. Similarly, the Miami-Dade Public School system started deploying Google’s Gemini chatbot to 100,000 of its high school students earlier this year.
Also: Claude might be my new favorite AI tool for Android – here’s why
Like those university initiatives, the new partnership with the AFT will also benefit the AI companies sponsoring the effort, as it will place their technology into the hands of many thousands of new users.
President Trump issued an executive order in April focused on equipping students and teachers with AI literacy skills, a signal that efforts like this one with the AFT are in line with the administration’s forthcoming AI Action Plan, set to be released later this month.
Impact on critical thinking
Apologists for AI in the classroom will sometimes compare it to previous technologies, such as digital calculators or the internet, which felt disruptive at the time of their debut but have since become foundational to modern education.
Also: Heavy AI use at work has a surprising relationship to burnout, new study finds
A new body of research, however, is starting to show that using AI tools can inhibit critical thinking skills in human users. The technology’s long-term impacts on human cognition and education, therefore, could be far more pronounced than we can know today.
A recent study conducted by researchers from Carnegie Mellon University and Microsoft, for example, found that “while GenAI can improve worker efficiency, it can inhibit critical engagement with work and can potentially lead to long-term overreliance on the tool and diminished skill for independent problem-solving. Higher confidence in GenAI’s ability to perform a task is related to less critical thinking effort.”
An MIT Media Lab study yielded similar findings: that using AI “undeniably reduced the friction involved in answering participants’ questions,” but that “this convenience came at a cognitive cost, diminishing users’ inclination to critically evaluate the LLM’s output or ‘opinions’ (probabilistic answers based on the training datasets).”
Finding benefits while avoiding risks
The National Academy for AI Instruction aims to chart a path forward for educators in the age of AI, one that embraces the technology’s benefits while steering clear of the potential risks that are very much still coming into focus.
Also: Samsung just answered everyone’s biggest question about its AI strategy
“The direct connection between a teacher and their kids can never be replaced by new technologies,” Randi Weingarten, president of the AFT, said in a statement included in OpenAI’s blog post, “but if we learn how to harness it, set commonsense guardrails and put teachers in the driver’s seat, teaching and learning can be enhanced.”
Pearson’s AI-driven edtech
A Pearson veteran of 25 years, Sharon Hague has an extensive background in education, including eight years of secondary school teaching. From the coalface of the classroom to her strategic leadership role at Pearson, Hague has been guided by one principle: harnessing the transformational power of education.
In this vein, she believes that AI is a once-in-a-generation opportunity to transform education at all ages and stages. According to Hague, AI’s power lies in its potential to “amplify the teacher, not to replace the teacher, but to reduce the administrative burden, manual data collection, and really support the teacher in concentrating on what they do best: interacting with young people, supporting, motivating and helping them with their learning.”
The Covid-19 pandemic was an inflection point—borne of necessity—for the global edtech market when teaching quickly pivoted online. Global edtech revenue in 2020 increased by some 23% to $158bn. And though the exceptionally strong growth rate did not hold, GlobalData expects the industry to grow at a compound annual growth rate of 9.8% between 2022 and 2030, reaching $535bn in 2030.
Pearson is looking towards the next big technology-driven market shift by developing a suite of AI-enabled products. The multinational corporation, headquartered in the UK, was founded in 1856 and has undergone many iterations, which included national and international subsidiaries in manufacturing, electricity, oil, coal, banking and financial services, publishing (periodicals and books), and aviation. In 1998, Pearson plc formed Pearson Education, and by 2016, education publishing and services had become the company’s exclusive focus.
AI will unlock personalised education
Hague is also president of Pearson’s English Language Learning (ELL) division which provides English language assessments and learning materials for English learners globally. The ELL business has launched a smart lesson generator which enables teachers to identify lesson priorities and create lesson content and activities.
Aside from reducing the administrative burden for teaching staff, Hague notes that AI will unlock personalised learning in a way that has hitherto been unavailable without the physical presence of a teacher.
She is excited about AI addressing individual student needs, particularly when access arrangements and accessibility issues prove to be a barrier: “It will enable children with different types of learning needs to be able to access more learning” and “to create a better experience”.
Pearson is also developing an AI-driven GCSE revision tool which will allow students to receive feedback and suggestions outside the classroom in the absence of a teacher. “The child will write a response, and then the app gives feedback on what they’ve covered, as well as things to think about, and links to further practice,” explains Hague.
The tool is being developed in collaboration with teachers and, though still in development, the feedback is reportedly positive. The company is developing a similar product for use in higher education to deliver instantaneous feedback or suggestions when a tutor is unavailable.
Though adoption is not guaranteed, Pearson’s own research, published in December 2024, found that some 58% of US higher education students say they are using generative AI for academic purposes, up eight percentage points from spring 2024.
Can AI replace teachers?
AI hallucinations and mishaps make striking headlines and carry reputational risk for any business launching AI tools. So, what of accuracy? Pearson’s background in pedagogy and learning science, combined with its high-quality, trusted content, means the company is well placed to deliver products in an industry that demands a high level of accuracy, says Hague. “I think we are drawing on content that we know is high quality, is proprietary content, so anyone using the tool can be assured it’s accurate,” she says.
When pressed on the potential for hallucinations in feedback offered to students, Hague reiterates that the tools in development are not being designed to replace teachers, but only to support existing teaching processes. “We’re not letting it [LLM] go out into the wild internet and just put anything in. With both our revision tool, and our smart lesson generator, we thought really carefully about how it’s designed, because we’re conscious that, particularly when young people are preparing for GCSE where there are certain requirements, that it’s not just pulling from anywhere.”
Looking forward, Hague expects the technology component of education to increase, with use cases ranging from teaching support to a student’s experience of examinations. “At the moment, if you take a colour coded paper-based examination, you could do that much more seamlessly on a screen. The child could actually personalise how they’re viewing the exam paper, so that it would meet their needs,” says Hague.
In the same way, speech recognition could enable many different tools for children with different accessibility requirements. “There are lots of opportunities within school-based education and higher education,” says Hague.
Will AI erode students’ critical thinking?
As AI infiltrates traditional learning processes, concerns are growing about the erosion of the human capacity for critical thinking. Pearson analysed over 128,000 student queries to its AI study tools in Campbell Biology, the most popular title used in introductory biology courses.
The company categorised student inputs using Bloom’s Taxonomy, a framework used to understand the cognitive complexity of students’ questions and found that one-third of student queries were at higher levels of cognitive complexity, with 20% of student inputs reflecting the critical thinking skills essential for deeper learning.
Edtech for the enterprise
Pearson is also working with enterprise clients to upskill and train staff in how to leverage AI. “There’s an increasing skills gap that we’re all aware of,” notes Hague.
Pearson’s Fathom tool analyses automation opportunities within the enterprise. The tool can be used for re-skilling purposes, for talent planning and assessing these skills gaps. This is particularly relevant to the UK market, says Hague, as UK companies spend 50% less on continuous training than their European counterparts.
“There are some really great opportunities around continuously reviewing people’s work whilst they’re working, giving them feedback and addressing skills gaps to improve their performance as they work. And technology enables you to do that, rather than more traditional routes, where you might have training for a couple of days,” says Hague.
Ongoing credentialing is another AI application for the enterprise market. Pearson’s certification system, Credly, provides digital badges that recognise workplace achievement and continuous learning. Certifications earned as employees learn in the flow of work are credentials they can take with them through their careers.
Hague’s advice to businesses looking to implement AI tools is to analyse where the opportunities for automation lie, and to develop skills and re-skill within the workforce for future needs. “Hiring is not going to be the way to solve everything; you’ve really got to focus on training and re-skilling at the same time,” she says. And as the edtech landscape becomes increasingly AI-driven, the need for companies to address their skill requirements only grows more urgent.
Anthropic Continues the Push for AI in Education
Let’s be honest: AI has already taken a seat in the classroom. Google, Microsoft, OpenAI, and Anthropic have all been pushing hard. Today brings more announcements from Anthropic, the company behind the AI chatbot Claude, adding even more momentum. The shift isn’t subtle anymore. It’s fast, it’s loud and it’s happening whether schools are ready or not.
It’s not only big tech. The U.S. government is also driving efforts to integrate AI into education.
The Balance of Innovation and Safety
There’s real concern, and for good reason. Sure, the benefits are hard to ignore. AI tutoring, lighter workloads for teachers, more personalized learning paths for students. It all sounds great. But there’s a flip side. Missteps here could make existing education gaps worse. And once the damage is done, it’s tough to undo.
Many policymakers are stepping in early. They’re drafting ethical guardrails, pushing for equitable access, and starting to fund research into what responsible use of AI in education really looks like. Not as a PR move, but because the stakes are very real.
Meanwhile, the tech companies are sprinting. Google is handing out AI tools for schools at no cost, clearly aiming for reach. The strategy is simple: remove barriers and get in early. Just yesterday Microsoft, OpenAI, and Anthropic teamed up to build a national AI academy for teachers. An acknowledgment that it’s not the tools, but the people using them, that determine success. Teachers aren’t optional in this equation. They’re central.
Claude’s New Education Efforts
Claude for Education’s recent moves highlight what effective integration could look like. Its Canvas integration means students don’t need to log into another platform or juggle windows; Claude just works inside what they’re already using. That kind of invisible tech could be the kind that sticks.
Then there’s the Panopto partnership. Students can now access lecture transcripts directly in their Claude conversations. Ask a question about a concept from class and Claude can pull the relevant sections right away. No need to rewatch an entire lecture or scrub through timestamps. It’s like giving every student their own research assistant.
And they’ve gone further. Through Wiley, Claude can now pull from a massive library of peer-reviewed academic sources. That’s huge. AI tools are often criticized for producing shaky or misleading information. But with access to vetted, high-quality content, Claude’s answers become more trustworthy. In a world overflowing with misinformation, that matters more than ever.
Josh Jarrett, senior vice president of AI growth at Wiley, emphasized this: “The future of research depends on keeping high-quality, peer-reviewed content central to AI-powered discovery. This partnership sets the standard for integrating trusted scientific content with AI platforms.”
Claude for Education is building a grassroots movement on campuses, too. Its student ambassador program is growing fast, and new Claude Builder Clubs are popping up at universities around the world. Rather than coding bootcamps or formal classes, they’re open spaces where students explore what they can actually make with AI: workshops, demo nights and group builds.
These clubs are for everyone. Not just computer science majors. Claude’s tools are accessible enough that students in any field, from philosophy to marketing, can start building. That kind of openness helps make AI feel less like elite tech and more like something anyone can use creatively.
Privacy is a big theme here, too. Claude seems to be doing things right: conversations are private, they’re not used for model training, and any data-sharing with schools requires formal approvals. Students need to feel safe using AI tools. Without that trust, none of this works long term.
At the University of San Francisco School of Law, students are working with Claude to analyze legal arguments, map evidence and prep for trial scenarios. This is critical training for the jobs they’ll have after graduation. In the UK, Northumbria University is also leaning in. Its focus is on equity, digital access and preparing students for a workplace that’s already being shaped by AI.
Graham Wynn, vice-chancellor for education at Northumbria University, puts the ethical side of AI front and center: “The availability of secure and ethical AI tools is a significant consideration for our applicants, and our investment in Claude for Education will position Northumbria as a forward-thinking leader in ethical AI innovation.”
They see tools like Claude not just as educational add-ons, but as part of a broader strategy to drive social mobility and reduce digital poverty. If you’re serious about AI in education, that’s the level of thinking it takes.
Avoiding Complexity and Closing Gaps
The core truth here is simple: AI’s role in education is growing whether we plan for it or not. The technology is getting more capable. The infrastructure is being built. But what still needs to grow is a culture of responsible use. The challenge for education isn’t chasing an even smarter tool, but ensuring the tools we have serve all students equally.
That means listening to educators. It means designing for inclusion from the ground up. It means making sure AI becomes something that empowers students, not just another layer of complexity.
The next few years will shape everything. If we get this right, AI could help close long-standing gaps in education. If we don’t, we risk deepening them in ways we’ll regret later.
This is more than a tech story. It’s a human one. And the decisions being made today will echo for a long time.