Education
Ready your teaching for the AI era with this six-part framework
I see AI framed as a marvel or a menace. What’s missing is a mirror. If AI can complete an assignment as well as a student, you have to ask: was that assignment really meaningful? Sadly, many of our lessons – passive lectures, rote memorisation and formulaic writing – were weak long before AI arrived.
I was mid-lecture on the “future of AI” when ChatGPT launched. I remember it clearly. My students, giddy. Me, queasy. “Can I use this to write my essay?” “Does this mean we still have finals?”
AI didn’t creep into education; it thrashed in, like the “gradually then suddenly” tipping point Malcolm Gladwell warned about. If you’re reading this, you’re now in one of two tribes: those who have only ever taught students to work with humans, and those who never will again. As Salesforce CEO Marc Benioff said, “Every CEO going forward is going to manage humans and [AI] agents together.”
Too many of us either resist or surrender to AI, but neither serves our students or schools.
AI is already reshaping our economy: 89 per cent of CEOs rank AI as key to future profitability, yet nearly half (47 per cent) say employees lack the skills they need. They had better learn quickly. Shopify’s CEO recently announced that teams must prove a job can’t be done by AI before asking for more headcount.
The World Economic Forum says creativity and problem solving are essential for future jobs. Salesforce’s head of AI describes this as “agency” – the initiative to see and solve real problems. AI can do many things, but we must creatively direct it: sage advice for ourselves and our students.
This is why I designed the THRIVE framework: to offer educators a guiding path that recognises both AI’s potential and the irreplaceable value of your teaching. It’s a framework that keeps you, not algorithms, setting the pace.
THRIVE with AI: a teacher-centred framework for adopting GenAI
There’s a moment – maybe you’ve had it too – when I’m staring at my stale lesson plans and the clock insists there’s no time to revise. Students are coming in. And somewhere between my fifth tab and third coffee, I wonder, “Is this what teaching has become?”
For me, THRIVE began not as a framework but with a feeling – one I know many of you share. We’re all striving to spend less time on tasks that deplete us, and more on the moments that matter: the flash of insight in a student’s eyes; the conversation that shifts a view; the creativity no rubric could capture.
Despite the headlines, AI doesn’t need to replace those moments. It might help us reclaim them. These tools we fear could bring us back to what we love most: guiding, connecting and inspiring.
THRIVE can be a checklist, but it’s more of a mindset – an invitation to design your relationship with AI with the same care and intentionality you bring to your classroom. Each letter is a guidepost to help you navigate AI without losing your bearings or your purpose.
T – Transformative engagement
Imagine a class where students don’t sit absorbing information – they question it, reshape it, argue with it. With AI as a partner, engagement becomes less about consumption and more about creation.
I watched a teacher use AI to simulate a debate between Benjamin Franklin and King George III on freedom, governance and liberty. Students didn’t just watch; they participated, challenged biases and left with sharper minds. That’s not automation. That’s liberation.
Transformative engagement is about using AI to make learning more meaningful for students and for you. That means using AI to personalise content, support different learning styles or give students more agency in how they learn.
H – High productivity
I think often about what it means to be “productive” in education. The papers graded? The hours on slides? Or is it the quality of our presence with students, with ideas, with ourselves? When we let AI help our prep, we aren’t abdicating responsibility. We’re reclaiming time for feedback, for thought, for the meaningful work of learning design.
With AI, I’ve revised lectures into a case format, saving hours. The focus shifts from formatting slides to facilitating discussion; from content delivery to engagement. High productivity means using AI to clear the clutter so you can focus on what really matters. By improving organisation and lesson planning, AI can help keep you on track and aligned with learning objectives.
R – Resilient adaptability
This pillar begins with humility. The kind that lets you say, “I don’t know this yet,” and keep going. AI changes fast – faster than classes or diplomas. But we’ve always been adapters. Think of how quickly you pivot when a lesson falls flat or a student walks in with a crisis. Resilience isn’t about mastery. It’s about curiosity tethered to care for yourself.
At my university, we’ve formed an AI Lunch and Learn where colleagues pilot tools, compare results and share dilemmas. A colleague shared, “We need to know more about what we can do with AI.” Agreed.
I – Imagination and creativity support
In my own MBA classes, students role-play as stakeholders in a space tourism firm, interacting with AI bots that simulate C-suite executives. They build strategies and AI-assisted prototypes, producing plans we’d never get from traditional lectures. Used well, AI becomes a cognitive amplifier.
Challenge your students, and yourself, to use generative AI to make something new. Reimagine it as a rival, a customer or a collaborator to spark innovation. In an age where creativity is currency, educators become facilitators of possibility.
V – Value through ethics
AI, like many things, reflects the values we embed in it. What data is it trained on? Who does it include, or exclude? Ethics in the AI age is not a policy document; it’s a practice – one that raises the question: “How do I use this power without losing my principles?”
This is an opportunity to co-design an AI use policy with students. Together, you can explore algorithmic bias, plagiarism and transparency. I know you want to protect what matters: student trust, equity, privacy and dignity. And I think they want that too.
E – Efficient optimisation
There is a grace in simplicity. AI can help you spot which students are slipping, where confusion hides in the data or how to restructure lessons for better flow. But optimisation isn’t about shaving minutes off the clock. It’s about clarity and alignment – letting the noise fall away so the learning signal comes through.
A provost uses an AI dashboard to spot dropout risks and intervene before a student slips away. They bring us into the loop, co-designing responses informed by what the data means and how to act on it. Brilliant.
What kind of teaching thrives with AI?
Bill Gates predicts that AI in education will help close the gap left by teacher shortages. A London secondary school has started using AI tools to help students prepare for exams. So, will AI replace teachers?
The better question is: what kind of teaching will flourish in a world of AI?
My hope is that THRIVE helps you find your answer, not by offering easy solutions, but by guiding you toward the ones that align with your values, your goals, your joy. Because if you approach AI with intention, it might just give you more time to be the educator you always wanted to be – the one you see in the mirror.
Patrick Lynch is AI faculty lead at Hult International Business School.
The Guardian view on special needs reform: children’s needs must be the priority as the system is redesigned | Editorial
Children with special educational needs and disabilities (Send) must be supported through the education system to fulfil their potential as fully as possible. This is the bottom line for the families of the 1.6 million children with a recognised additional learning need in England, and all those who support them. It needs to be the government’s priority too.
There is no question that the rising number of children receiving extra help has placed pressure on schools and councils. There is wide agreement that the current trajectory is not sustainable. But if plans for reform are shaped around the aim of saving money by removing entitlements, rather than meeting the needs of children by improving schools, they should be expected to fail.
If ministers did not already know this, the Save Our Children’s Rights campaign launched this week ought to help. As it stands, there is no policy of restricting access to the education, health and care plans (EHCPs) that impose a legal duty on councils to provide specified support. But ministers’ criticisms of the adversarial aspects of the current system have led families to conclude that they should prepare for an attempt to remove their enforceable rights. Christine Lenehan, who advises the government, has indicated that the scope of EHCPs could be narrowed, while stressing a commitment to consultation. Tom Rees, who chairs the department for education’s specialist group, bluntly terms it “a bad system”.
Mr Rees’s panel has had its term extended until April. The education select committee will present the conclusions of its inquiry into the Send crisis in the autumn. Both should be listened to carefully. But the education secretary, Bridget Phillipson, and her team also need to show that they are capable of engaging beyond the circle of appointed experts and parliamentarians. Parents can make their views known through constituency MPs. Their voices and perspectives need to be heard in Whitehall too.
This is a hugely sensitive policy area. There is nothing parents care more about than the opportunities provided to their children, and this concern is intensified when those children have additional needs. Some positive steps have been taken during Labour’s first year. Increased capital spending on school buildings should make a difference to in-house provision, which relies on the availability of suitable spaces. Ministers are right, too, to focus on teacher training, while inclusion has been given greater prominence in the inspection framework. As with the NHS, there is a welcome emphasis on spreading best practice.
But big questions remain. Families are fearful that accountability mechanisms are going to be removed, and want to know how the new “inclusive mainstream” will be defined and judged. Councils are concerned about what happens to their £5bn in special needs budget deficits, when the so-called statutory override expires in 2028. The concerning role of private equity in special education – which mirrors changes in the children’s social care market – also needs addressing.
Schools need to adapt so that a greater range of pupils can be accommodated. The issue is how the government manages that process. The hope must be that the lesson ministers take from their failure on welfare is that consultation on highly sensitive changes, affecting millions of lives, must be thorough. In order to make change, they must build consensus.
How AI is Transforming Education in Africa
Artificial Intelligence (AI) is reshaping industries across the globe, and education in Africa is no exception. From personalized learning platforms to AI-driven teacher training, the continent is witnessing a surge in innovative solutions tackling longstanding challenges. In this Q&A Insights piece, we dive into how AI is revolutionizing education, addressing questions from our iAfrica community about its impact, opportunities, and hurdles.
What are the biggest challenges in African education that AI can address?
Africa’s education sector faces issues like limited access to quality resources, teacher shortages, and diverse linguistic needs. AI can bridge these gaps in practical ways. For instance, AI-powered platforms like Eneza Education provide mobile-based learning in local languages, reaching students in remote areas with affordable, interactive content. Adaptive learning systems analyze student performance to tailor lessons, ensuring kids in overcrowded classrooms get personalized attention. AI also supports teacher training through virtual simulations, helping educators refine skills without costly in-person workshops.
“AI can democratize education by making high-quality resources accessible to students in rural areas.” – Dr. Aisha Mwinyi, EdTech Researcher
How is AI being used to improve access to education?
Access is a critical issue, with millions of African children out of school due to distance, poverty, or conflict. AI is stepping in with scalable solutions. Chatbots and virtual tutors, like those developed by Ustad Mobile, deliver bite-sized lessons via SMS or WhatsApp, working on basic phones for low-income communities. In Nigeria, uLesson uses AI to stream offline-capable video lessons, bypassing unreliable internet. These tools ensure learning continues in areas with limited infrastructure, from refugee camps to rural villages.
Can AI help with language barriers in education?
Absolutely. Africa’s linguistic diversity—over 2,000 languages—creates unique challenges. AI-driven translation tools, such as those integrated into Kolibri by Learning Equality, adapt content into local languages like Swahili, Yoruba, or Amharic. Speech-to-text and text-to-speech systems also help non-literate learners engage with digital materials. These innovations make education inclusive, especially for marginalized groups who speak minority languages.
What are some standout African AI education startups?
The continent is buzzing with homegrown talent. M-Shule in Kenya uses AI to deliver personalized SMS-based learning, focusing on primary school students. Chalkboard Education, operating in Ghana and Côte d’Ivoire, offers offline e-learning platforms for universities, using AI to track progress. South Africa’s Siyavula combines AI with open-source textbooks to provide math and science practice, serving millions of learners. These startups show Africa isn’t just adopting AI—it’s innovating with it.
What concerns exist about AI in education?
While the potential is huge, concerns linger. Data privacy is a big one—students’ personal information must be protected, especially in regions with weak regulations. There’s also the risk of over-reliance on tech, which could sideline human teachers. Affordability is another hurdle; AI solutions must be low-cost to scale. Experts emphasize the need for ethical AI frameworks, like those being developed by AI4D Africa, to ensure tools are culturally relevant and equitable.
“We must balance AI’s efficiency with the human touch that makes education transformative.” – Prof. Kwame Osei, Education Policy Expert
How can policymakers support AI in education?
Policymakers play a pivotal role. Investing in digital infrastructure—think affordable internet and device subsidies—is crucial. Governments should also fund local AI research, as seen in Rwanda’s Digital Skills Program, which trains youth to build EdTech solutions. Public-private partnerships can scale pilots, while clear regulations on data use build trust. Our community suggests tax incentives for EdTech startups to spur innovation.
What’s next for AI in African education?
The future is bright but demands action. AI could power virtual reality classrooms, making immersive learning accessible in underfunded schools. Predictive analytics might identify at-risk students early, reducing dropout rates. But scaling these requires collaboration—between governments, startups, and communities. As iAfrica’s Q&A Forum shows, Africans are eager to shape this future, asking sharp questions and sharing bold ideas.
Got more questions about AI in education? Drop them in our Q&A Forum and join the conversation shaping Africa’s tech-driven future.
How Ivy League Schools Are Navigating AI In The Classroom
The widespread adoption and rapid advancement of Artificial Intelligence (AI) has had far-reaching consequences for education, from student writing and learning outcomes to the college admissions process. While AI can be a helpful tool for students in and outside of the classroom, it can also stunt students’ learning, autonomy, and critical thinking, as secondary and higher education institutions grapple with the promises and pitfalls of generative AI as a pedagogical tool. Given the polarizing nature of AI in higher education, university policies for engaging with AI vary widely both across and within institutions. There are, however, some key consistencies across schools that can be informative for students as they prepare for college academics, as well as for the parents and teachers trying to equip high school students for collegiate study amid this new technological frontier.
Here are some defining elements of Ivy League schools’ approach to AI in education—and what they mean for students developing technological literacy:
1. Emphasis on Instructor and Course Autonomy
First and foremost, it is important to note that no Ivy League school has issued blanket rules on AI use—instead, like many other colleges and secondary schools, Ivy League AI policies emphasize the autonomy of individual instructors in setting policies for their courses. Princeton University’s policy states: “In some current Princeton courses, permitted use of AI must be disclosed with a description of how and why AI was used. Students might also be required to keep any recorded engagement they had with the AI tool (such as chat logs). When in doubt, students should confirm with an instructor whether AI is permitted and how to disclose its use.” Dartmouth likewise notes: “Instructors, programs, and schools may have a variety of reasons for allowing or disallowing Artificial Intelligence tools within a course, or course assignment(s), depending on intended learning outcomes. As such, instructors have authority to determine whether AI tools may be used in their course.”
With this in mind, high school students should be keenly aware that a particular teacher’s AI policies should not be viewed as indicative of all teachers’ attitudes or policies. While students may be permitted to use AI in brainstorming or editing papers at their high school, they should be careful not to grow reliant on these tools in their writing, as their college instructors may prohibit the technology in any capacity. Further, students should note that different disciplines may be more or less tolerant of AI—for instance, a prospective STEM student might have more latitude to use the technology than a student who hopes to study English. Because of this, the former should devote more of their time to understanding the technology and researching its uses in their field, whereas the latter should likely avoid employing AI in their work in any capacity, as collegiate policies will likely prohibit its use.
2. View of AI Misuse as Plagiarism / Academic Dishonesty
Just as important as learning to use generative AI in permissible and beneficial ways is learning how generative AI functions. Many Ivy League schools, including UPenn and Columbia, clearly state that AI misuse (whatever that may be in the context of a particular class or project) constitutes academic dishonesty and will be subject to discipline as such. The more students understand the processes carried out by large language models, the better equipped they will be to make critical decisions about where AI use is appropriate, when they need to provide citations, how to spot hallucinations, and how to prompt the technology to cite its sources. Even where AI use is permitted, it is never a substitute for critical thinking, and students should be careful to evaluate all information independently and be transparent about their AI use when permitted.
Parents and teachers can help students in this regard by viewing the technology as a pedagogical tool; they should not only create appropriate boundaries for AI use, but also empower students with the knowledge of how AI works so that they do not view the technology as a magic content generator or unbiased problem-solver.
Relatedly, prestigious universities also emphasize privacy and ethics concerns related to AI usage in and outside of the classroom. UPenn, for instance, notes: “Members of the Penn community should adhere to established principles of respect for intellectual property, particularly copyrights when considering the creation of new data sets for training AI models. Avoid uploading confidential and/or proprietary information to AI platforms prior to seeking patent or copyright protection, as doing so could jeopardize IP rights.” Just as students should take a critical approach to evaluating AI sources, they should also be aware of potential copyright infringement and ethical violations related to generative AI use.
3. Openness to Change and Development in Response to New Technologies
Finally, this is an area of technology that is rapidly developing and changing—which means that colleges’ policies are changing too. Faculty at Ivy League and other top schools are encouraged to revisit their course policies regularly, experiment with new pedagogical methods, and guide students through the process of using AI in responsible, reflective ways. As Columbia’s AI policy notes, “Based on our collective experience with Generative AI use at the University, we anticipate that this guidance will evolve and be updated regularly.”
Just as students should not expect AI policies to be the same across classes or instructors, they should not expect these policies to remain fixed from year to year. The more that students can develop as independent and autonomous thinkers who use AI tools critically, the more they will be able to adapt to these changing policies and avoid the negative repercussions that come from AI policy violations.
Ultimately, students should approach AI with a curious, critical, and research-based mentality. It is essential that high school students looking forward to their collegiate careers remember that schools are looking for dynamic, independent thinkers—while the indiscriminate use of AI can hinder their ability to showcase those qualities, a critical and informed approach can distinguish them as knowledgeable citizens of our digital world.