Education
ChatGPT isn’t just for cheating. Teachers are using AI to save time.

Everyone in education, from K-12 teachers to university presidents, is well aware that AI is transforming the classroom. That presents all the challenges you’ve probably already heard of: students using ChatGPT to cheat, churning out papers and assignments without a second thought. But there’s also a more underreported development: teachers are deploying the technology to write lesson plans, make quizzes, and streamline administrative tasks, saving themselves hours of grunt work.
In the best-case scenario, AI promises to make teachers better at their jobs. And if AI ultimately becomes the transformative force optimists hope it will, it could help students get smarter too, serving as a tireless teaching aid and providing round-the-clock tutoring. That’s a big if, of course.
At the very least, the time-saving element for teachers is real, and it’s a big deal. A recent survey from Gallup and the Walton Family Foundation found that 6 in 10 teachers used AI for their work in the 2023-2024 school year. Those who used AI weekly — about a third of the teachers surveyed — estimated it saved them about 6 hours each week, which, in the best of circumstances, could mean 6 more hours of face time with students.
“This is not plugging students in front of computers, engaging with a chat bot,” Chris Agnew, director of the Generative AI in Education Hub at Stanford, said. “This is supporting teacher practice and then enabling this trained, experienced adult that’s in front of kids.”
Of course, giving teachers some time back doesn’t necessarily curb AI cheating. The savviest educators have clear guidelines for when AI can be used and when it can’t, as well as a good system in place for discussing the technology’s evolving role in school. After all, this is hardly the first time a new technology has swept into schools and upended old ways of doing things — educators used to worry about calculators in the classroom.
This also isn’t the first time a new technology has opened up a huge business opportunity for tech companies to reach young, inquiring minds and make a lot of money in the process. Google, for instance, now offers its Workspace for Education with Gemini built-in for up to $66 per teacher per month. In a school district of 500 teachers, that could easily add up to an extra $400,000 a year. For school districts that use a learning management system, like Canvas by Instructure, or an AI-powered tutor, like Khanmigo by Khan Academy, the cost of tech-centric education could keep growing.
“We went from the phase of, ‘Ban AI, it’s a cheating tool,’ to now, the majority of the market really is, ‘How do we leverage these tools in really productive ways?’” said Ryan Lufkin, vice president of global academic strategy at Instructure, whose Canvas software is used by half of North American college students and over a third of K-12 students.
What the classroom experience will look like in a decade, much further into the AI revolution, is anyone’s guess. In corporate America, companies are pouring billions of dollars into AI, hoping for transformative profits. So far, that’s not going great.
If you’re a parent, you might feel a bit powerless in this situation. Tech companies and school districts are making decisions that will impact your kid, who may or may not be using ChatGPT already to do their homework. But because we’re in the early days of this technology, now is the time to learn about how it works and what your school district is doing with it.
Beating the cheating problem
If you set aside the idea that large language models could reinvent the American education system — which is not great, by global standards — you might be curious about the ChatGPT cheating problem, especially if you’re a parent.
It’s hard to tell just how many students are cheating with robots. A Pew survey of teens found that 26 percent of middle and high school students were using ChatGPT — for both nefarious and less nefarious purposes — in 2024, a percentage that had doubled since 2023. Another 2024 study that tracked high school students’ cheating before and after ChatGPT’s release found no indications that the technology had “dramatically changed the prevalence of cheating.” Regardless, a New York magazine feature earlier this year declared that “ChatGPT has unraveled the entire academic project.”
Proposed solutions to the cheating problem, however serious it is, are kind of funny. As the use of ChatGPT has increased on college campuses, for instance, so have the sales of blue books, according to the Wall Street Journal. Students can’t use AI when they’re locked in a room with nothing but a pencil and paper, after all. Then there’s the call to bring back oral exams, including proposals to use video conferencing software to conduct hundreds of them at once. Researchers at the Georgia Institute of Technology even invented a platform for oral exams that, somewhat ironically, uses AI to grade the students. There are other creative workarounds, too, like requiring students to show their work by tracking changes in Google Docs or asking them to generate ChatGPT essays and then critique them.
Banning AI completely is increasingly unpopular. New York City Public Schools, the nation’s largest school district, banned ChatGPT not long after its release in 2022 and then lifted that ban a few months later. “The knee-jerk fear and risk overlooked the potential of generative AI to support students and teachers, as well as the reality that our students are participating in and will work in a world where understanding generative AI is crucial,” then-chancellor David Banks wrote in an op-ed. “While initial caution was justified, it has now evolved into an exploration and careful examination of this new technology’s power and risks.”
That exploration period seems to be ongoing for many K-12 schools. By the end of last year, the city’s comptroller, Brad Lander, called on the city’s Department of Education to pull nearly $2 million in funding for AI software, because it had not studied the efficacy of AI in the classroom.
The next edtech gold rush
Schools are nevertheless spending money on AI tools, whether they’re for teachers or for students. This represents just the latest raft of investment in education technology, or edtech. For the past four decades, putting computers into classrooms and screens in front of students has promised to transform learning. And for 40 years, it’s failed to fulfill that promise. Student performance has remained flat, while spending on edtech and training teachers how to use it has grown.
It’s unclear if AI can change this trend. Once you look beyond trying to stop students from using ChatGPT to cheat, you can get pretty creative with how AI might play a role in the classroom. You could imagine, for instance, that students will write fewer essays and might instead interact with a chatbot the way they’d talk to a human tutor. Khan Academy, a major edtech company, is piloting a chatbot it built with OpenAI called Khanmigo in 266 school districts nationwide. Khan Academy founder Sal Khan recently told Anderson Cooper that his dream is to give every student a private tutor. Khanmigo currently costs $4 per month per student.
What’s more promising in the near future is giving teachers access to AI that can lead to new classroom experiences. Aside from its private tutor powers, Khanmigo can help create lesson plans and then integrate the chatbot into them, according to Kristen DiCerbo, chief learning officer at Khan Academy. She explained a scenario to me in which several Khanmigo agents essentially worked like teacher’s assistants, checking in on groups of students during a lesson. “We think of it as like a force multiplier for the teacher, giving them just a little more reach in terms of what they can get done in the classroom,” DiCerbo said.
Aside from powering tools like this, OpenAI recently announced an education effort of its own in ChatGPT called study mode. This effectively turns ChatGPT into a tutor that replies to questions with more questions rather than answers. This is in addition to ChatGPT Edu, which launched last year and offers a version of ChatGPT built just for universities at a discount. Google is similarly marketing its Gemini Pro plan to students, who can currently get one year for free. Anthropic is selling a version of its Claude chatbot to universities, too. All of these education-specific products work a lot like the consumer versions but don’t train their models on student data.
That all sounds good in theory, and it all costs money. It goes without saying that schools with more resources will be able to take better advantage of these new AI tools, possibly improving teachers’ lives and student performance.
“Technology is not and never has been a silver bullet to address some of these more structural issues that exist in our education system,” said Robbie Torney, senior director of AI programs at Common Sense Media.
This is assuming that AI in education actually delivers the desired results, which would defy the decades-long trend in edtech. Despite initiatives that date back to the ’90s to give schools cheap and easy access to the internet, a quarter of the school districts in the US don’t even have broadband that’s fast enough to support some of these applications. It’s hard to have an education revolution when the page won’t load.
So, for a number of reasons, chatbots won’t be replacing teachers any time soon. More teachers may enlist AI to mix up their lesson plans, and students will inevitably try to find high-tech ways to get homework help. A chatbot that refuses to tell them answers might be their best hope.
A version of this story was also published in the User Friendly newsletter.
Education
US Education Department is all for using AI in classrooms: Key guidelines explained

Artificial intelligence (AI) has moved from being a futuristic concept to an active part of classrooms across the United States. From adaptive learning platforms to AI-powered lesson planning, schools are integrating technology to improve learning outcomes and ease teacher workloads. However, the challenge lies in adopting these tools without violating federal and state regulations.
Federal guidance: Innovation with safeguards
In July 2025, the US Department of Education issued guidance confirming that AI can be used in schools when aligned with federal laws. The framework focuses on three core principles: privacy, equity, and human oversight. AI tools must comply with the Family Educational Rights and Privacy Act (FERPA) to protect student data. Algorithms should be designed to prevent bias or discrimination under civil rights regulations. And human decision-making must remain central, ensuring that AI supports educators rather than replacing them. The Department also encouraged schools seeking federal grants to propose AI-driven projects, provided they meet these compliance standards.
State-level action: Rapid policy development
Since the federal guidance, more than half of US states have introduced their own AI frameworks for schools. Ohio now mandates that all districts adopt an AI-use policy by mid-2026, while Rhode Island has published detailed recommendations for responsible classroom integration. These local rules aim to enable innovation while safeguarding student interests. However, the pace of policy development and the diversity of approaches have created a complex regulatory environment for schools.
Mixed practices at the local level
Despite progress, many districts still operate in a gray area. Policies differ widely between schools, and families often face uncertainty about what is permissible. Some institutions allow AI on personal devices while banning it on school-owned systems. In certain cases, schools have reverted to traditional measures, such as requiring handwritten essays in class to prevent AI-assisted work. This variation highlights the need for consistent guidelines and clear communication with students and parents.
AI as a classroom resource
Educators are increasingly using AI as a tool for efficiency and creativity. AI platforms assist in lesson planning, assessment design, and content generation, enabling teachers to save significant time on administrative work. These efficiencies allow more focus on interactive teaching and student engagement. AI-powered tutoring systems are also being introduced to provide personalised support, particularly for students who need extra academic help. States such as New Hampshire are experimenting with AI-driven tools to enhance math and reading instruction.
Responsible AI use: Best practices for schools
To remain compliant and maximise benefits, schools should adopt structured approaches to AI integration:
- Personalised Learning: Use adaptive platforms to tailor lessons while ensuring compliance with privacy regulations.
- Teacher Support: Allow educators to use AI for planning and administrative tasks with mandatory human review.
- Assessment Integrity: Shift from take-home essays to in-class writing or oral presentations to discourage misuse.
- Career Guidance: Deploy AI-driven counselling tools while retaining human oversight for final decisions.
Managing risks and ensuring compliance
AI adoption brings challenges that schools must address proactively:
- Bias Prevention: Regular audits are necessary to eliminate algorithmic bias.
- Privacy Protection: All tools should meet FERPA standards and undergo security checks.
- Avoiding Over-Reliance: AI should support, not replace, teacher judgment in academic and disciplinary matters.
Comprehensive district-level policies, continuous teacher training, and stakeholder engagement are essential for responsible use.
The road ahead
The Department of Education is collecting public feedback on AI-related policies and exploring ways to integrate AI into its own operations. States will continue rolling out new requirements in the coming months, making 2025 a critical year for AI in education. The future of AI in classrooms depends on a balanced approach: leveraging its potential to improve learning while upholding legal and ethical standards. Schools that integrate AI responsibly will not only enhance student outcomes but also prepare learners for a technology-driven world.
Education
State Superintendent Thurmond Convenes Statewide AI in Education Workgroup for Public Schools – Van Nuys News Press

SACRAMENTO—State Superintendent of Public Instruction Tony Thurmond hosted the first meeting today of the Public Schools: Artificial Intelligence (AI) Workgroup at the California Department of Education (CDE) Headquarters in Sacramento. Established after last year’s passage of Senate Bill 1288, a bill authored by Senator Josh Becker (13th District) and sponsored by Superintendent Thurmond, the workgroup marks California as one of the first states in the nation to establish a legislatively mandated statewide effort focused on AI in K–12 education.
“There is an urgent need for clear direction on AI use in schools to ensure technology enhances, rather than replaces, the vital role of educators,” said Superintendent Thurmond. “Workgroup members are representatives from various organizations, including technology leaders. The majority are educators, and this workgroup also includes students. We want to ensure that those who will be affected by this guidance and policy have a voice in creating it.”
The workgroup is a model of Superintendent Thurmond’s efforts to develop strong public–private partnerships that power innovation in public education. It will develop the statewide guidance and a model policy to ensure AI benefits students and educators while safeguarding privacy, data security, and academic integrity. The group includes teachers, students, administrators, classified staff, higher education leaders, and industry experts. At least half of the members are current classroom teachers, elevating educator expertise as the foundation for decision-making.
The launch of the Public Schools: Artificial Intelligence Workgroup directly advances Superintendent Thurmond’s priorities, which include
- Transforming Education with Innovation: equipping schools with equitable, forward-looking approaches to technology;
- Equity and Access for All Students: ensuring AI tools do not exacerbate inequities but instead expand opportunities for every student;
- Whole Child Support: safeguarding against bias, misuse, and misinformation in AI systems while protecting student well-being;
- Elevating Educator Voice: centering teachers in decision-making about AI in classrooms; and
- Transparency and Public Engagement: committing to openness through public meetings and shared resources.
Today was the initial meeting of the Public Schools: Artificial Intelligence Workgroup. The second meeting will take place in October, followed by a third meeting in February.
The CDE has released initial guidance for schools and educators regarding the use of AI, which will be enhanced by the work of this group. The initial guidance can be found on the CDE Learning With AI, Learning About AI web page.
Education
The Guardian view on GCSE resits: admitting the problem is just the first step | Editorial

For years, rigid rules and a shocking failure rate in compulsory GCSE retakes have been among the exam system’s dirty secrets. At last this dire situation is getting some of the attention it deserves. This year, nearly a quarter of all maths and English language entries in England, Wales and Northern Ireland were for students aged 17 or older on a repeat attempt – with just one in six of those retaking maths managing to pass.
By calling this a crisis, Jill Duffy, who heads the OCR exam board, has thrown a spotlight on the problem. But admitting that there is an issue with resits, as officials are now doing, is only the first step. There are differing views about what ought to happen next.
Reforming GCSEs is outside the scope of the review being led by Prof Becky Francis. But a proposal to ditch compulsory resits is on the table. The Sixth Form Colleges Association wants a second attempt to be followed – for those who fail – by a modular alternative. This would mean students not being forced to endlessly repeat the parts of the courses they have mastered, and focusing instead on the gaps.
Nick Gibb, the former Conservative schools minister, has predictably set his face against change and demanded that all schools follow the example of the best. But while big variations in results should be drilled into, and successes learned from, this is not an adequate response. Many subject experts believe that the qualifications are poorly designed if their purpose is to serve as a universal gateway to the world of work. Rather than sticking to vital competencies (such as numeracy, statistics and reading comprehension), the current versions include calculus and geometry (in maths) and quasi-literary analysis (in English language).
It is a great shame that these issues were not grasped more effectively by Labour in opposition. Changes to the curriculum and exam system are a painstaking process. Prof Francis’s review is the best chance of breaking a destructive cycle. But the Department for Education’s recent record of engagement with the further education sector – where most resits are taken – is not good. There is no secondary English specialist on the review, and teacher shortages and challenges around provision for special educational needs and disabilities remain concerning.
Resits must also be seen in the context of a wider debate around the future of post-16 education, including the pledge by ministers to abolish courses that they see as unwelcome competition to T-levels. As with resits, critics of this policy are most worried about less academically able pupils with lower test scores. Even the government’s own figures show a gap, with tens of thousands of students on the threatened courses, including some BTecs, potentially unsuited to newer alternatives.
With a skills white paper due in the autumn, it is not too late to tackle unanswered questions. A better balance between ambition and pragmatism can surely be found. Plenty of jobs in the UK do not require calculus or textual analysis. T-levels were meant to boost less academic, more practical teenagers. This year’s resit figures are a worrying addition to existing evidence that these are the pupils for whom the system works least well. Ministers must be absolutely confident that any changes they introduce make things better, and not worse.