Education

Artificial intelligence isn’t ruining education; it’s exposing what’s already broken

Credit: Allison Shelley/The Verbatim Agency for EDUimages

A few weeks ago, my high school chemistry class sat through an “AI training.” We were told it would teach us how to use ChatGPT responsibly. We worked on worksheets with questions like, “When is it permissible to use ChatGPT on written homework?” and “How can AI support and not replace your thinking?” Another asked, “What are the risks of relying too heavily on ChatGPT?”

Most of us just used ChatGPT to finish the worksheet. Then we moved on to other things.

Schools have rushed to regulate AI based on a hopeful fiction: that students are curious, self-directed learners who’ll use technology responsibly if given the right guardrails. But most students don’t use AI to brainstorm or refine ideas — they use it to get assignments done faster. And school policies, built on optimism rather than observation, have done little to stop it.

Like many districts across the country, our school's policy directs students to use ChatGPT to brainstorm, organize and even generate ideas, but not to write. If we use generative AI to write the actual content of an assignment, we're supposed to get a zero.

In practice, that line is meaningless. Later, I spoke to my chemistry teacher, who confided that she'd started checking the Google Docs histories of papers she'd assigned and found that huge chunks of student writing were being pasted in: AI-generated slop, dropped in all at once with no edits, no revisions and no sign of real work. "It's just disappointing," she said. "There's nothing I can do."

In Bible class, students quoted ChatGPT outputs verbatim during presentations. One student projected a slide listing the Minor Prophets alongside the sentence: "Would you like me to format this into a table for you?" Another spoke confidently about the "post-exilic" period, having earlier that week mispronounced "patriarchy." At one point, Mr. Knoxville paused during a slide and asked, "Why does it say BCE?" Then, chuckling, answered his own question: "Because it's ChatGPT using secular language." Everyone laughed and moved on.

It's safe to say that most students aren't using AI to deepen their learning. They're using it to get around the learning process altogether. And the real frustration isn't just that students are cutting corners; it's that schools still pretend they aren't.

That doesn’t mean AI should be banned. I’m not an AI alarmist. There’s enormous potential for smart, controlled integration of these tools into the classroom. But handing students unrestricted access with little oversight is undermining the core purpose of school.

This isn’t just a high school problem. At CSU, administrators have doubled down on AI integration with the same blind optimism: assuming students will use these tools responsibly. But widespread adoption doesn’t equal responsible use. A recent study from the National Education Association found that 72% of high school students use AI to complete assignments without really understanding the material.

“AI didn’t corrupt deep learning,” said Tiffany Noel, education researcher and professor at SUNY Buffalo. “It revealed that many assignments were never asking for critical thinking in the first place. Just performance. AI is just the faster actor; the problem is the script.”

Exactly. AI didn’t ruin education; it exposed what was already broken. Students are responding to the incentives the education system has given them. We’re taught that grades matter more than understanding. So if there’s an easy shortcut, why wouldn’t we take it?

This also penalizes students who don’t cheat. They spend an hour struggling through an assignment another student finishes in three minutes with a chatbot and a text humanizer. Both get the same grade. It’s discouraging and painfully absurd.

Of course, this is nothing new. Students have always found ways to lessen their workload, like copying homework, sharing answers and peeking during tests. But this is different because it’s a technology that should help schools — and under the current paradigm, it isn’t. This leaves schools vulnerable to misuse and students unrewarded for doing things the right way.

What to do, then?

Start by admitting the obvious: if an assignment is done at home, it will likely involve AI. If students have internet access in class, they'll use it there, too. Teachers see the phones under desks and the tabs flipped the second their backs are turned, but they can't police 30 screens at once, and most won't try. Nor should they have to.

We need hard rules and clearer boundaries. AI should never be used to do a student’s actual academic work — just as calculators aren’t allowed on multiplication drills or Grammarly isn’t accepted on spelling tests. School is where you learn the skill, not where you offload it.

AI is built to answer prompts. So is homework. Of course students are cheating. The only solution is to make cheating structurally impossible. That means returning to basics: pen-and-paper essays, in-class writing, oral defenses, live problem-solving, source-based analysis where each citation is annotated, explained and verified. If an AI can do an assignment in five seconds, it was probably never a good assignment in the first place.

But that doesn’t mean AI has no place. It just means we put it where it belongs: behind the desk, not in it. Let it help teachers grade quizzes. Let it assist students with practice problems, or serve as a Socratic tutor that asks questions instead of answering them. Generative AI should be treated as a useful aid after mastery, not a replacement for learning.

Students are not idealized learners. They are strategic, social, overstretched, and deeply attuned to what the system rewards. Such is the reality of our education system, and the only way forward is to build policies around how students actually behave, not how educators wish they would.

Until that happens, AI will keep writing our essays. And our teachers will keep grading them.

•••

William Liang is a high school student and education journalist living in San Jose, California.

The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.







9 AI Ethics Scenarios (and What School Librarians Would Do)

A common refrain about artificial intelligence in education is that it’s a research tool, and as such, some school librarians are acquiring firsthand experience with its uses and controversies.

Leading a presentation last week at the ISTELive 25 + ASCD annual conference in San Antonio, a trio of librarians parsed appropriate and inappropriate uses of AI in a series of hypothetical scenarios. They broadly recommended that schools have, and clearly articulate, official policies governing AI use and be cautious about inputting copyrighted or private information.

Amanda Hunt, a librarian at Oak Run Middle School in Texas, said their presentation would focus on scenarios because librarians are experiencing so many.


“The reason we did it this way is because these scenarios are coming up,” she said. “Every day I’m hearing some other type of question in regards to AI and how we’re using it in the classroom or in the library.”

  • Scenario 1: A class encourages students to use generative AI for brainstorming, outlining and summarizing articles.

    Elissa Malespina, a teacher librarian at Science Park High School in New Jersey, said she felt this was a valid use, as she has found AI to be helpful for high schoolers who are prone to get overwhelmed by research projects.

    Ashley Cooksey, an assistant professor and school library program director at Arkansas Tech University, disagreed slightly: While she appreciates AI’s ability to outline and brainstorm, she said, she would discourage her students from using it to synthesize summaries.

    “Point one on that is that you’re not using your synthesis and digging deep and reading the article for yourself to pull out the information pertinent to you,” she said. “Point No. 2 — I publish, I write. If you’re in higher ed, you do that. I don’t want someone to put my work into a piece of generative AI and an [LLM] that is then going to use work I worked very, very hard on to train its language learning model.”

  • Scenario 2: A school district buys an AI tool that generates student book reviews for a library website, which saves time and promotes titles but misses key themes or introduces unintended bias.

    All three speakers said this use of AI could certainly be helpful to librarians, but if the reviews are labeled in a way that makes it sound like they were written by students when they weren’t, that wouldn’t be ethical.

  • Scenario 3: An administrator asks a librarian to use AI to generate new curriculum materials and library signage. Do the outputs violate copyright or proper attribution rules?

    Hunt said the answer depends on local and district regulations, but she recommended using Adobe Express because it doesn’t pull from the Internet.

  • Scenario 4: An ed-tech vendor pitches a school library on an AI tool that analyzes circulation data and automatically recommends titles to purchase. It learns from the school’s preferences but often excludes lesser-known topics or authors of certain backgrounds.

    Hunt, Malespina and Cooksey agreed that this would be problematic, especially because entering circulation data could include personally identifiable information, which should never be entered into an AI.

  • Scenario 5: At a school that doesn’t have a clear AI policy, a student uses AI to summarize a research article and gets accused of plagiarism. Who is responsible, and what is the librarian’s role?

    The speakers as well as polled audience members tended to agree the school district would be responsible in this scenario. Without a policy in place, the school will have a harder time establishing whether a student’s behavior constitutes plagiarism.

    Cooksey emphasized the need for ongoing professional development, and Hunt said any districts that don’t have an official AI policy need steady pressure until they draft one.

    “I am the squeaky wheel right now in my district, and I’m going to continue to be annoying about it, but I feel like we need to have something in place,” Hunt said.

  • Scenario 6: Attempting to cause trouble, a student creates a deepfake of a teacher acting inappropriately. Administrators struggle to respond because they have no specific policy in place, and trust is shaken.

    Again, the speakers said this is one more example to illustrate the importance of AI policies as well as AI literacy.

    “We’re getting to this point where we need to be questioning so much of what we see, hear and read,” Hunt said.

  • Scenario 7: A pilot program uses AI to provide instant feedback on student essays, but English language learners consistently get lower scores, leading teachers to worry the AI system can’t recognize code-switching or cultural context.

    In response to this situation, Hunt said it’s important to know whether the parent has given their permission to enter student essays into an AI, and the teacher or librarian should still be reading the essays themselves.

    Malespina and Cooksey both cautioned against relying on AI plagiarism detection tools.

    “None of these tools can do a good enough job, and they are biased toward [English language learners],” Malespina said.

  • Scenario 8: A school-approved AI system flags students who haven’t checked out any books recently, tracks their reading speed and completion patterns, and recommends interventions.

    Malespina said she doesn’t want an AI tool tracking students in that much detail, and Cooksey pointed out that reading speed and completion patterns aren’t reliably indicative of anything that teachers need to know about students.

  • Scenario 9: An AI tool translates texts, reads books aloud and simplifies complex texts for students with individualized education programs, but it doesn’t always translate nuance or tone.

    Hunt said she sees benefit in this kind of application for students who need extra support, but she said the loss of tone could be an issue, and it raises questions about infringing on audiobook copyright laws.

    Cooksey expounded upon that.

    “Additionally, copyright goes beyond the printed work. … That copyright owner also owns the presentation rights, the audio rights and anything like that,” she said. “So if they’re putting something into a generative AI tool that reads the PDF, that is technically a violation of copyright in that moment, because there are available tools for audio versions of books for this reason, and they’re widely available. Sora is great, and it’s free for educators. … But when you’re talking about taking something that belongs to someone else and generating a brand-new copied product of that, that’s not fair use.”

Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor’s degree in physiology from Michigan State University and lives in Northern California.







Bret Harte Superintendent Named To State Boards On School Finance And AI



Blunkett urges ministers to use ‘incredible sensitivity’ in changing Send system in England

Ministers must use “incredible sensitivity” in making changes to the special educational needs system, former education secretary David Blunkett has said, as the government is urged not to drop education, health and care plans (EHCPs).

Lord Blunkett, who went through the special needs system when attending a residential school for blind children, said ministers would have to tread carefully.

The former home secretary in Tony Blair’s government also urged the government to reassure parents that it was looking for “a meaningful replacement” for EHCPs, which guarantee more than 600,000 children and young people individual support in learning.

Blunkett said he sympathised with the challenge facing Bridget Phillipson, the education secretary, saying: “It’s absolutely clear that the government will need to do this with incredible sensitivity and with a recognition it’s going to be a bumpy road.”

He said government proposals due in the autumn to reexamine Send provision in England were not the same as welfare changes, largely abandoned last week, which were aimed at reducing spending. “They put another billion in [to Send provision] and nobody noticed,” Blunkett said, adding: “We’ve got to reduce the fear of change.”

Earlier Helen Hayes, the Labour MP who chairs the cross-party Commons education select committee, called for Downing Street to commit to EHCPs, saying this was the only way to combat mistrust among many families with Send children.

“I think at this stage that would be the right thing to do,” she told BBC Radio 4’s Today programme. “We have been looking, as the education select committee, at the Send system for the last several months. We have heard extensive evidence from parents, from organisations that represent parents, from professionals and from others who are deeply involved in the system, which is failing so many children and families at the moment.

“One of the consequences of that failure is that parents really have so little trust and confidence in the Send system at the moment. And the government should take that very seriously as it charts a way forward for reform.”

A letter to the Guardian on Monday, signed by dozens of special needs and disability charities and campaigners, warned against government changes to the Send system that would restrict or abolish EHCPs.

Labour MPs who spoke to the Guardian are worried ministers are unable to explain essential details of the special educational needs shake-up being considered in the schools white paper to be published in October.

Downing Street has refused to rule out ending EHCPs, while stressing that no decisions have yet been taken ahead of a white paper on Send provision to be published in October.

Keir Starmer’s deputy spokesperson said: “I’ll just go back to the broader point that the system is not working and is in desperate need of reform. That’s why we want to actively work with parents, families, parliamentarians to make sure we get this right.”


Speaking later in the Commons, Phillipson said there was “no responsibility I take more seriously” than that to more vulnerable children. She said it was a “serious and complex area” that “we as a government are determined to get right”.

The education secretary said: “There will always be a legal right to the additional support children with Send need, and we will protect it. But alongside that, there will be a better system with strengthened support, improved access and more funding.”

Dr Will Shield, an educational psychologist from the University of Exeter, said rumoured proposals that limit EHCPs – potentially to pupils in special schools – were “deeply problematic”.

Shield said: “Mainstream schools frequently rely on EHCPs to access the funding and oversight needed to support children effectively. Without a clear, well-resourced alternative, families will fear their children are not able to access the support they need to achieve and thrive.”

Paul Whiteman, general secretary of the National Association of Head Teachers, said: “Any reforms in this space will likely provoke strong reactions and it will be crucial that the government works closely with both parents and schools every step of the way.”


