
Trump Wants Teachers Trained How to Use AI. Will It Work?

A new executive order signed by President Donald Trump calls for infusing artificial intelligence throughout K-12 education. A major focus of that plan is training teachers on how to integrate AI into their instruction and workflows.

This was one of seven executive orders Trump signed on April 23 focused on education, including one on “ensuring commonsense school discipline policies” and a handful focused on higher education institutions.

Some educators and education organizations have welcomed the order, saying that AI literacy for educators and students is important and much needed.

“I’m very excited about it,” said Pam Amendola, an English teacher at Dawson County High School in Dawsonville, Ga. “AI is not the future. AI is right now, and we need direction from the federal government.”

While her district has yet to provide training on AI, Amendola has attended AI trainings on her own time and has started teaching her students what AI is, how it works, and how to use AI-powered tools responsibly.

But other experts and educators are skeptical the federal government will be able to actualize the policy goals of the order, given that so much funding and expertise have been cut from the departments tasked with carrying out this work.


The executive order calls for the secretaries of education and agriculture, as well as the director of the National Science Foundation, to prioritize discretionary grant funds and existing programs for teacher training. The Education Department is tasked with supporting professional development both for teachers of computer science and AI-focused classes, as well as for all educators to integrate the fundamentals of AI into all subjects.

The order also directs the agriculture secretary and the NSF director to leverage existing programs to create teacher-training opportunities to help teachers “effectively integrate AI-based tools” into their classrooms.

“As Artificial Intelligence (AI) reshapes every industrial sector, it is vitally important that the next generation of students is prepared to leverage this technology in all aspects of their professional lives,” Education Secretary Linda McMahon said in a statement. “The Trump Administration will lead the way in training our educators to foster early and responsible AI education in our classrooms to keep up American leadership in the global economy.”

But it is a tall order given that most teachers have yet to receive any professional development on AI, as the EdWeek Research Center has found in surveys of teachers. In an October 2024 EdWeek Research Center survey, 58 percent of teachers said they had not received any professional development on using generative AI in the classroom, and 68 percent said they are not currently using AI tools in their classrooms.

School districts need help wading through the flood of AI products

Schools are in desperate need of support to train teachers on the rapidly changing technology that is cropping up seemingly everywhere, said Dusty Strickland, the assistant principal at North Murray High School in Chatsworth, Ga.

“My teachers who are doing everything they can to make sure our kids know the standards that they have to know, they don’t have time to dig into just [AI],” he said. “It’s a very fast-moving train, so how can we make sure our teachers can get on it?”

Right now, teachers at North Murray High School can volunteer for training from the district’s technology specialists on using the AI technology already embedded in the tools and programs the district uses, Strickland said. Teachers who participate in the voluntary training often share what they learn with peers.

Strickland said he would like to see the federal government provide schools with more money for AI training for teachers, as well as resources to help administrators like him determine which professional development programs and AI tools are high-quality.

“A lot of people are popping up saying, ‘hi, I’m an expert,’ but I don’t know how to [have them] prove that [they’re] an expert in such a new field,” he said.

While Amendola, the teacher in Georgia, is optimistic about the executive order, she is wary of how much influence ed-tech companies will have over the federal AI task force the order establishes.

Nationwide, districts have been slow to adopt guidelines and provide training around AI because the technology is evolving so quickly and because of a lack of expertise. As a result, educators’ exposure to AI has come mostly from ed-tech companies that are “shoving their products out there for districts to use,” Amendola said.

That is why she emphasizes that the federal task force should rely on organizations whose primary goal isn’t to sell AI-related products and services.

Randi Weingarten, the president of the American Federation of Teachers, panned the executive order in a statement, saying it opens up schools to “unaccountable tech companies” and “unproven software.”

“While AI can be a helpful and important tool for educators and students in classrooms, we’ve instead seen systems that produce disinformation, impinge on privacy, and tell inaccurate accounts of history,” Weingarten said.

Instead, Weingarten said, the administration should be “investing in classrooms and instruction designed by educators who work directly with students and who have the knowledge and expertise to meet their needs.”

The executive order doesn’t address data privacy or bias in AI

The aims of the executive order are largely bipartisan in nature. There’s broad support for giving schools more resources for harnessing this powerful technology.

But there are also significant omissions in the directive and potential hurdles to converting policy into reality, say some experts.

One concern is that funding that could have supported the goals of the executive order, along with many people with expertise in the subject, is being cut from the federal government, said Bernadette Adams, a former senior policy adviser at the U.S. Department of Education’s office of educational technology and an expert in AI. The entire staff of the OET, including Adams, was dismissed as part of the Education Department’s recent staff cuts. The department now has about half as many staff members as it did when Trump took office.

From researchers at the National Center for Education Research to specialists from outside industries who took temporary roles in the government, “those people were also pushed out and dismissed,” said Adams. “So, if you’re talking about how to successfully make these connections from the federal level to the field, I think a lot of that expertise is now gone.”

It’s not just the elimination of the office of educational technology and the loss of other people who previously provided AI expertise that will hurt these efforts. There are also significant gaps in the directive, Adams said. Most notably absent is any mention of student data privacy or bias in AI, two major issues experts frequently raise about the safety and efficacy of AI.

Finally, Adams said, the executive order focuses on AI as a labor and workforce issue: training today’s students for future jobs. Both Democratic and Republican administrations have had a tendency to view AI this way, and that’s a missed opportunity, she said.

“I feel like the executive order as it’s written, and maybe as the work goes forward people will consider this, but it does sideline, in my view, teaching and learning, which is the heart of education,” she said. “I think there’s real educational opportunities that go untapped when AI is framed only as a content area or a career path.”






9 AI Ethics Scenarios (and What School Librarians Would Do)

A common refrain about artificial intelligence in education is that it’s a research tool, and as such, some school librarians are acquiring firsthand experience with its uses and controversies.

Leading a presentation last week at the ISTELive 25 + ASCD annual conference in San Antonio, a trio of librarians parsed appropriate and inappropriate uses of AI in a series of hypothetical scenarios. They broadly recommended that schools have, and clearly articulate, official policies governing AI use and be cautious about inputting copyrighted or private information.

Amanda Hunt, a librarian at Oak Run Middle School in Texas, said their presentation would focus on scenarios because librarians are experiencing so many.


“The reason we did it this way is because these scenarios are coming up,” she said. “Every day I’m hearing some other type of question in regards to AI and how we’re using it in the classroom or in the library.”

  • Scenario 1: A class encourages students to use generative AI for brainstorming, outlining and summarizing articles.

    Elissa Malespina, a teacher librarian at Science Park High School in New Jersey, said she felt this was a valid use, as she has found AI to be helpful for high schoolers who are prone to get overwhelmed by research projects.

    Ashley Cooksey, an assistant professor and school library program director at Arkansas Tech University, disagreed slightly: While she appreciates AI’s ability to outline and brainstorm, she said, she would discourage her students from using it to synthesize summaries.

    “Point one on that is that you’re not using your synthesis and digging deep and reading the article for yourself to pull out the information pertinent to you,” she said. “Point No. 2 — I publish, I write. If you’re in higher ed, you do that. I don’t want someone to put my work into a piece of generative AI and an [LLM] that is then going to use work I worked very, very hard on to train its language learning model.”

  • Scenario 2: A school district buys an AI tool that generates student book reviews for a library website, which saves time and promotes titles but misses key themes or introduces unintended bias.

    All three speakers said this use of AI could certainly be helpful to librarians, but if the reviews are labeled in a way that makes it sound like they were written by students when they weren’t, that wouldn’t be ethical.

  • Scenario 3: An administrator asks a librarian to use AI to generate new curriculum materials and library signage. Do the outputs violate copyright or proper attribution rules?

    Hunt said the answer depends on local and district regulations, but she recommended using Adobe Express because it doesn’t pull from the Internet.

  • Scenario 4: An ed-tech vendor pitches a school library on an AI tool that analyzes circulation data and automatically recommends titles to purchase. It learns from the school’s preferences but often excludes lesser-known topics or authors of certain backgrounds.

    Hunt, Malespina and Cooksey agreed that this would be problematic, especially because entering circulation data could include personally identifiable information, which should never be entered into an AI.

  • Scenario 5: At a school that doesn’t have a clear AI policy, a student uses AI to summarize a research article and gets accused of plagiarism. Who is responsible, and what is the librarian’s role?

    The speakers as well as polled audience members tended to agree the school district would be responsible in this scenario. Without a policy in place, the school will have a harder time establishing whether a student’s behavior constitutes plagiarism.

    Cooksey emphasized the need for ongoing professional development, and Hunt said any districts that don’t have an official AI policy need steady pressure until they draft one.

    “I am the squeaky wheel right now in my district, and I’m going to continue to be annoying about it, but I feel like we need to have something in place,” Hunt said.

  • Scenario 6: Attempting to cause trouble, a student creates a deepfake of a teacher acting inappropriately. Administrators, who have no specific policy in place, struggle to respond, and trust is shaken.

    Again, the speakers said this is one more example to illustrate the importance of AI policies as well as AI literacy.

    “We’re getting to this point where we need to be questioning so much of what we see, hear and read,” Hunt said.

  • Scenario 7: A pilot program uses AI to provide instant feedback on student essays, but English language learners consistently get lower scores, leading teachers to worry the AI system can’t recognize code-switching or cultural context.

    In response to this situation, Hunt said it’s important to know whether parents have given permission for student essays to be entered into an AI, and the teacher or librarian should still read the essays themselves.

    Malespina and Cooksey both cautioned against relying on AI plagiarism detection tools.

    “None of these tools can do a good enough job, and they are biased toward [English language learners],” Malespina said.

  • Scenario 8: A school-approved AI system flags students who haven’t checked out any books recently, tracks their reading speed and completion patterns, and recommends interventions.

    Malespina said she doesn’t want an AI tool tracking students in that much detail, and Cooksey pointed out that reading speed and completion patterns aren’t reliably indicative of anything that teachers need to know about students.

  • Scenario 9: An AI tool translates texts, reads books aloud and simplifies complex texts for students with individualized education programs, but it doesn’t always translate nuance or tone.

    Hunt said she sees benefit in this kind of application for students who need extra support, but she said the loss of tone could be an issue, and it raises questions about infringing on audiobook copyright laws.

    Cooksey expounded upon that.

    “Additionally, copyright goes beyond the printed work. … That copyright owner also owns the presentation rights, the audio rights and anything like that,” she said. “So if they’re putting something into a generative AI tool that reads the PDF, that is technically a violation of copyright in that moment, because there are available tools for audio versions of books for this reason, and they’re widely available. Sora is great, and it’s free for educators. … But when you’re talking about taking something that belongs to someone else and generating a brand-new copied product of that, that’s not fair use.”

Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor’s degree in physiology from Michigan State University and lives in Northern California.






Bret Harte Superintendent Named To State Boards On School Finance And AI – myMotherLode.com


Blunkett urges ministers to use ‘incredible sensitivity’ in changing Send system in England

Ministers must use “incredible sensitivity” in making changes to the special educational needs system, former education secretary David Blunkett has said, as the government is urged not to drop education, health and care plans (EHCPs).

Lord Blunkett, who went through the special needs system when attending a residential school for blind children, said ministers would have to tread carefully.

The former home secretary in Tony Blair’s government also urged the government to reassure parents that it was looking for “a meaningful replacement” for EHCPs, which guarantee more than 600,000 children and young people individual support in learning.

Blunkett said he sympathised with the challenge facing Bridget Phillipson, the education secretary, saying: “It’s absolutely clear that the government will need to do this with incredible sensitivity and with a recognition it’s going to be a bumpy road.”

He said government proposals due in the autumn to reexamine Send provision in England were not the same as welfare changes, largely abandoned last week, which were aimed at reducing spending. “They put another billion in [to Send provision] and nobody noticed,” Blunkett said, adding: “We’ve got to reduce the fear of change.”

Earlier Helen Hayes, the Labour MP who chairs the cross-party Commons education select committee, called for Downing Street to commit to EHCPs, saying this was the only way to combat mistrust among many families with Send children.

“I think at this stage that would be the right thing to do,” she told BBC Radio 4’s Today programme. “We have been looking, as the education select committee, at the Send system for the last several months. We have heard extensive evidence from parents, from organisations that represent parents, from professionals and from others who are deeply involved in the system, which is failing so many children and families at the moment.

“One of the consequences of that failure is that parents really have so little trust and confidence in the Send system at the moment. And the government should take that very seriously as it charts a way forward for reform.”

A letter to the Guardian on Monday, signed by dozens of special needs and disability charities and campaigners, warned against government changes to the Send system that would restrict or abolish EHCPs.

Labour MPs who spoke to the Guardian are worried ministers are unable to explain essential details of the special educational needs shake-up being considered in the schools white paper to be published in October.

Downing Street has refused to rule out ending EHCPs, while stressing that no decisions have yet been taken ahead of a white paper on Send provision to be published in October.

Keir Starmer’s deputy spokesperson said: “I’ll just go back to the broader point that the system is not working and is in desperate need of reform. That’s why we want to actively work with parents, families, parliamentarians to make sure we get this right.”


Speaking later in the Commons, Phillipson said there was “no responsibility I take more seriously” than that owed to the most vulnerable children. She said it was a “serious and complex area” that “we as a government are determined to get right”.

The education secretary said: “There will always be a legal right to the additional support children with Send need, and we will protect it. But alongside that, there will be a better system with strengthened support, improved access and more funding.”

Dr Will Shield, an educational psychologist from the University of Exeter, said rumoured proposals that limit EHCPs – potentially to pupils in special schools – were “deeply problematic”.

Shield said: “Mainstream schools frequently rely on EHCPs to access the funding and oversight needed to support children effectively. Without a clear, well-resourced alternative, families will fear their children are not able to access the support they need to achieve and thrive.”

Paul Whiteman, general secretary of the National Association of Head Teachers, said: “Any reforms in this space will likely provoke strong reactions and it will be crucial that the government works closely with both parents and schools every step of the way.”


