Tools & Platforms
The Rewards—And Risks—Of Using AI In The Classroom

September 16, 2025
By Elizabeth Tucker
Many of us who lack experience with artificial intelligence (AI) find it unnerving. There is the prospect that it can do everything we do, only better—or at least adequately and more cheaply. “You won’t be replaced by AI; you’ll be replaced by someone who knows how to use AI better,” Irvington Superintendent Mara Rasevic said in a recent conversation, quoting a now-familiar adage. If AI is changing the contours of the workforce, then schools need to train students to use it effectively. This summer the American Federation of Teachers (AFT) established the National Academy for AI Instruction so that teachers can stay on top of their new responsibility to impart AI literacy and also learn how to use AI to make their own working routines more efficient. The AFT has also published Commonsense Guardrails for Using Advanced Technology in Schools, already in a revised edition.
Jerrod Blair, Director of Technology and Integration in the Irvington school district, offered an overview of how teachers there are using AI. “Some teachers are exploring how AI can save time on routine tasks, like generating practice questions, lesson ideas, or feedback prompts, so they can focus more on engaging with students,” he said. “Others are experimenting with AI as a discussion starter in class, for example, asking students to evaluate an AI-generated response for accuracy, bias, or completeness.”
Many districts, including those in the rivertowns, are using the AI teaching platform MagicSchool. Hastings has introduced it at the elementary level. In Sleepy Hollow, according to Technology Integration Specialist Jean O’Brien, AI is being used in instruction from elementary to high school and in math and science classes as well as in the humanities.
At Sleepy Hollow High School, social studies teacher Alyson Nawrocki finds MagicSchool especially useful in providing students individualized feedback on their writing. “The AI feedback is never in place of teacher feedback,” Nawrocki says, “but it’s useful as a checkpoint if I’m conferencing with another student so the rest of the class can keep working productively.”
MagicSchool assesses students’ writing according to Nawrocki’s own rubric. For example, under “thesis,” the AI software might opine that the writing is clear but too general, or under “analysis,” it might note that the writing successfully reports what happened but needs to explain why those events mattered. Students can revise on their own and then submit a final version to Nawrocki to read.

Another feature MagicSchool offers is a text leveler, which adjusts a text for different reading levels. Blair says, “This doesn’t replace the original text, but it gives teachers another way to ensure that every student can engage meaningfully with the material. For example, a teacher might provide the original text alongside a leveled version so that students can build confidence and gradually work toward the more complex version.”
MagicSchool’s character chatbot allows students to engage in conversation with figures from history. In Nawrocki’s class, students interviewed a World War I soldier about his experience. In another assignment, her students read an article about early human migration into North America and then, using details from the reading, instructed the image-generating Canva AI to create a magazine cover depicting its contents.
When asked about the technology’s drawbacks, Nawrocki allows that “some students begin to rely too heavily on AI. This can discourage them from taking creative or intellectual risks, which are essential parts of the learning process.”
Then, of course, there is the temptation for students to use AI to do their homework for them. To counteract this, Nawrocki says, “I no longer assign traditional ‘original’ homework that could easily be completed by a tool. Instead, I design assignments that require students . . . to show their thinking process. . . . The focus is less on producing a polished product at home and more on engaging in authentic skill-building that can be observed and assessed directly.” She might ask her students to relate their homework to the day’s lesson or illustrate historical processes in a drawing or diagram, rather than answer a straightforward written question, which would lend itself to generation by AI.
According to Jerrod Blair, many Irvington teachers as well “are thoughtfully adapting their assignments to reflect the reality that students have access to AI. This doesn’t mean starting from scratch; instead, it’s about asking deeper questions, emphasizing process and critical thinking, and creating opportunities for students to demonstrate their learning in multiple ways.” But also, Blair says, “the district sees this as an opportunity to teach students how to use emerging technologies ethically.”
Tools & Platforms
Anthropic Taps Higher Education Leaders for Guidance on AI

The artificial intelligence company Anthropic is working with six leaders in higher education to help guide how its AI assistant Claude will be developed for teaching, learning and research. The new Higher Education Advisory Board, announced in August, will provide regular input on educational tools and policies.
According to a news release from Anthropic, the board is tasked with ensuring that AI “strengthens rather than undermines learning and critical thinking skills” through policies and products that support academic integrity and student privacy.
As teachers adapt to AI, ed-tech leaders have called for educators to play an active role in aligning AI to educational standards.
“Teachers and educators and administrators should be in the decision-making seat at every critical decision-making point when AI is being used in education,” Isabella Zachariah, formerly a fellow at the U.S. Department of Education’s Office of Educational Technology, said at the EDUCAUSE conference in October 2024. The Office of Educational Technology has since been shuttered by the Trump administration.
To this end, advisory boards or councils involving educators have emerged in recent years among ed-tech companies and institutions seeking to ground AI deployments in classroom experiences. For example, the K-12 software company Otus formed an AI advisory board earlier this year with teachers, principals, instructional technology specialists and district administrators representing more than 20 school districts across 11 states. Similarly, software company Frontline Education launched an AI advisory council last month to allow district leaders to participate in pilots and influence product design choices.
The Anthropic board taps experts in the education, nonprofit and technology sectors, including two former university presidents and three campus technology leaders. Rick Levin, former president of Yale University and CEO of Coursera, will serve as board chair. Other members include:
- David Leebron, former president of Rice University
- James DeVaney, associate vice provost for academic innovation at the University of Michigan
- Julie Schell, assistant vice provost of academic technology at the University of Texas at Austin
- Matthew Rascoff, vice provost for digital education at Stanford University
- Yolanda Watson Spiva, president of Complete College America
The board contributed to a recent trio of AI fluency courses for colleges and universities, according to the news release. The online courses aim to give students and faculty a foundation in the function, limitations and potential uses of large language models in academic settings.
Schell said she joined the advisory board to explore how technology can address persistent challenges in learning.
“Sometimes we forget how cognitively taxing it is to really learn something deeply and meaningfully,” she said. “Throughout my career, I’ve been excited about the different ways that technology can help accentuate best practices in teaching or pedagogy. My mantra has always been pedagogy first, technology second.”
In her work at UT Austin, Schell has focused on responsible use of AI and engaged with faculty, staff, students and the general public to develop guiding principles. She said she hopes to bring the feedback from the community, as well as education science, to regular meetings. She said she participated in vetting existing Anthropic ed-tech tools, like Claude Learning mode, with this in mind.
In the weeks since the board’s announcement, the group has met once, Schell said, and expects to meet regularly in the future.
“I think it’s important to have informed people who understand teaching and learning advising responsible adoption of AI for teaching and learning,” Schell said. “It might look different than other industries.”
Tools & Platforms
Duke AI program emphasizes critical thinking for job security

Duke’s AI program is spearheaded by a professor who is not just teaching; he has also built his own AI model.
Professor Jon Reifschneider says we’ve already entered a new era of teaching and learning across disciplines.
He says, “We have folks that go into healthcare after they graduate, go into finance, energy, education, etc. We want them to bring with them a set of skills and knowledge in AI, so that they can figure out: ‘How can I go solve problems in my field using AI?'”
He wants his students to become literate in AI, which is a challenge in a field he describes as a moving target.
“I think for most people, AI is kind of a mysterious black box that can do somewhat magical things, and I think that’s very risky to think that way, because you don’t develop an appreciation of when you should use it and when you shouldn’t use it,” Reifschneider told WRAL News.
Student Harshitha Rasamsetty said she is learning the strengths and shortcomings of AI.
“We always look at the biases and privacy concerns and always consider the user,” she said.
The students in Duke’s engineering master’s programs come from many different backgrounds and countries, and span a wide range of ages. Jared Bailey paused his insurance career in Florida to get a handle on the AI being deployed company-wide.
He was already using AI tools when he wondered, “What if I could crack them open and adjust them myself and make them better?”
John Ernest studied engineering as an undergraduate but sought job security in AI.
“I hear news every day that AI is replacing this job, AI is replacing that job,” he said. “I came to a conclusion that I should be a part of a person building AI, not be a part of a person getting replaced by AI.”
Reifschneider thinks warnings about AI taking jobs are overblown.
In fact, he wants his students to come away understanding that humans have a quality AI can’t replace: critical thinking.
Reifschneider says AI “still relies on humans to guide it in the right direction, to give it the right prompts, to ask the right questions, to give it the right instructions.”
“If you can’t think, well, AI can’t take you very far,” Bailey said. “It’s a car with no gas.”
Reifschneider told WRAL that he thinks children as young as elementary school students should begin learning how to use AI, when it’s appropriate to do so, and how to use it safely.
WRAL News went inside Wake County schools to see how AI is being used and what safeguards the district has in place to protect students. Watch that story Wednesday on WRAL News.
Tools & Platforms
WA state schools superintendent seeks $10M for AI in classrooms

This article originally appeared on TVW News.
Washington’s top K-12 official is asking lawmakers to bankroll a statewide push to bring artificial intelligence tools and training into classrooms in 2026, even as new test data show slow, uneven academic recovery and persistent achievement gaps.
Superintendent of Public Instruction Chris Reykdal told TVW’s Inside Olympia that he will request about $10 million in the upcoming supplemental budget for a statewide pilot program to purchase AI tutoring tools — beginning with math — and fund teacher training. He urged legislators to protect education from cuts, make structural changes to the tax code and act boldly rather than leaving local districts to fend for themselves. “If you’re not willing to make those changes, don’t take it out on kids,” Reykdal said.
The funding push comes as new Smarter Balanced assessment results show gradual improvement but highlight persistent inequities. State test scores have ticked upward, and student progress rates between grades are now mirroring pre-pandemic trends. Still, higher-poverty communities are not improving as quickly as more affluent peers. About 57% of eighth graders met foundational math progress benchmarks — better than most states, Reykdal noted, but still leaving four in 10 students short of university-ready standards by 10th grade.
Reykdal cautioned against reading too much into a single exam, emphasizing that Washington consistently ranks near the top among peer states. He argued that overall college-going rates among public school students show they are more prepared than the test suggests. “Don’t grade the workload — grade the thinking,” he said.
Artificial intelligence, Reykdal said, has moved beyond the margins and into the mainstream of daily teaching and learning: “AI is in the middle of everything, because students are making it in a big way. Teachers are doing it. We’re doing it in our everyday lives.”
OSPI has issued human-centered AI guidance and directed districts to update technology policies, clarifying how AI can be used responsibly and what constitutes academic dishonesty. Reykdal warned against long-term contracts with unproven vendors, but said larger platforms with stronger privacy practices will likely endure. He framed AI as a tool for expanding customized learning and preparing students for the labor market, while acknowledging the need to teach ethical use.
Reykdal pressed lawmakers to think more like executives anticipating global competition rather than waiting for perfect solutions. “If you wait until it’s perfect, it will be a decade from now, and the inequalities will be massive,” he said.
With test scores climbing slowly and AI transforming classrooms, Reykdal said the Legislature’s next steps will be decisive in shaping whether Washington narrows achievement gaps — or lets them widen.
TVW News originally published this article on Sept. 11, 2025.