Education

Ohio emerges as early adopter in AI education policies

COLUMBUS, Ohio (WCMH) — Ohio is reportedly the first state to require artificial intelligence policies in public schools. 

With the passage of House Bill 96 this summer, the Ohio Department of Education and Workforce must develop a model policy on AI use for students and staff by Dec. 31. Ohio schools will have until July 1, 2026, to adopt policies. 

Some central Ohio districts already have policies addressing AI. Bexley City Schools considers unauthorized AI use akin to plagiarism. Olentangy schools allow students to use AI in specific circumstances with staff permission. 

In June, Columbus City Schools’ policy and governance committee discussed developing an AI policy. The board has not approved one, but the policy committee will have another meeting Sept. 9 where the topic could reemerge. 

Canal Winchester Local Schools and Dublin City Schools both have policies acknowledging the positives of AI in education and granting the superintendent the authority to support AI use where appropriate to learning. Students are not allowed to use AI to complete assignments, but with teacher permission they can use it for research and writing assistance, data analysis and comprehension. 

Ohio’s higher education institutions are also national leaders in AI and education. Starting this fall, Ohio State University is requiring all students to use AI in their classes. OSU promises that by 2029, all graduates will leave the university “AI fluent,” meaning they are trained to use AI in their respective fields.

The state has already embraced AI possibilities in education. In May 2024, current Sen. and then-Lt. Gov. Jon Husted led Ohio’s AI in Education Coalition to develop a strategy for integrating AI into the state’s K-12 education system. 

The coalition invited industry and education stakeholders to participate in one of three work groups: industry, operations and instructional. Together, these groups developed several recommendations for Ohio’s education system to be best prepared for AI. The coalition said schools should begin incorporating AI thoughtfully and develop policies quickly. The coalition also encouraged the state to provide support and incentives for AI integration. 

There is also a federal push toward AI in schools. In April, President Donald Trump signed an executive order establishing an AI education task force to help schools adjust to new technology. Last week, Melania Trump invited K-12 students to participate in a nationwide contest to use AI tools to solve community issues.

The White House has also encouraged businesses to pledge to invest in AI education, including offering funding, distributing resources and developing training. As of early August, 110 companies had signed the pledge, including Apple, Amazon, Intel and COSI.



‘Cheating is off the charts’: AI tools reshape how students learn and study

The book report is now a thing of the past. Take-home tests and essays are becoming obsolete.

Student use of artificial intelligence has become so prevalent, high school and college educators say, that to assign writing outside of the classroom is like asking students to cheat.

“The cheating is off the charts. It’s the worst I’ve seen in my entire career,” says Casey Cuny, who has taught English for 23 years. Educators are no longer wondering if students will outsource schoolwork to AI chatbots. “Anything you send home, you have to assume is being AI’ed.”

The question now is how schools can adapt, because many of the teaching and assessment tools that have been used for generations are no longer effective. As AI technology rapidly improves and becomes more entwined with daily life, it is transforming how students learn and study and how teachers teach, and it’s creating new confusion over what constitutes academic dishonesty.

“We have to ask ourselves, what is cheating?” says Cuny, a 2024 recipient of California’s Teacher of the Year award. “Because I think the lines are getting blurred.”

Cuny’s students at Valencia High School in southern California now do most writing in class. He monitors student laptop screens from his desktop, using software that lets him “lock down” their screens or block access to certain sites. He’s also integrating AI into his lessons and teaching students how to use AI as a study aid “to get kids learning with AI instead of cheating with AI.”

In rural Oregon, high school teacher Kelly Gibson has made a similar shift to in-class writing. She is also incorporating more verbal assessments to have students talk through their understanding of assigned reading.

“I used to give a writing prompt and say, ‘In two weeks, I want a five-paragraph essay,’” says Gibson. “These days, I can’t do that. That’s almost begging teenagers to cheat.”

Take, for example, a once typical high school English assignment: Write an essay that explains the relevance of social class in “The Great Gatsby.” Many students say their first instinct is now to ask ChatGPT for help “brainstorming.” Within seconds, ChatGPT yields a list of essay ideas, plus examples and quotes to back them up. The chatbot ends by asking if it can do more: “Would you like help writing any part of the essay? I can help you draft an introduction or outline a paragraph!”

Casey Cuny, an English teacher at Valencia High School, works on his computer as he prepares for class in Santa Clarita, Calif., Wednesday, Aug. 27, 2025. (AP Photo/Jae C. Hong)

Students are uncertain when AI usage is out of bounds

Students say they often turn to AI with good intentions for things like research, editing or help reading difficult texts. But AI offers unprecedented temptation, and it’s sometimes hard to know where to draw the line.

College sophomore Lily Brown, a psychology major at an East Coast liberal arts school, relies on ChatGPT to help outline essays because she struggles putting the pieces together herself. ChatGPT also helped her through a freshman philosophy class, where assigned reading “felt like a different language” until she read AI summaries of the texts.

“Sometimes I feel bad using ChatGPT to summarize reading, because I wonder, is this cheating? Is helping me form outlines cheating? If I write an essay in my own words and ask how to improve it, or when it starts to edit my essay, is that cheating?”

Her class syllabi say things like: “Don’t use AI to write essays and to form thoughts,” she says, but that leaves a lot of grey area. Students say they often shy away from asking teachers for clarity because admitting to any AI use could flag them as a cheater.

Schools tend to leave AI policies to teachers, which often means that rules vary widely within the same school. Some educators, for example, welcome the use of Grammarly.com, an AI-powered writing assistant, to check grammar. Others forbid it, noting the tool also offers to rewrite sentences.

“Whether you can use AI or not depends on each classroom. That can get confusing,” says Valencia 11th grader Jolie Lahey. She credits Cuny with teaching her sophomore English class a variety of AI skills like how to upload study guides to ChatGPT and have the chatbot quiz them, and then explain problems they got wrong.

But this year, her teachers have strict “No AI” policies. “It’s such a helpful tool. And if we’re not allowed to use it that just doesn’t make sense,” Lahey says. “It feels outdated.”

Schools are introducing guidelines, gradually

Many schools initially banned use of AI after ChatGPT launched in late 2022. But views on the role of artificial intelligence in education have shifted dramatically. The term “AI literacy” has become a buzzword of the back-to-school season, with a focus on how to balance the strengths of AI with its risks and challenges.

Over the summer, several colleges and universities convened their AI task forces to draft more detailed guidelines or provide faculty with new instructions.

The University of California, Berkeley emailed all faculty new AI guidance that instructs them to “include a clear statement on their syllabus about course expectations” around AI use. The guidance offered language for three sample syllabus statements — for courses that require AI, ban AI in and out of class, or allow some AI use.

“In the absence of such a statement, students may be more likely to use these technologies inappropriately,” the email said, stressing that AI is “creating new confusion about what might constitute legitimate methods for completing student work.”

A student types a prompt into ChatGPT on a Chromebook during Casey Cuny’s English class at Valencia High School in Santa Clarita, Calif., Wednesday, Aug. 27, 2025. (AP Photo/Jae C. Hong)

Carnegie Mellon University has seen a huge uptick in academic responsibility violations due to AI, but often students aren’t aware they’ve done anything wrong, says Rebekah Fitzsimmons, chair of the AI faculty advising committee at the university’s Heinz College of Information Systems and Public Policy.

For example, one student who is learning English wrote an assignment in his native language and used DeepL, an AI-powered translation tool, to translate his work to English. But he didn’t realize the platform also altered his language, which was flagged by an AI detector.

Enforcing academic integrity policies has become more complicated, since use of AI is hard to spot and even harder to prove, Fitzsimmons said. Faculty are allowed flexibility when they believe a student has unintentionally crossed a line, but are now more hesitant to point out violations because they don’t want to accuse students unfairly. Students worry that if they are falsely accused, there is no way to prove their innocence.

Over the summer, Fitzsimmons helped draft detailed new guidelines for students and faculty that strive to create more clarity. Faculty have been told a blanket ban on AI “is not a viable policy” unless instructors make changes to the way they teach and assess students. A lot of faculty are doing away with take-home exams. Some have returned to pen and paper tests in class, she said, and others have moved to “flipped classrooms,” where homework is done in class.

Emily DeJeu, who teaches communication courses at Carnegie Mellon’s business school, has eliminated writing assignments as homework and replaced them with in-class quizzes done on laptops in “a lockdown browser” that blocks students from leaving the quiz screen.

“To expect an 18-year-old to exercise great discipline is unreasonable,” DeJeu said. “That’s why it’s up to instructors to put up guardrails.”




Why Every School Needs an AI Policy

AI is transforming the way we work, but without a clear policy in place, even the smartest technology can lead to costly mistakes, ethical missteps and serious security risks.

CREDIT: This is an edited version of an article that originally appeared in All Business

Artificial Intelligence is no longer a futuristic concept. It’s here, and it’s everywhere. From streamlining operations to powering chatbots, AI is helping organisations work smarter, faster and more efficiently.

According to G-P’s AI at Work Report, a staggering 91% of executives are planning to scale up their AI initiatives. But while AI offers undeniable advantages, it also comes with significant risks that organisations cannot afford to ignore. As AI continues to evolve, it’s crucial to implement a well-structured AI policy to guide its use within your school.

Understanding the Real-World Challenges of AI

While AI offers exciting opportunities for streamlining admin, personalising learning and improving decision-making in schools, the reality of implementation is more complex. The upfront costs of adopting AI tools can be high. Many schools, especially those with legacy systems, find it difficult to integrate new technologies smoothly without creating further inefficiencies or administrative headaches.

There’s also a human impact to consider. As AI automates tasks once handled by staff, concerns about job displacement and deskilling begin to surface. In an environment built on relationships and pastoral care, it’s important to question how AI complements rather than replaces the human touch.

Data security is another significant concern. AI in schools often relies on sensitive pupil data to function effectively. If these systems are compromised the consequences can be serious. From safeguarding breaches to trust erosion among parents and staff, schools must be vigilant about privacy and protection.

And finally, there’s the environmental angle. AI requires substantial computing power and infrastructure, which comes with a carbon cost. As schools strive to meet sustainability targets and educate students on climate responsibility, it’s worth considering AI’s footprint and the long-term environmental impact of widespread adoption.

The Role of an AI Policy in the Modern School

To navigate these issues responsibly, schools must adopt a comprehensive AI policy. This isn’t just a box-ticking exercise; it’s a roadmap for how your school will use AI ethically, securely and sustainably. A good AI policy doesn’t just address technology; it reflects your values, goals and responsibilities.

The first step in building your policy is to create a dedicated AI policy committee. This group should consist of senior leaders, board members, department heads and technical stakeholders. Their mission? To guide the safe and strategic use of AI across your school. The committee should be cross-functional so it can represent all school areas and raise practical concerns about how AI may affect people, processes and performance.

Protecting Privacy: A Top Priority

One of the most important responsibilities when implementing AI is protecting personal and corporate data. Any AI system that collects, stores, or processes sensitive data must be governed by robust security measures. Your AI policy should establish strict rules for what data can be collected, how long it can be stored and who has access. Use end-to-end encryption and multi-factor authentication wherever possible. And always ask: is this data essential? If not, don’t collect it.

Ethics Matter: Keep AI Aligned With Your Values

When creating an AI policy, you must consider how your principles translate to digital behaviour. Unfortunately, AI models can unintentionally amplify bias, especially when trained on datasets that lack diversity or were built without appropriate oversight. Plagiarism, misattribution and theft of intellectual property are also common concerns. Ensure your policy includes regular audits and bias detection protocols. Consult ethical frameworks such as those provided by the EU AI Act or OECD principles to ensure you’re building in fairness, transparency and accountability from day one.

The Bottom Line: Use AI to Support, Not Replace, Your Strengths

AI is powerful. But like any tool, its value depends on how you use it. With a strong, ethical policy in place, you can harness the benefits of AI without compromising your people, principles, or privacy.






School security on Long Island: New safety measures include AI-powered gun detection and wearable panic buttons

Lights flashed, alarms buzzed and an announcement blared through an electronic classroom display: “At this time we will be moving into a lockdown,” the voice warned.

The alert was part of a recent demonstration of a new security system installed at Plainedge High School in North Massapequa, where every district classroom now has sleek wall-mounted devices that will eventually livestream video and audio feeds to police during an emergency such as a mass shooting. The panels, officials said, will allow first responders to see directly into classrooms to determine the exact location of a shooter or any injured victims.

“I tell everyone on a daily basis that you’re going to send me your children, husbands, wives, sons, daughters, and they’re all going to come home safely,” schools Superintendent Edward A. Salina Jr. said.

School officials across Long Island have grappled for years with how best to protect students and staffers in the event of an attack. Concern over school safety was reignited late last month when a shooting during a Catholic school Mass in Minneapolis left two children dead and nearly two dozen others wounded.

WHAT NEWSDAY FOUND

  • Several Long Island school districts have added new security measures this year ranging from a weapon detection system powered by artificial intelligence to wearable panic buttons.
  • Security experts said that while evolving technology can be helpful, preventative measures such as frequent safety drills and trainings and mental health resources are also important.

  • Learning how to spot threats and having a safe way to report them should also be part of a district’s security strategy, experts said.

In the past, school security might have meant hiring armed guards or installing metal detectors. But recently, several Island districts have also introduced technological upgrades ranging from a weapon detection system powered by artificial intelligence to wearable panic buttons.

Brian Selltiz, co-founder of Digital Provisions in Ronkonkoma, said he works with about 60 school districts on the Island and many have recently added wearable panic buttons, mass alert systems and other tools to their security arsenal.

“The school staff and the vendors on Long Island that work in the space are extremely diligent in providing a layered approach that does everything … available or within reason to protect the students,” he said.

New security technology in schools

Nationwide, 134 people have been fatally shot or wounded on school properties so far this year, according to the K-12 School Shooting Database founded by David Riedman. Last year there were 276 such victims. None were on Long Island.

In an effort to keep students and staff safe, Plainedge teachers now carry wearable panic buttons on their badges and mounted versions of the technology have been installed throughout district buildings. The district also has added secure vestibule areas and plans to hire armed security.

Panic button systems like this one have been installed throughout the Plainedge school district. Credit: Newsday/Drew Singh

The Three Village school district, meanwhile, now has 600 cameras with ZeroEyes weapons detection, according to district spokeswoman Denise Nash. When a firearm is detected, a monitoring team is alerted and local authorities are notified, bypassing normal 911 dispatch, according to the company website.

The Westbury and Cold Spring Harbor districts earlier this year also equipped all staff with wearable panic buttons that automatically alert police of an emergency with precise location information, triggering an automatic lockdown. Similar devices were credited with minimizing the loss of life in a school shooting in Georgia last year that resulted in the deaths of two teachers and two students.

Tahira DuPree Chase, superintendent for the Westbury school district, said that since January the district has rolled out a number of security measures, including the wearable devices, new cameras and more security personnel. Officials also created secure vestibule areas that control access into the building, and added alarms that go off when doors are not fully closed or are improperly used.

Cold Spring Harbor schools Superintendent Joseph Monastero said in addition to the panic buttons, security cameras and fencing also were added throughout the district.

Costs for the new security measures range from around $100,000 in the Three Village district to $6.3 million for Westbury, according to school officials. The local Board of Cooperative Educational Services covered some costs, they said.

Preventative measures 

But local and national security experts said that while evolving technology can be helpful, a comprehensive safety strategy must also involve preventative measures. Frequent drills and training can help staff and students learn how to use security systems and respond quickly. Mental health resources can create a supportive and inclusive environment.

Learning how to spot threats and having a safe way to report them can also go a long way toward securing schools, experts said.

“A lot of these school incidents are linked to a mental health issue,” Chase said. “Prioritizing mental health awareness in schools and making sure that everyone knows what to do if they think someone is in crisis, and giving people a safe space to report if they need to — those things are important.” She noted that Westbury students can submit tips through an anonymous system.

Elyse Thulin, a research professor at the University of Michigan’s Institute for Firearm Injury Prevention, said that while there is no single profile of a mass shooter, in about three quarters of cases, the assailant left red flags behind.

“Probably 75% of perpetrators of mass harm display this thing called leakage, where they in some way disclose their plans or their intents. And often, what we’re seeing is that this is occurring in online spaces, but there are also in-person indicators that can potentially be observed,” Thulin said. “If people understand what those warning signs are and feel comfortable, feel empowered, and feel like we take it seriously, they’re likely to potentially report those.”

Thulin said concerning behaviors include bullying, aggression, name calling and withdrawal, in addition to suspicions of a planned attack.

Mo Canady, executive director of the Alabama-based National Association of School Resource Officers, said that with so much new technology available, a company’s policies and procedures are as critical as the systems themselves.

“The technology can be great, but if you don’t have that policy, procedure and training around it, then it can really become problematic,” Canady said.

He also stressed the human side of security.

“Every school should have a multidisciplinary school safety team that absolutely should include law enforcement, but it should also include school counselors, mental health specialists and school administrators,” Canady said. That team of staff, he said, could gather intelligence and go a long way toward preventing violence.

Kenneth Trump, president of the Cleveland-based National School Safety and Security Services, said in a time where vendors are aggressively marketing new products, schools should still be focused on emergency protocols, supervising students and teaching staff to make quick decisions under stress.

“There’s this heavy influence on looking at the target hardening, but the first and best line of defense is really a well trained, highly alert staff and student body,” said Trump, who is not related to the nation’s president.

“That means having planning for crisis situations. That means providing students the social and emotional, mental health supports they need to prevent them from getting to a tragedy in the first place,” he added.

Monastero, the Cold Spring Harbor superintendent, said ensuring a school’s security depends on everyone working together.

“If something doesn’t seem right, you need to say something,” he said. “We take care of the most precious packages … and we want to make sure we keep them as safe as possible.”


