Education

State Department layoffs deal “blow” to US cultural exchange 

  • Some 40 employees at the Bureau of Educational and Cultural Affairs were caught up in mass State Department layoffs, on top of an initial 50 who left the department voluntarily earlier this year.
  • Stakeholders warn educational and cultural exchange programs are at risk, as the US’s international education sector takes a hit under the Trump administration’s ‘America First’ policy.
  • Amid heavy losses, stakeholders await the contents of President Trump’s upcoming 2026 budget with bated breath.

Included in the 1,300 redundancies were roughly 40 employees of the Bureau of Educational and Cultural Affairs (ECA), which heads up the country’s educational and cultural exchange programs.  

Combined with the 50 ECA staff members who already took voluntary departure earlier this year, the losses amount to nearly 20% of the bureau, with stakeholders warning of grave damage to the global partnerships that underpin international cultural exchange.

“Diplomacy and exchanges don’t just build people-to-people relationships – they depend on them,” Mark Overmann, executive director of the Alliance for International Exchange, told The PIE News.

“The loss of so many dedicated State and ECA colleagues last week is a real blow to our community and our ability to build people-to-people relationships,” he said, adding: “working to make America safer and stronger just got that much harder.” 

State Department employees were notified of the layoffs on Friday July 11, with the cuts comprising 1,107 civil service and 246 foreign service employees with domestic assignments.

The reduction in force (RIF) followed a Supreme Court ruling earlier in the week lifting the temporary court injunction against mass firings of federal staff.

Overall, about 15% of the workforce is estimated to have been laid off, as the department undergoes an “historic reorganisation” to align with the “America First” foreign policy priorities announced by Secretary of State Marco Rubio on April 22, according to the department.

The firing process has been mired in confusion, with the lifting of the temporary court order creating chaos, and reports of emails delivered to staff at different times due to limitations of federal systems.  

According to internal communications seen by The PIE, ECA senior bureau official Darren Beattie assured all ECA and public diplomacy staff on July 14 that there were “no plans” to conduct any additional domestic RIFs.

A source close to the situation told The PIE that the full ECA policy office had been slashed, and half of both the Office of International Visitors (OIV) and the Speakers Program Office.  

They added that roughly 50 staff had already left ECA during the voluntary departures earlier this year, resulting in a total loss of about 18% for the bureau that runs educational and cultural exchanges. 

Overall department losses including departures earlier this year total approximately 3,000, with the stated aim of restoring the department to “results-driven diplomacy” powered primarily by overseas posts and Washington regional bureaus, a State Department spokesperson told The PIE. 

While the spokesperson maintained that security functions, consular services and regional desk responsibilities remain “fully operational”, colleagues in government and across higher education have warned of the damage to America’s national security and soft power.  

“The loss of members in the diplomatic workforce will absolutely affect the education abroad community’s ability to grow, thrive and keep students safe,” Melissa Torres, president of the Forum on Education Abroad, told The PIE.  

While it remains to be seen whether the layoffs will have a direct impact on programs such as the Fulbright and Gilman scholarships, Torres emphasised the importance of strong global partnerships and nuanced cultural understanding that will be damaged by the cuts.  

“To set the foundation that enables those relations, you need skilled diplomats to offer strategic direction and guidance,” she said, adding that students and the country would “suffer” as these “career-defining” opportunities are threatened.  

The American Foreign Service Association has said it opposed the decision “in the strongest terms”, calling the cuts a “catastrophic blow” to US national interests.

“In less than six months, the US has shed at least 20% of its diplomatic workforce through shuttering of institutions and forced resignations,” it said in a statement on July 11, advocating for the importance of non-partisan diplomacy.  

Democratic senators argued in a statement that the administration “must invest in our diplomatic corps and national security experts – not erode the institutions that protect our interests, promote US values and keep Americans abroad safe”.

“As the US retreats, our adversaries – like the People’s Republic of China – are expanding their diplomatic reach, making Americans less safe and less prosperous,” they continued, claiming the “blanket and indiscriminate” cuts were a legacy of “Elon Musk’s failed DOGE efforts”.

For its part, the State Department has maintained the reorganisation will “better align [its] workforce activities and programs with the America First foreign policy priorities”.  

The cuts come at a time of heightened demand on the State Department, which absorbed 20% of programs from the recently dismantled US Agency for International Development (USAID) this July.

As stakeholders await the president’s upcoming FY2026 budget, study abroad colleagues welcomed a new appropriations bill passed this week proposing a 22% cut to the State Department’s budget rather than the 93% cut initially requested by the president.




AI cheating in US schools prompts shift to in-class assessments, clearer policies

Artificial intelligence is reshaping education in the United States, forcing schools to rethink how students are assessed.

Traditional homework like take-home essays and book reports is increasingly being replaced by in-class writing and digital monitoring. The rise of AI has blurred the definition of honest work, leaving both teachers and students grappling with new challenges.

California English teacher Casey Cuny, a 2024 Teacher of the Year, said, “The cheating is off the charts. It’s the worst I’ve seen in my entire career.”

He added that teachers now assume any work done at home may involve AI. “We have to ask ourselves, what is cheating? Because I think the lines are getting blurred.”

SHIFT TO IN-CLASS ASSESSMENTS

Across schools, teachers are designing assignments that must be completed during lessons. Oregon teacher Kelly Gibson explained, “I used to give a writing prompt and say, ‘In two weeks I want a five-paragraph essay.’ These days, I can’t do that. That’s almost begging teenagers to cheat.”

Students themselves are unsure how far they can go. Some use AI for research or editing, but question whether summarising readings or drafting outlines counts as cheating.

College student Lily Brown admitted, “Sometimes I feel bad using ChatGPT to summarise reading, because I wonder, is this cheating?”

POLICY CONFUSION AND NEW GUIDELINES

Guidance on AI use varies widely, even within the same school. Some classrooms encourage AI-assisted study, while others enforce strict bans. Valencia 11th grader Jolie Lahey called it “confusing” and “outdated.”

Universities are also drafting clearer rules. At the University of California, Berkeley, faculty are urged to state expectations on AI in syllabi. Without clarity, administrators warn, students may use tools inappropriately.

At Carnegie Mellon University, rising cases of academic responsibility violations have prompted a rethink. Faculty have been told that outright bans “are not viable” unless assessment methods change.

Many courses are now shifting to in-class digital quizzes, lockdown browsers, or pen-and-paper exams.

Emily DeJeu, who teaches at Carnegie Mellon’s business school, stressed, “To expect an 18-year-old to exercise great discipline is unreasonable. That’s why it’s up to instructors to put up guardrails.”

The debate continues as schools balance innovation with integrity, shaping how the next generation learns in an AI-driven world.

(With inputs from AP)


Published by Princy Shukla on Sep 12, 2025




‘Cheating is off the charts’: AI tools reshape how students learn and study

The book report is now a thing of the past. Take-home tests and essays are becoming obsolete.

Student use of artificial intelligence has become so prevalent, high school and college educators say, that to assign writing outside of the classroom is like asking students to cheat.

“The cheating is off the charts. It’s the worst I’ve seen in my entire career,” says Casey Cuny, who has taught English for 23 years. Educators are no longer wondering if students will outsource schoolwork to AI chatbots. “Anything you send home, you have to assume is being AI’ed.”

The question now is how schools can adapt, because many of the teaching and assessment tools that have been used for generations are no longer effective. As AI technology rapidly improves and becomes more entwined with daily life, it is transforming how students learn and study and how teachers teach, and it’s creating new confusion over what constitutes academic dishonesty.

“We have to ask ourselves, what is cheating?” says Cuny, a 2024 recipient of California’s Teacher of the Year award. “Because I think the lines are getting blurred.”

Cuny’s students at Valencia High School in southern California now do most writing in class. He monitors student laptop screens from his desktop, using software that lets him “lock down” their screens or block access to certain sites. He’s also integrating AI into his lessons and teaching students how to use AI as a study aid “to get kids learning with AI instead of cheating with AI.”

In rural Oregon, high school teacher Kelly Gibson has made a similar shift to in-class writing. She is also incorporating more verbal assessments to have students talk through their understanding of assigned reading.

“I used to give a writing prompt and say, ‘In two weeks, I want a five-paragraph essay,’” says Gibson. “These days, I can’t do that. That’s almost begging teenagers to cheat.”

Take, for example, a once typical high school English assignment: Write an essay that explains the relevance of social class in “The Great Gatsby.” Many students say their first instinct is now to ask ChatGPT for help “brainstorming.” Within seconds, ChatGPT yields a list of essay ideas, plus examples and quotes to back them up. The chatbot ends by asking if it can do more: “Would you like help writing any part of the essay? I can help you draft an introduction or outline a paragraph!”

Casey Cuny, an English teacher at Valencia High School, works on his computer as he prepares for class in Santa Clarita, Calif., Wednesday, Aug. 27, 2025. (AP Photo/Jae C. Hong)

Students are uncertain when AI usage is out of bounds

Students say they often turn to AI with good intentions for things like research, editing or help reading difficult texts. But AI offers unprecedented temptation, and it’s sometimes hard to know where to draw the line.

College sophomore Lily Brown, a psychology major at an East Coast liberal arts school, relies on ChatGPT to help outline essays because she struggles putting the pieces together herself. ChatGPT also helped her through a freshman philosophy class, where assigned reading “felt like a different language” until she read AI summaries of the texts.

“Sometimes I feel bad using ChatGPT to summarize reading, because I wonder, is this cheating? Is helping me form outlines cheating? If I write an essay in my own words and ask how to improve it, or when it starts to edit my essay, is that cheating?”

Her class syllabi say things like: “Don’t use AI to write essays and to form thoughts,” she says, but that leaves a lot of grey area. Students say they often shy away from asking teachers for clarity because admitting to any AI use could flag them as a cheater.

Schools tend to leave AI policies to teachers, which often means that rules vary widely within the same school. Some educators, for example, welcome the use of Grammarly.com, an AI-powered writing assistant, to check grammar. Others forbid it, noting the tool also offers to rewrite sentences.

“Whether you can use AI or not depends on each classroom. That can get confusing,” says Valencia 11th grader Jolie Lahey. She credits Cuny with teaching her sophomore English class a variety of AI skills like how to upload study guides to ChatGPT and have the chatbot quiz them, and then explain problems they got wrong.

But this year, her teachers have strict “No AI” policies. “It’s such a helpful tool. And if we’re not allowed to use it that just doesn’t make sense,” Lahey says. “It feels outdated.”

Schools are introducing guidelines, gradually

Many schools initially banned use of AI after ChatGPT launched in late 2022. But views on the role of artificial intelligence in education have shifted dramatically. The term “AI literacy” has become a buzzword of the back-to-school season, with a focus on how to balance the strengths of AI with its risks and challenges.

Over the summer, several colleges and universities convened their AI task forces to draft more detailed guidelines or provide faculty with new instructions.

The University of California, Berkeley emailed all faculty new AI guidance that instructs them to “include a clear statement on their syllabus about course expectations” around AI use. The guidance offered language for three sample syllabus statements — for courses that require AI, ban AI in and out of class, or allow some AI use.

“In the absence of such a statement, students may be more likely to use these technologies inappropriately,” the email said, stressing that AI is “creating new confusion about what might constitute legitimate methods for completing student work.”

A student types a prompt into ChatGPT on a Chromebook during Casey Cuny’s English class at Valencia High School in Santa Clarita, Calif., Wednesday, Aug. 27, 2025. (AP Photo/Jae C. Hong)

Carnegie Mellon University has seen a huge uptick in academic responsibility violations due to AI, but often students aren’t aware they’ve done anything wrong, says Rebekah Fitzsimmons, chair of the AI faculty advising committee at the university’s Heinz College of Information Systems and Public Policy.

For example, one student who is learning English wrote an assignment in his native language and used DeepL, an AI-powered translation tool, to translate his work to English. But he didn’t realize the platform also altered his language, which was flagged by an AI detector.

Enforcing academic integrity policies has become more complicated, since use of AI is hard to spot and even harder to prove, Fitzsimmons said. Faculty are allowed flexibility when they believe a student has unintentionally crossed a line, but are now more hesitant to point out violations because they don’t want to accuse students unfairly. Students worry that if they are falsely accused, there is no way to prove their innocence.

Over the summer, Fitzsimmons helped draft detailed new guidelines for students and faculty that strive to create more clarity. Faculty have been told a blanket ban on AI “is not a viable policy” unless instructors make changes to the way they teach and assess students. A lot of faculty are doing away with take-home exams. Some have returned to pen and paper tests in class, she said, and others have moved to “flipped classrooms,” where homework is done in class.

Emily DeJeu, who teaches communication courses at Carnegie Mellon’s business school, has eliminated writing assignments as homework and replaced them with in-class quizzes done on laptops in “a lockdown browser” that blocks students from leaving the quiz screen.

“To expect an 18-year-old to exercise great discipline is unreasonable,” DeJeu said. “That’s why it’s up to instructors to put up guardrails.”


Why Every School Needs an AI Policy

AI is transforming the way we work, but without a clear policy in place, even the smartest technology can lead to costly mistakes, ethical missteps and serious security risks.

CREDIT: This is an edited version of an article that originally appeared in All Business

Artificial Intelligence is no longer a futuristic concept. It’s here, and it’s everywhere. From streamlining operations to powering chatbots, AI is helping organisations work smarter, faster and more efficiently.

According to G-P’s AI at Work Report, a staggering 91% of executives are planning to scale up their AI initiatives. But while AI offers undeniable advantages, it also comes with significant risks that organisations cannot afford to ignore. As AI continues to evolve, it’s crucial to implement a well-structured AI policy to guide its use within your school.

Understanding the Real-World Challenges of AI

While AI offers exciting opportunities for streamlining admin, personalising learning and improving decision-making in schools, the reality of implementation is more complex. The upfront costs of adopting AI tools can be high, and many schools, especially those with legacy systems, find it difficult to integrate new technologies smoothly without creating further inefficiencies or administrative headaches.

There’s also a human impact to consider. As AI automates tasks once handled by staff, concerns about job displacement and deskilling begin to surface. In an environment built on relationships and pastoral care, it’s important to question how AI complements rather than replaces the human touch.

Data security is another significant concern. AI in schools often relies on sensitive pupil data to function effectively, and if these systems are compromised, the consequences can be serious. From safeguarding breaches to erosion of trust among parents and staff, schools must be vigilant about privacy and protection.

And finally, there’s the environmental angle. AI requires substantial computing power and infrastructure, which comes with a carbon cost. As schools strive to meet sustainability targets and educate students on climate responsibility, it’s worth considering AI’s footprint and the long-term environmental impact of widespread adoption.

The Role of an AI Policy in Modern Schools

To navigate these issues responsibly, schools must adopt a comprehensive AI policy. This isn’t just a box-ticking exercise; it’s a roadmap for how your school will use AI ethically, securely and sustainably. A good AI policy doesn’t just address technology; it reflects your values, goals and responsibilities.

The first step in building your policy is to create a dedicated AI policy committee. This group should consist of senior leaders, board members, department heads and technical stakeholders. Their mission? To guide the safe and strategic use of AI across your school. The committee should be cross-functional so it can represent all school areas and raise practical concerns about how AI may affect people, processes and performance.

Protecting Privacy: A Top Priority

One of the most important responsibilities when implementing AI is protecting personal and institutional data. Any AI system that collects, stores or processes sensitive data must be governed by robust security measures. Your AI policy should establish strict rules for what data can be collected, how long it can be stored and who has access. Use end-to-end encryption and multi-factor authentication wherever possible. And always ask: is this data essential? If not, don’t collect it.
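Rules like these can even be made machine-checkable. As a minimal sketch, assuming a school wanted to encode its own rules programmatically (all category names, field names and roles below are hypothetical illustrations, not drawn from any real product or the article), a small allow-list structure could validate each data request against explicit retention and access limits:

```python
from dataclasses import dataclass

# Hypothetical policy: which data categories an AI tool may collect,
# how long each may be retained, and which roles may access it.
# Sensitive categories are simply absent: if it isn't essential, it isn't collected.
POLICY = {
    "attendance": {"retention_days": 365, "allowed_roles": {"teacher", "admin"}},
    "assessment_scores": {"retention_days": 730, "allowed_roles": {"teacher"}},
}

@dataclass
class DataRequest:
    category: str        # what data the tool wants to collect
    retention_days: int  # how long it wants to keep it
    role: str            # who is asking

def is_permitted(req: DataRequest) -> bool:
    """Allow a request only if its category is explicitly listed, its
    retention stays within the limit, and the requester's role is authorised."""
    rule = POLICY.get(req.category)
    if rule is None:
        return False  # unlisted category: not essential, so not collected
    return (req.retention_days <= rule["retention_days"]
            and req.role in rule["allowed_roles"])

print(is_permitted(DataRequest("attendance", 180, "teacher")))   # True
print(is_permitted(DataRequest("health_records", 30, "admin")))  # False: unlisted
```

The design choice here mirrors the advice in the paragraph above: the default answer is "no", and every permitted use has to be written down explicitly.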

Ethics Matter: Keep AI Aligned With Your Values

When creating an AI policy, you must consider how your principles translate to digital behaviour. Unfortunately, AI models can unintentionally amplify bias, especially when trained on datasets that lack diversity or were built without appropriate oversight. Plagiarism, misattribution and theft of intellectual property are also common concerns. Ensure your policy includes regular audits and bias detection protocols, and consult ethical frameworks such as the EU AI Act or the OECD AI Principles to ensure you’re building in fairness, transparency and accountability from day one.

The Bottom Line: Use AI to Support, Not Replace, Your Strengths

AI is powerful. But like any tool, its value depends on how you use it. With a strong, ethical policy in place, you can harness the benefits of AI without compromising your people, principles, or privacy.
