Schools throughout Utah are grappling with how to properly use or restrict the use of artificial intelligence. Although schools in Washington County reportedly have a strategy for the immediate future, AI is currently evolving so fast that educators must constantly be open to change.
Chris Agnew, director of the Generative AI for Education Hub at Stanford University in California, recently said that when it comes to grappling with AI in education, “one of the most coherent statewide level strategies and approaches is Utah.”
The state government created a position overseeing AI in K-12 education statewide over a year ago, when the Utah State Board of Education hired Matthew Winters as its full-time AI specialist. Agnew said Utah can collect stronger data about how AI is used in schools and get a clearer picture of what is and isn’t working.
Dixie Technical College staff have recently discussed the issues related to AI development in education. Utah Tech University Provost Michael Lacourse spoke at length on this subject during the university’s convocation ceremony last month, shortly before a new cybersecurity degree program was unveiled for this semester.
Utah Tech Associate Professor of Computer Science Curtis Larsen told St. George News that it is “always risky” to speculate about the forms AI may take in the future. However, he said he believes that a few forms of AI seem likely, even though they may feel unfamiliar at first.
The first likely form, Larsen said, is truly personal agents.
“Not a generic chatbot, but a trusted assistant that learns your preferences, filters the firehose, summarizes what matters and takes routine actions across your apps,” he said.
“Think of it as a private curator aligned to your goals, not an ad-driven feed,” he added.
Larsen said he thinks that a second likely form will be clinical copilots and surgical robotics in the medical field. He said AI could be somewhat involved in procedures such as a laparoscopy, where a fiber-optic instrument is inserted into the body to examine certain organs.
“I don’t see AI replacing surgeons, but I do see systems that plan procedures, guide instruments, watch for safety issues and automate narrow tasks in laparoscopy and beyond — always with clinicians directing and reviewing the work,” he said. “Health care has higher stakes and tighter regulation than self-driving cars, so the pace will be measured, but task-level autonomy is plausible.”
Larsen said another likely form is what he called “backstage autonomy” across the economy. This means that company warehouses, labs, customer support and campus operations would be run by specialized AI agents operating “backstage” while humans would handle exceptions that require additional attention.
“These systems are less visible than chatbots but potentially more transformative,” he said, adding that, “across all of this, the guardrails matter.”
Those guardrails include privacy, provenance, liability and clear human oversight.
“The promise is productivity and focus; the risk is over-reliance,” Larsen said. “We should design for the former and protect against the latter.”
Lacourse had previously spoken about how the university will soon be using agentic AI to help students.
When asked about how he thinks this should be implemented, Larsen said he doesn’t think the answer is “a single, one-size-fits-all university chatbot.”
Utah Tech University Provost Michael Lacourse speaks at the Fall Academic Convocation on the university campus in St. George, Utah, Aug. 11, 2025.
Photo by Nick Fiala, St. George News
“The near-term path is specialized, domain-specific agents embedded in the tools students and faculty already use,” Larsen said. “General models are excellent for broad help, but they often miss program-level details. Coding agents are the clearest early example: they follow well-defined workflows and boost productivity, but they still require a developer to direct and review the work.”
Larsen said he expects agentic AI to show up at Utah Tech in three ways.
The first will be student-facing course agents that produce syllabi, rubrics and datasets, delivered to students through the university’s learning management systems. He hopes this will “free students to spend more time on the creative and conceptual work — without replacing fundamentals.”
Second, there will be AI faculty assistants that operate on real files by editing slides, aligning outcomes and assessments, proposing better exercises and flagging issues with pacing or flow.
Third, operational agents are expected to help with routine university workflows, with humans verifying the results.
“None of this is ‘set it and forget it,’” Larsen said. “Effective agents will be specialized, supervised and integrated with clear guardrails for data privacy and academic integrity.”
Larsen said that most of his colleagues in Utah Tech’s computing department use AI in some form, although they do so “primarily as a tutor or assistant, not an ‘easy button.’”
“We encourage students to use these tools to learn faster, and we’re redesigning assessments — especially in first-year courses — to verify understanding,” he said. “Because many traditional assignments can now be completed by AI, we’re shifting toward work that requires students to explain their reasoning, show their process and defend their code.”
He said that “AI lets us raise the bar” in upper-division classes, where students can accomplish more in the same amount of time with an AI assistant at their side, but are still accountable for the concepts and for every line they submit.
“They may type less, but they must understand more,” Larsen said. “As for misuse, it happens — just as with any new technology — but we’re keeping pace by emphasizing assessment design over detection. Oral check-ins, in-class problem-solving, version-control history, and code reviews make it harder to pass off AI’s work as your own and easier for us to confirm genuine learning.”
Blueprints and warning signs
Larsen said that there are three things that he wishes more people understood about AI.
First, he wants people to know that today’s AI systems are powerful pattern recognizers, but they are not minds.
“They generate fluent text by predicting likely words from training data; they don’t have intent, self-awareness or human-level understanding,” he said.
Second, he would like people to treat conversations with chatbots as if they are willingly sharing their personal data with the chatbot.
“Depending on the provider and your settings, prompts and files may be stored and used to improve the service — so don’t paste sensitive or confidential information,” he said.
Third, Larsen would like to remind people that “anyone speaking with certainty about AI’s long-term trajectory is making an educated guess — including me.”
“The field is moving fast, and timelines and impacts remain uncertain,” he said.
So, in light of that, what are the drawbacks of AI that people should be worried about or prepared to face in the future?
“AI delivers real value, but the risks are just as real,” Larsen said, adding that one of the risks with AI is quality, since “these systems can be fluent and confident while being wrong, and they can reflect biases in their training data.”
He said that subject-matter expertise and verification are non-negotiable, since “you need to tell the jewels from the junk.”
Another concern with AI is data ownership.
“Prompts and files can be stored or reused depending on the service, and source attribution and copyright remain unsettled,” he said. “Treat anything you paste as shared data and favor tools with clear enterprise controls.”
In the long term, there is a risk that online content will degrade after years of overreliance on low-quality AI. If you look up the answer to a problem online, the AI-generated answers or search results you receive may be much less reliable, since those answers could themselves be based on data sloppily compiled with AI.
“As more online material is AI-generated, models can end up training on their own outputs — a feedback loop that degrades quality unless we curate high-quality, human-verified datasets,” Larsen said.
The final risk of AI that Larsen pointed out is over-reliance.
“If we let AI drive the process, we can slowly give up judgment and skill,” he said. “The safeguard is straightforward: keep AI in an assistive role, require human review for consequential decisions, and design workflows that keep people — and their expertise — in charge.”
Editor’s Note: This article was written for Mosaic, an independent journalism training program for high school students who report and photograph stories under the guidance of professional journalists.
As a 16-year-old high school student living in the Bay Area, I notice artificial intelligence being used around me daily. At my school, I’ve seen students submitting AI-generated work as their own, rather than taking the time to research, write and truly understand content.
Teachers see this and fear that students will go out in the real world and not know how to think critically without consulting a machine first. They see how it has already created an overreliance on shortcuts, weaker problem-solving skills and lower writing ability in their students. In response, many teachers have banned AI from their classrooms.
I’ll be honest: sometimes I use AI in school, too. By asking ChatGPT to help me explain the meaning of a piece of text or asking it to identify flaws in my writing, I intentionally use AI to help me learn. But when some of my peers use it to generate their entire assignments, it leads teachers to see any use of AI — whether it be productive or exploitative — as a lazy way to cheat, and it makes me feel guilty to use it at all.
It’s understandable why many educators feel this way, and researchers agree. For example, a 2024 study published in the journal Societies surveyed 666 people and found that younger participants reported higher use of AI tools but displayed lower critical-thinking skills.
However, AI isn’t going away. Other studies show that the use of AI in the workplace and in education is rising. A 2025 Gallup poll of U.S. employees found that the share who use AI multiple times a week nearly doubled in two years, from 11% to 19%. It’s only becoming more integrated into the world we live in, and without being taught AI literacy in school, the future feels uncertain.
Completely banning AI from classrooms doesn’t work in the long term, because students still find ways around the restrictions and misuse it for their assignments. It’s easy for them to adapt by using content humanizers like Bypass GPT or AIHumanize to avoid detection. Banning it does not solve the problem.
A ban also erases the opportunities that AI can give students to do better in school. For example, an English teacher might have students generate an essay using AI, then have them critique its writing style and argument, or compare it to their own essay to identify areas that can be improved.
Khanmigo, an AI platform developed by Khan Academy, can give students more practice problems in math when they struggle with a particular concept, and can work with them to learn, rather than handing them answers.
This is a chance for educators to teach students how to use AI responsibly — not as a substitute for creative and critical thought, but as a tool to support them in academics instead. AI is still a developing technology that presents ethical issues, like its substantial environmental impact and potential biases that can be introduced by algorithms. It’s also known to not always be reliable for credible information and research.
But I rarely see these issues being discussed around me — it’s a missed opportunity for teachers to encourage digital literacy and for students to engage with a technology that will ultimately shape our world.
Sophie Luo is a member of the class of 2027 at Irvington High School in Fremont.
Launched as part of ACSES’ Australian Student Equity Symposium in Sydney, Equity Insights 2025: Policy, Power, and Practice for a Fairer Australian Tertiary Education System shares the views of vice-chancellors, policymakers, practitioners, and students, who examine why, despite substantial investment and effort, equity progress in Australian higher education remains modest.
Shamit Saggar, executive director of ACSES, said the purpose of the report is to shed light on the challenge of student equity, the responsibilities involved, and the progress being made.
“The report gathers 14 perspectives from key figures involved in higher education policy, university sector management, equity practice, student experience, and academic expertise. Each of these contributions reflects distinct elements of the task facing the sector,” said Saggar.
The report spans three themed sections: Rewriting the System: Policy, Structures and Reform; Power, Voice and Justice; and Making Equity Real: Practice, Place and Participation.
ACSES research and policy program director, Ian Li, said the report discusses the actions required across a broad range of areas. “It highlights both the systemic reforms required and the everyday practices that can make a real difference in the lives of students,” said Li.
The report argues that higher education is still shaped by entrenched class hierarchies, colonial legacies, and rigid divisions between vocational and university pathways, and that incremental reforms have not, and will not, deliver the impact required for the nation to meet the ambitious target of 80% tertiary attainment by 2050, with full parity for underrepresented groups.
Contributors also emphasise that cost-of-living pressures, housing stress, and mental health challenges are not peripheral concerns but central to whether students can complete their studies, and are just as important as reforming admissions or curriculum.
The longstanding divide between vocational education and universities was raised as a key issue, with calls for a harmonised system that allows students to move more easily between sectors and provides more flexible entry and exit points.
The report outlined measures to increase access for regional and remote students, including creating regional study hubs, tailored funding, and localised support outside metropolitan areas. It also called for leveraging the success of regional universities that already serve high proportions of equity students by using them as models for scaling equity.
Disability inclusion, meanwhile, must move beyond individual adjustments toward accessible curriculum and learning environments designed from the outset.
Indigenous contributors Leanne Holt and Tracy Woodroffe called for universities to move beyond transactional support and embed Indigenous leadership and cultural safety at every level of governance, teaching, and research.
Universities were also issued a warning that using AI and technology as a quick fix for equity risks widening the divide further. Instead, the report suggested equity must shape how AI is integrated through universal digital access, AI literacy, and student co-design.
The report also contained early insights from the new National Student Ombudsman, launched in February 2025, revealing strong demand for independent complaint resolution, especially on course administration, wellbeing, and financial issues. Sarah Bendall, who leads the office as First Assistant Ombudsman, argues this is proof that accountability must become a sector-wide priority.
While each section of the report contains unique perspectives and experiences, the overall message of Equity Insights 2025 is not simply to do more but to do it differently, and it calls for bold leadership across the entire sector.
Effective as of September 6, all nonimmigrant US visa applicants including international students must schedule interviews at their local US embassy or consulate, or face an increased risk of rejection, the State Department has announced.
“Applicants who scheduled nonimmigrant interviews at a US embassy or consulate outside of their country of nationality or residence might find that it will be more difficult to qualify for the visa,” the department warned.
Fees paid for such applications will not be refunded and cannot be transferred, said the department, adding that applicants would have to demonstrate residence in the country where they are applying.
The directive puts an end to a pandemic-era practice of students bypassing long wait times at home by scheduling visa appointments from a third country.
Stakeholders have raised concerns that the new rule could exacerbate wait times and disadvantage students whose local embassies are plagued by delays.
According to State Department data updated last month, individuals applying for student and exchange visitor visas at the US embassy in Abuja, Nigeria, currently must wait 14 months for an interview.
Meanwhile, the next available F, M and J visa appointments at the consulates in Accra, Ghana, and Karachi, Pakistan, are 11 and 10.5 months away, respectively.
The new rule comes after a near month-long pause on new student visa interviews this summer saw major delays and cancelled appointments at embassies across the globe, preventing some international students from enrolling at US colleges this semester.
Enhanced social media vetting for all student visa applications is also believed to be contributing to the delays.
Existing nonimmigrant visa appointments “will generally not be cancelled,” the department said, adding: “Rare exceptions may also be made for humanitarian or medical emergencies or foreign policy reasons.”
Nationals of countries where the US is not conducting routine visa operations have been instructed to apply at the following designated alternatives:
Afghanistan: Islamabad
Belarus: Vilnius, Warsaw
Chad: Yaoundé
Cuba: Georgetown
Haiti: Nassau
Iran: Dubai
Libya: Tunis
Niger: Ouagadougou
Russia: Astana, Warsaw
Somalia: Nairobi
South Sudan: Nairobi
Sudan: Cairo
Syria: Amman
Ukraine: Krakow, Warsaw
Venezuela: Bogota
Yemen: Riyadh
Zimbabwe: Johannesburg
The State Department did not immediately respond to The PIE’s request for comment.