Professor calls for government to begin AI education earlier

Photo: AFP / Jonathan Raa
A digital education professor has called on the government to be bolder with its plan to bring in artificial intelligence as a subject in schools.
The government has announced a number of new secondary school subjects for years 11 to 13, which will begin rolling out from 2028.
They include a new year 13 subject on generative AI, earmarked for later development.
Announcing the new secondary school subjects to media, Education Minister Erica Stanford said there would be a new emphasis on artificial intelligence (AI).
“With the rapid development of AI, students will be able to learn about generative AI. This may include learning about how digital systems work, machine learning, cybersecurity, and digital ethics.
“I’ve asked the Ministry [of Education] to investigate a new specialised Year 13 subject on generative AI for later development.”
Education Minister Erica Stanford
Canterbury University Associate Professor of Digital Education Kathryn MacCallum said children needed to start learning about AI before year 13.
“Why does it sit at year 13, and why aren’t we doing this a lot earlier? It misses the point that a lot of students, even younger ones, are engaging with AI.
“If we’re leaving it to year 13 to engage with this, how does it set them up to use it appropriately?” she said.
MacCallum said if the government was going to focus on AI as a year 13 topic, it was too late.
“Looking at the refresh of the curriculum, we should be actually starting with an explicit engagement of AI from year 1, and we should be tailing this with digital literacy.
Professor Kathryn MacCallum
Photo: University of Canterbury
“So AI literacy to some degree sits as a separate subject or a separate focus, but it also needs to dovetail into being good citizens, and being able to be digitally literate, and supporting students to engage in a society that is so much more around digital.
“So I think we also have a responsibility to also be really focused on how do we get all students to be more engaged in the digital side and that’s not just coding, it needs to be about the broader spectrum of digital,” she said.
MacCallum also urged the government to be broad when it began developing the framework.
“I know that some of the commentary is about generative AI, but if we’re going to do it, we need to be very broad about what we’re trying to engage our students in, and about understanding what AI is in its broad nature.
“We should be talking about how it works and not just the technology, so we shouldn’t be just saying, OK, [here is] AI technology, these [are the] tools, and how do we use it in this context.
“We should be explaining to the students and helping them understand where AI sits and, equally, where it shouldn’t sit… That’s why we need to start early: part of navigating the space is knowing when AI should sit in the process, but equally when it shouldn’t, how it’s manipulating us, and equally how it can be useful in certain spaces.”
MacCallum said she supported the government’s move to introduce AI into schools and wanted a framework in place as soon as possible.
Other school subjects announced on Thursday include mechanical engineering, building and construction, infrastructure engineering, civics, politics and philosophy, Pacific studies and primary industry.
Stanford said the new subjects reflected the growing importance of science, technology, engineering and maths.
RNZ has approached her office for more comment on Professor MacCallum’s views.
Concern over teacher numbers
Dr Nina Hood is a former secondary school teacher and founder of the Teachers Institute, which provides in-school training for prospective teachers and also upskills existing ones.
She told First Up she didn’t think there were currently enough specialist teachers to teach new subjects being introduced by the government.
She said some work would need to be done to build capacity across the teaching force to be able to offer the subjects.
That could involve bringing new people into the profession or partnering with industry.
Devaluation of outdoor education – principal
Mount Aspiring College principal Nicola Jacobson said high school students who were looking to continue on to university were being discouraged from studying outdoor education under the new curriculum.
Outdoor education would lose its University Entrance academic status, becoming a vocational subject instead.
Jacobson said the move would narrow students’ academic focus and devalue the subjects that no longer counted towards University Entrance.
“What the government is proposing is very distinct lists between vocational and academic. Students might develop a perception around a pathway – or parents might develop a perception around a pathway – when we need people who are able to do all of those things and feel valued in the skills and knowledge that they have,” Jacobson said.
Jacobson said equally valuing all subjects under NCEA allowed a greater variety of skills and achievement to be reflected within the qualification.
Lethality, innovation, and transformation through AI education at the U.S. Army School of Advanced Military Studies

THE ARMY UNIVERSITY, FORT LEAVENWORTH, Kansas – In late July 2025, the Advanced Military Studies Program at the School of Advanced Military Studies, known as SAMS, launched its first-ever experimental, three-day, Practical Application of Artificial Intelligence module.
The mission was simple: transform the program with an innovative, hands-on AI learning experience for students and faculty. The purpose was to enable warfighter lethality through AI education and training.
“AI is changing the character of warfare. Our graduates have got to be ready to lead formations powered by AI—and that’s why we did something about it,” Col. Dwight Domengeaux, Director, SAMS said.
Dr. Bruce Stanley, Director, AMSP, envisioned a module that pushed institutional norms about how mid-career officers learn about AI and learn with AI.
“Did we accept risk? Yes. We did—to create a critical learning opportunity for our students,” Stanley remarked. “We knew what was at stake, and we trusted our faculty and students to make it work.”
And make it work they did.
According to AMSP faculty, the module’s experimental instructional design was key, consisting of ten-and-a-half hours of total classroom contact time divided over three lessons.
“We covered a lot of ground with our students in three days,” Dr. Jacob Mauslein, associate professor, AMSP, said. “Subjects ranged from AI theory and ethical considerations of AI, to applying AI tools, and leading AI-enabled organizations.”
A novel feature of the module was that it was developed by AMSP students. As a task in their Future Operational Environment course, six students from the Class of 2025, mentored by two faculty, developed the AI module that would be taught to the Class of 2026. The students’ final draft was adopted almost without change by the faculty.
“Incorporating students as full participants in the process allowed us to co-develop lesson objectives and materials that deeply mattered to them,” Dr. Luke Herrington, one of the faculty leads for the module shared.
Meeting students where they were in terms of their AI skills and then taking them to the next level was part of the academic approach for the AI module, Herrington explained.
Maj. Justin Webb, PhD, an AY 2025 AMSP student, and one of the module’s developers explained it this way: “SAMS is a warfighting school—so we chose learning activities that would help us become more lethal warfighters with AI. Using AI tools like CamoGPT, Ask Sage, and others for several hours over three days helped us get there.”
Some students in the AY 2026 class were initially skeptical of using AI.
“At first, I didn’t know what I didn’t know,” Army Maj. Stuart Allgood, an Armor officer and SAMS student, said. “But by the end of the first day my thinking about AI had changed. After the second day, I could use AI tools I had never even heard of.”
Maj. Callum Knight, an intelligence officer from the United Kingdom, summed up his experience.
“Before this course I viewed AI as just a data point,” Knight said. “Now that I’ve experienced what’s possible with AI, I realize it’s an imperative that is going to impact everything I do going forward.”
So, what’s next for AI at SAMS?
“Based on what our students got out of this, we intend to add more AI learning moments across the program,” Stanley said. “The priority now is to integrate AI into our upcoming operational warfare practical exercise.”
AMSP is one of the three distinct academic programs within SAMS.
The other two SAMS programs are the Advanced Strategic Leadership Studies Program (ASLSP), a Senior Service College equivalent, and the Advanced Strategic Planning and Policy Program (ASP3), also known as the Goodpaster Scholars, a post-graduate degree program.
Matthew Yandura is an AMSP assistant professor, and retired Army colonel.
Date Taken: 08.29.2025
Date Posted: 09.11.2025 13:34
Story ID: 547863
Location: FORT LEAVENWORTH, KANSAS, US
PUBLIC DOMAIN
This work, Lethality, innovation, and transformation through AI education at the U.S. Army School of Advanced Military Studies, must comply with the restrictions shown on https://www.dvidshub.net/about/copyright.
AI in education: Most teachers aware of ChatGPT but few confident in classroom use

Artificial intelligence is reshaping education, but most teachers in American classrooms remain unprepared to use it effectively. A new study by researchers from the University of Tennessee reveals sharp divides in awareness, usage, and confidence among elementary and secondary educators.
Published in Education Sciences, the study “AI Literacy: Elementary and Secondary Teachers’ Use of AI-Tools, Reported Confidence, and Professional Development Needs” surveyed 242 teachers across grades 3–12 in the southeastern United States. The findings underscore how quickly AI tools like ChatGPT are making their way into schools, while also exposing the challenges teachers face in adapting to them.
Awareness and Use of AI Tools
The research shows that while most teachers have heard of AI writing tools, fewer than half actively use them in the classroom. ChatGPT emerged as the most widely recognized and utilized platform, followed by Grammarly, Magic School, and Brisk. Secondary teachers consistently reported higher levels of familiarity, understanding, and actual use than their elementary counterparts.
Overall, 92 percent of respondents were aware of ChatGPT, but only 47 percent reported using AI in their teaching. Among those who did, AI was employed primarily for lesson planning, assessment design, feedback generation, and text differentiation for diverse learners. Some teachers also relied on AI for professional tasks such as drafting emails or creating instructional visuals. Despite these uses, adoption remains uneven, with elementary educators often citing developmental appropriateness and tool complexity as barriers.
The survey found that 80 percent of secondary teachers recognized AI tutoring systems, compared to just over half of elementary teachers. Awareness of grading and assessment tools such as Turnitin and Gradescope was also significantly higher among secondary teachers. This divide suggests that students in higher grades are more likely to experience AI-enhanced instruction than their younger peers, potentially widening existing gaps in exposure to technology.
Confidence and Classroom Challenges
Confidence proved to be a decisive factor in whether teachers integrated AI. Secondary educators reported greater self-assurance across all categories, from lesson planning and grading to communicating with families and districts about AI use. Elementary teachers scored consistently lower, especially in areas like troubleshooting AI-related issues or explaining integration policies to parents.
Teachers expressed generally positive feelings about AI, particularly its ability to save time and improve productivity. Many reported satisfaction with tools like ChatGPT and Grammarly, which were seen as reliable supports for planning and student feedback. However, stress and uncertainty surfaced among those concerned about ethical implications, accuracy of AI-generated content, and student misuse.
The most frequently cited challenge was academic dishonesty. Teachers noted that students often bypassed learning by copying AI responses wholesale. Other problems included crafting effective prompts, dealing with inaccurate outputs, and navigating an overwhelming variety of platforms. Ethical concerns such as data privacy, algorithmic bias, and the risk of diminishing student critical thinking skills added to the complexity of adoption.
Even among teachers who felt positive, caution was evident. Many described AI as useful for generating first drafts or lesson outlines but insisted on editing outputs to suit classroom needs. A minority expressed skepticism altogether, fearing that reliance on AI could erode essential student skills and deepen inequities.
Professional Development Needs
Perhaps the most striking finding of the study is the lack of professional development. Only 24 percent of teachers reported receiving any AI-related training. Among those, many relied on self-teaching, peer collaboration, or district-led sessions rather than formal instruction. While some found these resources moderately helpful, the majority identified structured workshops as their top need.
Eighty percent of all respondents said professional development workshops would boost their confidence in using AI. Teachers also called for clear permission policies, access to reliable tools, and guidance on best practices. Elementary teachers were particularly likely to request training, reflecting their lower reported confidence compared to secondary colleagues.
Despite growing awareness of AI’s potential, the absence of system-wide support leaves teachers struggling to keep pace with technological change. The study warns that without targeted training and consistent policies, educators risk either misusing AI or failing to leverage its benefits at all. This could lead to fragmented practices across districts and further disparities between elementary and secondary levels.
Policy and Research Implications
The authors argue that one-size-fits-all solutions will not work, as the needs of elementary and secondary teachers diverge sharply. Differentiated professional development, ongoing support, and robust ethical guidelines are essential. Schools must also provide leadership, infrastructure, and a clear vision to reassure teachers about integrating AI responsibly.
The research highlights the importance of addressing not just technical competence but also the ethical and social dimensions of AI use in education. Teachers voiced concerns about academic integrity, algorithmic bias, and equity of access. Addressing these issues requires policies that balance innovation with protection for students.
Future studies, the authors suggest, should track how teacher awareness and confidence evolve over time as AI tools advance. Longitudinal research could shed light on whether training efforts actually lead to better classroom practices and improved student outcomes. There is also a pressing need to examine which forms of professional development are most effective in building AI literacy among teachers across different contexts.