New push for AI as Education Minister Erica Stanford announces curriculum changes



Education Minister Erica Stanford
Photo: RNZ / Samuel Rillstone

The government has announced a number of new secondary school subjects and a new emphasis on artificial intelligence that it says will help prepare young people for the jobs of the future.

Education Minister Erica Stanford said those working on the changes will investigate having a new Year 13 subject on Generative AI “for later development”.

“With the rapid development of AI, students will also be able to learn about and use generative AI in a range of subjects. This may include learning about how digital systems work, machine learning, cybersecurity, and digital ethics.”

Stanford said the new subjects, being developed for the Years 11 to 13 curriculum, reflect the growing importance of science, technology, engineering and mathematics, often referred to as STEM.

The subjects include: automotive engineering, building and construction, and infrastructure engineering.

“Students will be able to specialise in areas such as earth and space science, statistics and data science, and electronics and mechatronics. There will also be a range of new specialist maths subjects including further maths.

“When our young people leave school, we want doors to open for them whether they’re going to tertiary study, learning a trade, or heading straight into work. These refreshed subjects will provide students with choice, purposeful pathways and opportunities for specialisation that set them up for success,” Stanford said in a statement.

It was vital students had access to “innovative and dynamic subjects” that would help the country’s future, she said.

Other new subjects include: civics, politics and philosophy, Pacific studies, Te Mātai i te Ao Māori and music technology.

Te Marautanga o Aotearoa will be resourced with a first-ever detailed curriculum in te reo Māori, as well as new subjects including Tātai Arorangi (Māori traditional systems of Earth and Sky), Te Ao Whakairo (Māori carving) and Te Ao Māori subjects.

The subjects are planned to be phased in from 2028.





Lethality, innovation, and transformation through AI education at the U.S. Army School of Advanced Military Studies




THE ARMY UNIVERSITY, FORT LEAVENWORTH, Kansas – In late July 2025, the Advanced Military Studies Program at the School of Advanced Military Studies, known as SAMS, launched its first-ever experimental, three-day, Practical Application of Artificial Intelligence module.

The mission was simple: transform the program with an innovative, hands-on AI learning experience for students and faculty. The purpose was to enable warfighter lethality through AI education and training.

“AI is changing the character of warfare. Our graduates have got to be ready to lead formations powered by AI—and that’s why we did something about it,” said Col. Dwight Domengeaux, director of SAMS.

Dr. Bruce Stanley, Director, AMSP, envisioned a module that pushed institutional norms about how mid-career officers learn about AI and learn with AI.

“Did we accept risk? Yes. We did—to create a critical learning opportunity for our students,” Stanley remarked. “We knew what was at stake, and we trusted our faculty and students to make it work.”

And make it work they did.

According to AMSP faculty, the module’s experimental instructional design was key, consisting of ten-and-a-half hours of total classroom contact time divided over three lessons.

“We covered a lot of ground with our students in three days,” Dr. Jacob Mauslein, associate professor, AMSP, said. “Subjects ranged from AI theory and ethical considerations of AI, to applying AI tools, and leading AI-enabled organizations.”

A novel feature of the module was that it was developed by AMSP students. As a task in their Future Operational Environment course, six students from the Class of 2025, mentored by two faculty, developed the AI module that would be taught to the Class of 2026. The students’ final draft was adopted almost without change by the faculty.

“Incorporating students as full participants in the process allowed us to co-develop lesson objectives and materials that deeply mattered to them,” shared Dr. Luke Herrington, one of the faculty leads for the module.

Meeting students where they were in terms of their AI skills and then taking them to the next level was part of the academic approach for the AI module, Herrington explained.

Maj. Justin Webb, PhD, an AY 2025 AMSP student and one of the module’s developers, explained it this way: “SAMS is a warfighting school—so we chose learning activities that would help us become more lethal warfighters with AI. Using AI tools like CamoGPT, Ask Sage, and others for several hours over three days helped us get there.”

Some students in the AY 2026 class were initially skeptical of using AI.

“At first, I didn’t know what I didn’t know,” said Army Maj. Stuart Allgood, an Armor officer and SAMS student. “But by the end of the first day my thinking about AI had changed. After the second day, I could use AI tools I had never even heard of.”

Maj. Callum Knight, an intelligence officer from the United Kingdom, summed up his experience.

“Before this course I viewed AI as just a data point,” Knight said. “Now that I’ve experienced what’s possible with AI, I realize it’s an imperative that is going to impact everything I do going forward.”

So, what’s next for AI at SAMS?

“Based on what our students got out of this, we intend to add more AI learning moments across the program,” Stanley said. “The priority now is to integrate AI into our upcoming operational warfare practical exercise.”

AMSP is one of the three distinct academic programs within SAMS.

The other two SAMS programs are the Advanced Strategic Leadership Studies Program (ASLSP), a Senior Service College equivalent, and the Advanced Strategic Planning and Policy Program (ASP3), also known as the Goodpaster Scholars, a post-graduate degree program.

Matthew Yandura is an AMSP assistant professor and retired Army colonel.







Date Taken: 08.29.2025
Date Posted: 09.11.2025 13:34
Story ID: 547863
Location: FORT LEAVENWORTH, KANSAS, US












AI in education: Most teachers aware of ChatGPT but few confident in classroom use



Artificial intelligence is reshaping education, but most teachers in American classrooms remain unprepared to use it effectively. A new study by researchers from the University of Tennessee reveals sharp divides in awareness, usage, and confidence among elementary and secondary educators.

Published in Education Sciences, the study “AI Literacy: Elementary and Secondary Teachers’ Use of AI-Tools, Reported Confidence, and Professional Development Needs” surveyed 242 teachers across grades 3–12 in the southeastern United States. The findings underscore how quickly AI tools like ChatGPT are making their way into schools, while also exposing the challenges teachers face in adapting to them.

Awareness and Use of AI Tools

The research shows that while most teachers have heard of AI writing tools, fewer than half actively use them in the classroom. ChatGPT emerged as the most widely recognized and utilized platform, followed by Grammarly, Magic School, and Brisk. Secondary teachers consistently reported higher levels of familiarity, understanding, and actual use than their elementary counterparts.

Overall, 92 percent of respondents were aware of ChatGPT, but only 47 percent reported using AI in their teaching. Among those who did, AI was employed primarily for lesson planning, assessment design, feedback generation, and text differentiation for diverse learners. Some teachers also relied on AI for professional tasks such as drafting emails or creating instructional visuals. Despite these uses, adoption remains uneven, with elementary educators often citing developmental appropriateness and tool complexity as barriers.

The survey found that 80 percent of secondary teachers recognized AI tutoring systems, compared to just over half of elementary teachers. Awareness of grading and assessment tools such as Turnitin and Gradescope was also significantly higher among secondary teachers. This divide suggests that students in higher grades are more likely to experience AI-enhanced instruction than younger peers, potentially widening existing gaps in exposure to technology.

Confidence and Classroom Challenges

Confidence proved to be a decisive factor in whether teachers integrated AI. Secondary educators reported greater self-assurance across all categories, from lesson planning and grading to communicating with families and districts about AI use. Elementary teachers scored consistently lower, especially in areas like troubleshooting AI-related issues or explaining integration policies to parents.

Teachers expressed generally positive feelings about AI, particularly its ability to save time and improve productivity. Many reported satisfaction with tools like ChatGPT and Grammarly, which were seen as reliable supports for planning and student feedback. However, stress and uncertainty surfaced among those concerned about ethical implications, accuracy of AI-generated content, and student misuse.

The most frequently cited challenge was academic dishonesty. Teachers noted that students often bypassed learning by copying AI responses wholesale. Other problems included crafting effective prompts, dealing with inaccurate outputs, and navigating an overwhelming variety of platforms. Ethical concerns such as data privacy, algorithmic bias, and the risk of diminishing student critical thinking skills added to the complexity of adoption.

Even among teachers who felt positive, caution was evident. Many described AI as useful for generating first drafts or lesson outlines but insisted on editing outputs to suit classroom needs. A minority expressed skepticism altogether, fearing that reliance on AI could erode essential student skills and deepen inequities.

Professional Development Needs

Perhaps the most striking finding of the study is the lack of professional development. Only 24 percent of teachers reported receiving any AI-related training. Among those, many relied on self-teaching, peer collaboration, or district-led sessions rather than formal instruction. While some found these resources moderately helpful, the majority identified structured workshops as their top need.

Eighty percent of all respondents said professional development workshops would boost their confidence in using AI. Teachers also called for clear permission policies, access to reliable tools, and guidance on best practices. Elementary teachers were particularly likely to request training, reflecting their lower reported confidence compared to secondary colleagues.

Despite growing awareness of AI’s potential, the absence of system-wide support leaves teachers struggling to keep pace with technological change. The study warns that without targeted training and consistent policies, educators risk either misusing AI or failing to leverage its benefits at all. This could lead to fragmented practices across districts and further disparities between elementary and secondary levels.

Policy and Research Implications

The authors argue that one-size-fits-all solutions will not work, as the needs of elementary and secondary teachers diverge sharply. Differentiated professional development, ongoing support, and robust ethical guidelines are essential. Schools must also provide leadership, infrastructure, and a clear vision to reassure teachers about integrating AI responsibly.

The research highlights the importance of addressing not just technical competence but also the ethical and social dimensions of AI use in education. Teachers voiced concerns about academic integrity, algorithmic bias, and equity of access. Addressing these issues requires policies that balance innovation with protection for students.

Future studies, the authors suggest, should track how teacher awareness and confidence evolve over time as AI tools advance. Longitudinal research could shed light on whether training efforts actually lead to better classroom practices and improved student outcomes. There is also a pressing need to examine which forms of professional development are most effective in building AI literacy among teachers across different contexts.




Digital Learning for Africa: Ministers, Practitioners and Pathways



Frameworks for the Futures of AI in Education.

Countries are using UNESCO’s Readiness Assessment Methodology (RAM) to map weaknesses and opportunities and to guide national AI strategies; the two latest additions are Namibia and Mozambique.

The DRC is prioritizing digital transformation projects, investment partnerships for infrastructure, AI adapted to local languages, and personalized learning, organized around governance, regulation of human capital, and industrialization. RAM has supported startups, scholarships and capacity-building, pointing toward sovereign digital infrastructures and talent retention.

Dr. Turyagenda notes that youth are already using AI and need a structured framework; the country’s National AI Strategy and Digital Agenda Strategy align with UNESCO, AU and East African frameworks, with teachers involved from the start.

Preparing learners for an AI-driven economy.

Namibia—among the first in Southern Africa to launch a RAM process—is developing a national AI strategy and a National AI Institute. Hon. Mr. Dino Ballotti, Deputy Minister of Education, Innovation, Youth, Sports, Arts and Culture of Namibia, underscores that the national approach is “humanity first” and context-specific—“Namibian problems require Namibian solutions”—with priorities in school connectivity, teacher and learner readiness, and data availability. Indigenous communities are actively involved in developing tools and digital technologies.


