
Education

How students and staff are embracing AI in teaching and learning

Last term, Teacher columnist Professor Martin Westwell – Chief Executive of the South Australian Department for Education – wrote about the implementation of a custom generative AI tool, EdChat, in government schools across South Australia. In this article, we speak with Sarah Chambers, principal at Adelaide Botanic High School – one of the first adopters of the tool back in 2023 – about how students and staff are using the tool. 

At Adelaide Botanic High School, both staff and students have been using EdChat, a generative AI tool developed by the South Australian Department for Education for use by schools, since the trial period began in 2023. ‘Tomorrow, today’ is the motto at the vertical school, which opened in 2019 and is home to 1,300 students across years 7-12. 

‘I don’t think it was even conceived of that we wouldn’t be part of a trial of something like EdChat, because we know that our educators come here to work in a space where they get to push those boundaries, to be on the edge of the new learning at all times, to be at the forefront of what good practice looks like,’ Principal Sarah Chambers tells Teacher.

‘We have an environment that absolutely is invested in exploring what this might look like for educators and students and young people now and into the future. And I think AI is one of those significant impacts of change in our current experience – if we look at, sort of between 2023 and now, the change in the landscape around AI is quite radical.’

A secure tool to support teaching and learning

A roadblock for schools using other generative AI tools available to the public, such as ChatGPT, is that the environment is not secure. EdChat was launched by the Department for Education with guardrails to ensure safe and ethical use, so staff and students can explore the tool freely.

‘[EdChat] has the data security … that just means that we can guarantee our students are not going to encounter content that’s not appropriate; that any data that they share within that is not put back into the broader system. It’s just kept within that secure environment, which is so good in a school context,’ Chambers explains. 

‘I think that’s the greatest difference for me as a leader within a school; that I know, when using EdChat, that our students do have that data security and the content security as well.’  

How students are embracing AI 

Students at the school have been using EdChat in a range of ways. For example, Chambers shares that one student uses EdChat as a study planner to help with time management, while others use it to generate revision questions by feeding EdChat material from their teacher and prompting the tool to ask them progressively more challenging questions.   

In the term 2 holidays, Chambers travelled to Osaka, Japan, with staff and students to present at the World Expo 2025 on their use of EdChat. Students created videos for their student-led presentation sharing examples of how they’re using the generative AI tool. 

‘A couple of our students told the story of using EdChat to help them learn how to code or how to use applications like Premiere Pro to edit film and, rather than watching tutorials online where they had to watch a whole section to find the tiny bit that they wanted, they could ask the questions around: “I want to do this. How do I do this? How do I make that happen?” And it could be far more targeted as that guide on the side.’

EdChat is supporting teachers to save time 

Chambers says staff are learning about EdChat alongside students, and many have found ways to reduce their workload. 

In one example, a common teacher task that typically takes 30 minutes has been reduced to 52 seconds (Department for Education, 2025) with the help of EdChat. Adelaide Botanic High School teacher Rebecca Weber identified an opportunity for EdChat to assist in assessing a student’s LEAP (Learning English: Achievement and Proficiency) Level. 

The Department (2025) says this process involves a teacher assessing a text that has been written by a student, and in 2024, 31,434 students were assessed across the state, with each writing sample taking an average of 30 minutes to be processed by a teacher. With EdChat completing this process in less than a minute, they say teachers will save thousands of hours of work. 

Chambers adds that EdChat is helping other staff at the school support students with a language background other than English. Another staff member has created an app to support careers counselling activities: it collates university and TAFE information that would otherwise have to be looked up manually in printed guidebooks. 

‘And so she uses that as, again, a guide on the side that she can – when career counselling with students and looking at subject selections for their year 11 and 12 – the conversation she has moves from the what to the why and that really strong connection with the young person … because the information she needs is at hand,’ Chambers says. 

Keeping humans at the centre

According to Chambers, an important driver around the school’s use of EdChat is keeping humans at the centre. 

‘Some of the writing about AI talks about having a human in the loop, but actually, we like to say that we’ve got the AI in the loop and that the humans are still at the centre,’ she says.

‘It’s another tool that becomes one of those things that our students need to have the confidence to use, but we also need to make sure that they’ve got the skills and the attributes and the human skills that they need to be able to navigate what is appropriate, what isn’t.’ 

References 

Department for Education South Australia (2025, June 4). AI tool saves teachers thousands of hours. https://www.education.sa.gov.au/department/media-centre/our-news/ai-tool-saves-teachers-thousands-of-hours-of-work




Trump Continues Push for AI in Schools as FTC Probes Risks

White House Kicks Off School Year With AI Education Efforts, Public-Private Collaborations


The White House is rolling out its Presidential Artificial Intelligence Challenge with new commitments to further expand the use of AI in education just as the school year begins – and as the Federal Trade Commission readies a probe into whether popular AI chatbots are harming children’s mental health.


U.S. President Donald Trump hosted several big tech leaders Thursday night “for discussions centered on harnessing [AI] to propel the U.S. to the forefront of global innovation,” according to a press release the White House published Friday. The meeting followed the second White House Task Force on AI Education summit, where First Lady Melania Trump announced a series of commitments to help further the administration’s AI challenge, including forthcoming toolkits, webinars, classroom guides and agency action items to increase the implementation of AI training materials and tools in K-12 schools nationwide.

Education Secretary Linda McMahon said during the meeting that the agency is “fully aligned with the Presidential AI Challenge” and “encouraging students and educators to explore AI technologies with curiosity and with creativity.” Experts, however, have warned that the federal push to rapidly deploy AI tools across American classrooms could come with cybersecurity vulnerabilities, privacy risks and potential harm to minors (see: Trump Wants AI in Classrooms. Where Are the Safeguards?).

Initiatives to further AI in education include billion-dollar commitments from companies like Alphabet and million-dollar agreements with IBM – both of which had their CEOs at the education summit. Labor Secretary Lori Chavez-DeRemer said that her agency is in the process of building new private sector partnerships to expand access to AI education and training materials nationwide. Google CEO Sundar Pichai said efforts are designed “in the service of helping the next generation to solve problems, fuel innovation and build an incredible future.”

Recent studies have shown AI tools and systems may have some benefits when introduced in the classroom, but that in their present form they typically introduce major risks, including exposure to inappropriate and harmful content. According to the Wall Street Journal, the FTC is preparing to send letters demanding information to many of the top tech companies behind the leading AI tools – companies that were present at the White House this week – as it investigates whether the use of chatbots like OpenAI’s ChatGPT is affecting children’s mental health.

The White House Presidential AI Challenge invites U.S. students to complete a project that involves AI tools or systems to address community challenges, and encourages educators to use creative approaches to teaching and using AI technologies in K-12 education. Trump signed executive orders in April encouraging public-private partnerships to expand AI in K-12 education, establishing the Presidential AI Challenge and directing agencies to work with leading AI organizations to create new resources specifically for K-12 education.

OpenAI has announced plans to create accounts for teens with parental controls amid lawsuits against AI companies over teenage suicides, filed by families who allege their children were adversely affected by the companies’ tools.






Is AI helping or hurting students? We answered your questions

Acadia professor Dr. Daniel Lametti says that you can’t rely on AI detectors to identify computer-generated writing. Photo: Michael C. York

On Sept. 4, postsecondary education reporter Joe Friesen, online culture reporter Samantha Edwards and psychology professor Dr. Daniel Lametti from Acadia University answered reader questions on the use of AI in schools, what parents and educators should look out for, and how students could harness its potential.

Readers asked about how AI use is changing classrooms, students’ ability to learn, and how they can use it responsibly. Here are some highlights from the Q+A.

Questions and answers have been edited for length and clarity.


AI in classrooms

In what ways can AI be a helpful tool for education, from the perspective of both students and teachers?

Dr. Daniel Lametti: Great question! We know from decades of research in cognitive psychology that passively reviewing material (for example, flipping through flash cards) has almost no impact on learning, and does not increase your memory of the material. In contrast, repeat testing is highly beneficial to learning; this is where AI can be helpful. For instance, students can use AI to generate practice questions in preparation for tests and exams.

In my classes where I allow AI use, I suggest students use it for test preparation, with some important warnings – mainly, anything shared with an AI to generate practice problems (e.g., class notes) may be used to train new AI models, making the material public.

As a university instructor, I’m always on the lookout for new ways to engage students in the classroom. In some cases, I’ve found that AI can be helpful as a brainstorming tool for coming up with new assignments and pedagogical approaches.

Joe Friesen: I’ve heard a lot from universities about how they want to ensure they’re teaching students how to work with AI. It’s here, it’s a reality and it will likely be something that shapes the world of work students are preparing to enter. So teaching students how to use AI can be really valuable and I expect it’s something students want. It’s a tool, and like any tool it has strengths and limitations. For example, in my experience, it provides results that need interpretation and refining, and educators can help students learn how to do that.

I’ve heard students say they use AI tools to help them get started with assignments and understanding, particularly with big, daunting topics. I’ve seen them use it for help with outlines, or with preparing quizzes that help them study.

What does AI mean for catching and addressing learning inequalities? Are less privileged students more likely to fall behind if they don’t have as much access to technology at home?

Friesen: That’s an interesting point. One of the things I notice reading about the high flyers of Silicon Valley and the technology world is an inclination to limit how much access to technology their children have. I see the same thing among many university-educated parents in Canada. I wonder if in future having less access to technology in the early years will be correlated with higher academic achievement.

But I think the danger of missing out is something many people worry about: phones, computers, wifi access, are all nearly essential these days and assumed to be available. But that’s not true for everyone, and it can have an impact.

The danger of missing out is a strong factor in the pull towards new technologies, Joe Friesen says. Illustration: Matt Rota

How can teachers and professors know that their students’ essays and research papers are actually fully written by them and not by AI?

Lametti: It’s challenging. AI detectors do not work, and likely never will. In classes where I don’t want students to use AI, I tend to give assignments that focus on the learning process over the final outcome. For example, in a writing seminar I have students start assignments in class in front of me. They work in a Google doc so I can track the history of how their papers came together. They’re also graded based on effort rather than the final outcome. So far, I’ve found this approach effective.


Why does it feel like institutions are pushing AI use onto students and kids?

Lametti: I think that some institutions are, and we tend to hear about these schools more than the ones taking a more thoughtful approach. Students should be taught how to use AI, but this is not a challenging task.

Depending on the course, pushing AI on students risks jeopardizing learning outcomes. For example, I teach a senior seminar on effective science writing, and if students used AI for every assignment – or even just for editing – they wouldn’t get much from the class, so I don’t allow AI in this course.

Rather than pushing AI on students, at Acadia we leave AI use up to individual professors. We also offer all first year undergrads a 45-minute lecture on how AI works – what it does well, what it fails at, the ethical implications of using AI, and the possibility that overreliance on AI could compromise learning and lead to cognitive debt. I’ve found that undergraduates are very receptive to this information, and I think this is the correct approach.

I’ve noticed that while a handful of students in public classrooms can manage their phone use, most simply can’t. Whatever benefits phones might bring to learning are often overshadowed by how much they hurt attention spans and real-world social interaction.

Samantha Edwards: You raise a good point! I recently spoke to teachers, principals and students about how effective the cellphone bans are in the classroom, and the results have definitely been mixed. And even when the bans are working, students are still allowed to use their personal laptops or school-supplied Chromebooks in the classroom, which can be even more distracting. Social media is blocked on school wifi, but students told me they’d just hotspot from their phones or would download VPNs to access Snapchat and Instagram anyway.

I agree that we need to encourage students to be creative thinkers, and that we need to teach greater tech/media literacy in elementary and high schools. I’d love to see a class that touches on how social media apps are designed to be addictive and how algorithms actually work.

A classroom at John D. Bracco School in Edmonton, Alberta, on Thursday, August 28, 2025. Photo: Amber Bracken/The Globe and Mail

Youth and technology

How should you talk to your students or kids about AI?

Edwards: I think the most important thing is to explain the limitations of these chatbots and how they can get things wrong. So, if you’re using ChatGPT for homework help, be sure to double check the answers it gives you with a more reputable source. We’re also seeing more people form relationships with AI chatbots, either as friends, therapists or romantic interests. I think for parents it’s super important to talk to their kids about these relationships and how although they can be beneficial, they can be harmful too.

Is all this technology use affecting how these young people socialize and interact with others?

Edwards: For sure. I’ve heard from teens who say they chat with their friends constantly on Discord or in other group chats, but they don’t actually hang out in person as much. Some have said while they do want to meet up with friends, they don’t have the energy to actually make plans. It’s easier to just chill at home and scroll social media. So yes, I think it’s definitely affecting how young people socialize.

Also, I should say, I’m a millennial so I spent a lot of my teen years online and chatting with friends on MSN, so I don’t think technology = bad for socializing. But I do think the rise of algorithmic social media has really affected how we interact with others online. Instead of chatting with friends, a lot of the time we’re passively watching strangers do stuff.

I’ve also been seeing a resurgence of in-person events catered specifically to young people who want to be offline, which makes me hopeful! Earlier this summer someone in Toronto posted a TikTok about throwing a picnic in a downtown park and hundreds of people showed up. I think that’s awesome.

AI-generated videos are all over social media. Is this normalizing AI use for young people?

Edwards: Definitely. The proliferation of AI-generated videos on TikTok, Instagram, X and Facebook, even in just the past six months, has been wild to see. Some of these videos are clearly AI-generated, but many look incredibly realistic. Social platforms say that they mark AI-generated content, but in my reporting, I’ve found this rarely happens. Earlier this year I wrote a story about AI influencers and some of the followers of these accounts didn’t even realize they were AI!

At the same time that users are uploading their own AI videos/images, a lot of social media platforms are integrating generative AI right into their platforms, which I think normalizes its use too. On Facebook, you can now chat with different kinds of AI avatars/users, and on X, its chatbot Grok is built right in too.

Students work on laptops at First Avenue Elementary School in Newark, N.J., on May 22, 2023. Photo: Gabriela Bhaskar/The New York Times News Service

The future of education

As AI takes over more cognitive tasks, what should we resist giving up, and what should we accept as healthy adaptation?

Lametti: As a cognitive scientist, I’m tempted to say that we should resist it all, but in our digital world there are certainly tasks that are more a distraction than a benefit (handling emails, for example). If AI can eliminate these distractions, freeing up time for more productive work, it will be a beneficial tool. But as you point out, it’s important that we don’t offload all of our “thinking” to AI – for one reason, it can’t really do it (AI makes a lot of mistakes), but there’s also some evidence that keeping a mentally sharp mind benefits our cognition as we age. In a university context, it’s important to teach students about AI, how it works, what it does well, and what it fails at, so they know when to use it and when to put it aside in favour of thoughtful learning.


What does AI mean for university admissions, especially at the most competitive programs?

Friesen: I would imagine that AI would be a useful tool for admissions offices when it comes to sorting applications. We know that competitive programs can get hundreds and even thousands more applications than they have spots, so I expect universities may find it economical to have AI doing some of the first phase analysis.

If the question is more about whether we can be sure that top students are really earning their top marks in the age of AI, that’s a tougher one to answer. I think it’s in the hands of high schools to keep an eye on it.

AI is likely here to stay. What do you think needs to happen in the education sphere to adapt to this reality?

Friesen: I asked the president of Toronto Metropolitan University, Mohamed Lachemi, about artificial intelligence and the university in an interview last week. I liked the phrase he used. “Artificial intelligence is like a sword with two edges,” he said. “We use artificial intelligence in many aspects of research and it can give us advantages in efficiency and the use of resources. But at the same time, we have to be also mindful about the danger of artificial intelligence, especially in terms of academic misconduct. We need to put some mechanisms in place to make sure that we don’t lose that control.”

So clearly universities are concerned about the implications for academic integrity. This does change the landscape for how evaluation can be done. But that’s something universities can adapt to.

And AI holds a lot of promise for research, making things that took a long time much quicker. I expect that as time goes on we will hear more about its promise than about the academic misconduct, but I could be wrong.

Is AI further deflating the value of a university degree? Are degrees now worth less?

Friesen: That’s something I’ve heard people speculating about. It’s so early at this point we can’t really know, but there are rumblings that employers may forego hiring in some positions that traditionally went to recent grads. That might be interpreted as a knock against getting a university degree. But over the longer term university degrees have tended to be fairly valuable for most people, most of the time. It’s always possible that it could change, but people may have asked the same thing when the Internet arrived, making knowledge available to anyone, anywhere, and universities have only grown since then. I would think that the critical thinking that’s required to interrogate AI results and direct the bots is the kind of thing that universities aim to foster.




Rwanda Launches Landmark Day of AI Program to Transform Education

In a groundbreaking step toward preparing the next generation for an AI-driven future, the Day of AI Rwanda program was launched in partnership with the Rwanda Ministry of Education and the Rwanda Education Board (REB). The initiative focuses on equipping educators with the knowledge and tools to integrate artificial intelligence (AI) into the country’s existing ICT curriculum, enabling students to both use AI and critically evaluate its impact on society.

Building AI Literacy Through Education

The program trains teachers on incorporating AI concepts into classroom lessons while emphasizing ethical decision-making, responsible use, and problem-solving. Developed by MIT researchers in collaboration with K-12 educators, the Day of AI curriculum is freely available and used in more than 170 countries.

The Rwandan rollout represents the largest coordinated national deployment of the program in Africa so far, with newly trained master educators returning to their districts to share knowledge and scale the initiative nationwide.

“This was one of the most impactful, hopeful experiences I’ve had in this work,” said Dr. Randi Williams, a key partner in the program. “The enthusiasm, curiosity, and leadership shown by these educators was inspiring. Together, we’re planting the seeds of a generation that will understand, build, and lead with AI.”

Leadership and Strategic Vision

Dr. Williams praised Rwanda’s visionary commitment to AI readiness, crediting the leadership of Minister of Education Joseph Nsengimana and the REB team, alongside the Ibrahim El Hefni Technical Training Foundation, which provided critical support for the initiative.

Professor Cynthia Breazeal, Director of the MIT RAISE Initiative, described the program’s launch as a model for other nations:

“The launch of Day of AI Rwanda is not only a powerful moment for us, but also a blueprint for how countries can prioritize AI readiness in education with both vision and urgency.”

Reflections from Rwandan Education Leaders

At the program’s closing ceremony, Rwandan education leaders emphasized the country’s ambition to integrate AI into all levels of schooling.

Eden Mamo, speaking on behalf of Minister Nsengimana, highlighted how the initiative aligns with Rwanda’s Vision 2050, which seeks to build a knowledge-based economy powered by innovation:

“This moment places education at the very heart of Rwanda’s AI journey. Our youth must not only use AI but build it, question it, and improve it. This initiative shows Rwanda has chosen to ride the AI wave consciously and ethically.”

Dr. Diane Segati, Director of ICT at REB, echoed this vision:

“We want AI to be used in all schools, by all students, and by all teachers. When we first introduced ICT, we started small. Now, we’ve integrated ICT into education, and we are moving to the next level with AI. We are grateful to MIT RAISE, Day of AI, and Dr. Randi for their support and collaboration.”

Shaping Rwanda’s Digital Future

The Day of AI Rwanda program marks a milestone in the nation’s digital transformation strategy, ensuring that Rwandan students and educators are not only consumers of AI technologies but also innovators shaping their future. With master educators leading localized training and the integration of AI into ICT programs, Rwanda is positioning itself as a continental leader in AI literacy, education reform, and future-ready skills development.


