
Local Schools Adopt AI Guidelines: Balancing Technology and Learning

Artificial intelligence (AI) is rapidly changing the education landscape, from personalized learning to automated grading. As AI becomes more integrated into classrooms, local schools are implementing guidelines to ensure it enhances learning without replacing essential skills. Across the country, districts are working to find a balance between technology and traditional education methods.

Why AI Guidelines Matter

The use of AI in education has sparked both excitement and concern. On one hand, AI tools like chatbots, virtual tutors, and automated assessment systems can help students learn at their own pace. On the other hand, educators worry about academic integrity, over-reliance on technology, and potential biases in AI algorithms.

To address these concerns, schools are adopting AI guidelines that promote responsible use. These policies often focus on ensuring AI serves as a learning aid rather than a shortcut. For example, some schools have introduced rules requiring students to disclose when AI-generated content is used in assignments. Others are training teachers to incorporate AI tools effectively without diminishing critical thinking skills.

Burbank Schools and AI Integration

Artificial intelligence is increasingly a focal point in educational discussions, prompting school districts nationwide to consider guidelines for its use. While detailed information about Burbank Unified School District’s (BUSD) AI policies is not readily available, the district has shown a commitment to using technology to enhance learning. During the pandemic, for instance, BUSD moved to a 100% distance learning model, demonstrating its ability to use technology to maintain educational continuity.

In contrast, other districts have proactively developed comprehensive AI guidelines. The Santa Ana Unified School District (SAUSD), for example, introduced the “AI Compass,” a guide aimed at navigating the intersection of technology and education. This initiative underscores the importance of establishing clear policies to ensure AI tools are used responsibly and effectively in educational settings. 

As AI continues to evolve, it is imperative for school districts, including BUSD, to assess and develop frameworks that address the ethical and practical implications of AI in classrooms. By doing so, they can ensure that technology serves as a beneficial tool for both educators and students, enhancing the learning experience while upholding academic integrity.

Technology’s Role Across Different Education Levels

AI and technology are not just influencing K-12 education; they are transforming learning at every level.

  • Elementary and Middle Schools: Younger students are engaging with AI-powered educational games that adapt to their learning pace. Interactive platforms provide real-time feedback, helping students strengthen their skills in subjects like math and reading. However, teachers are also emphasizing hands-on learning to ensure fundamental skills remain strong.
  • High Schools: Many high schools are introducing AI tools that assist with college preparation. Platforms that analyze writing, suggest improvements, and provide instant feedback are becoming popular. Some schools are also incorporating AI into career-focused programs, helping students develop coding and data analysis skills that are in high demand.
  • Higher Education and Professional Programs: Universities are adopting AI for tasks such as grading, course recommendations, and virtual tutoring. Online education has particularly benefited, as AI-driven platforms create personalized study plans and offer 24/7 support. For example, online MSW programs in California use simulations to help students develop real-world social work skills in a virtual environment. This allows for more flexible and accessible education, preparing students for their careers with innovative tools.

Finding the Right Balance

While AI offers many benefits, striking the right balance is essential. Schools must ensure that students develop core skills such as critical thinking, creativity, and collaboration—areas where human interaction is irreplaceable. Educators are working to blend AI-driven tools with traditional teaching methods to create a well-rounded learning experience.

Transparency is also key. Schools that openly communicate AI policies with students and parents can create an environment where technology is embraced responsibly. By setting clear expectations and continuously evaluating AI’s impact, schools can make the most of technological advancements while preserving the human element of education.

Conclusion

AI is here to stay, and its role in education will only continue to grow. By adopting thoughtful guidelines, schools can harness AI’s potential without compromising academic integrity or essential skills. In places like Burbank, educators are leading the way by ensuring AI is used as a tool for learning rather than a shortcut. As technology continues to evolve, the goal remains the same: preparing students for the future while keeping education meaningful and engaging.



New York Passes the Responsible AI Safety and Education Act

The New York legislature recently passed the Responsible AI Safety and Education Act (SB6953B) (“RAISE Act”).  The bill awaits signature by New York Governor Kathy Hochul.

Applicability and Relevant Definitions

The RAISE Act applies to “large developers,” which is defined as a person that has trained at least one frontier model and has spent over $100 million in compute costs in aggregate in training frontier models. 

  • “Frontier model” means either (1) an artificial intelligence (AI) model trained using greater than 10^26 computational operations (e.g., integer or floating-point operations), the compute cost of which exceeds $100 million; or (2) an AI model produced by applying knowledge distillation to a frontier model, provided that the compute cost for the distilled model exceeds $5 million (see the illustrative sketch following these definitions).
  • “Knowledge distillation” is defined as any supervised learning technique that uses a larger AI model or the output of a larger AI model to train a smaller AI model with similar or equivalent capabilities as the larger AI model.
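
For readers who want to see how the two prongs of the definition fit together, the following is a minimal, hypothetical Python sketch of the thresholds described above. The field names (compute_operations, compute_cost_usd, distilled_from_frontier) and the idea of expressing the test as code are illustrative assumptions, not terms from the bill, and actual applicability is a legal question.

```python
from dataclasses import dataclass

# Hypothetical illustration of the RAISE Act "frontier model" thresholds
# summarized above. Names and structure are assumptions for readability.
FRONTIER_OPS_THRESHOLD = 10**26        # computational operations
FRONTIER_COST_THRESHOLD = 100_000_000  # USD compute cost (first prong)
DISTILLED_COST_THRESHOLD = 5_000_000   # USD compute cost (distillation prong)


@dataclass
class ModelTrainingRun:
    compute_operations: float        # total operations used in training
    compute_cost_usd: float          # cost of the training compute
    distilled_from_frontier: bool    # produced via knowledge distillation of a frontier model


def is_frontier_model(run: ModelTrainingRun) -> bool:
    """Rough reading of the two-pronged 'frontier model' definition."""
    trained_at_scale = (
        run.compute_operations > FRONTIER_OPS_THRESHOLD
        and run.compute_cost_usd > FRONTIER_COST_THRESHOLD
    )
    distilled_at_scale = (
        run.distilled_from_frontier
        and run.compute_cost_usd > DISTILLED_COST_THRESHOLD
    )
    return trained_at_scale or distilled_at_scale
```

Under this reading, for example, a model trained with 3 × 10^26 operations at a compute cost of $150 million would meet the first prong, while a $4 million distillation of a frontier model would not meet the second.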

The RAISE Act imposes the following obligations and restrictions on large developers:  

  • Prohibition on Frontier Models that Create Unreasonable Risk of Critical Harm: The RAISE Act prohibits large developers from deploying a frontier model if doing so would create an unreasonable risk of “critical harm.”

    • “Critical harm” is defined as the death or serious injury of 100 or more people, or at least $1 billion in damage to rights in money or property, caused or materially enabled by a large developer’s use, storage, or release of a frontier model through (1) the creation or use of a chemical, biological, radiological or nuclear weapon; or (2) an AI model engaging in conduct that (i) acts with no meaningful human intervention and (ii) would, if committed by a human, constitute a crime under the New York Penal Code that requires intent, recklessness, or gross negligence, or the solicitation or aiding and abetting of such a crime.

  • Pre-Deployment Documentation and Disclosures: Before deploying a frontier model, large developers must:

    • (1) implement a written safety and security protocol;
    • (2) retain an unredacted copy of the safety and security protocol, including records and dates of any updates or revisions, for as long as the frontier model is deployed plus five years;
    • (3) conspicuously publish a redacted copy of the safety and security protocol and provide a copy of such redacted protocol to the New York Attorney General (“AG”) and the Division of Homeland Security and Emergency Services (“DHS”) (as well as grant the AG access to the unredacted protocol upon request);
    • (4) record and retain for as long as the frontier model is deployed plus five years information on the specific tests and test results used in any assessment of the frontier model that provides sufficient detail for third parties to replicate the testing procedure; and
    • (5) implement appropriate safeguards to prevent unreasonable risk of critical harm posed by the frontier model.

  • Safety and Security Protocol Annual Review: A large developer must conduct an annual review of its safety and security protocol to account for any changes to the capabilities of its frontier models and industry best practices, and make any necessary modifications to the protocol. For material modifications, the large developer must conspicuously publish a copy of the revised protocol with appropriate redactions (as described above).
  • Reporting Safety Incidents: A large developer must disclose each safety incident affecting a frontier model to the AG and DHS within 72 hours of the large developer learning of the safety incident or facts sufficient to establish a reasonable belief that a safety incident occurred.

    • “Safety incident” is defined as a known incidence of critical harm or one of the following incidents that provides demonstrable evidence of an increased risk of critical harm: (1) a frontier model autonomously engaging in behavior other than at the request of a user; (2) theft, misappropriation, malicious use, inadvertent release, unauthorized access, or escape of the model weights of a frontier model; (3) the critical failure of any technical or administrative controls, including controls limiting the ability to modify a frontier model; or (4) unauthorized use of a frontier model. The disclosure must include (1) the date of the safety incident; (2) the reasons the incident qualifies as a safety incident; and (3) a short and plain statement describing the safety incident.
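
Purely as an illustration of the structure of that reporting obligation, here is a minimal Python sketch of a disclosure record carrying the three required elements, along with the 72-hour notification deadline; the class and field names are hypothetical, chosen for readability rather than taken from the bill.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical model of a RAISE Act safety-incident disclosure.
# Names are illustrative, not statutory.
REPORTING_WINDOW = timedelta(hours=72)


@dataclass
class SafetyIncidentDisclosure:
    incident_date: datetime   # (1) date of the safety incident
    qualifying_reasons: str   # (2) why the incident qualifies as a safety incident
    plain_statement: str      # (3) short and plain description of the incident
    learned_at: datetime      # when the developer learned of the incident
                              # (or of facts establishing a reasonable belief it occurred)


def disclosure_deadline(disclosure: SafetyIncidentDisclosure) -> datetime:
    """Latest time the AG and DHS must be notified: 72 hours after learning."""
    return disclosure.learned_at + REPORTING_WINDOW
```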

If enacted, the RAISE Act would take effect 90 days after being signed into law.



School suspensions and exclusions rise to nearly a million in England

The number of school suspensions and exclusions in England has reached its highest level since 2006, Department for Education figures show.

There were 954,952 suspensions in state schools in 2023/24 – a 21% increase on the previous year – while exclusions also rose 16% to 10,885.

While secondary school pupils accounted for most suspensions, more than 100,000 involved primary-age children – a number that has grown significantly.

A suspended pupil must stay out of school for a fixed period of up to 45 days per school year, while those excluded are permanently removed. Individual pupils often account for more than one period of suspension.

The government says it is tackling the root causes of poor behaviour and is intensively supporting 500 schools with the worst behaviour.

Persistent disruptive behaviour was the most common reason pupils were sent home, accounting for half of all suspensions and 39% of exclusions.

Nearly half of the suspensions were among pupils getting support for special educational needs – who were three times more likely to be suspended than their classmates.

Children on free school meals were also overrepresented, making up a quarter of the school population but 60% of suspensions.

Paul Whiteman, general secretary at school leaders’ union NAHT, said schools alone could not address the causes of poor behaviour.

“Schools have a duty to provide a safe environment for all pupils and only use suspensions and exclusions when other options to ensure this have been exhausted,” he said.

“The reasons for disruptive behaviour often lie beyond the school gates and have their roots in wider challenges, including everything from poverty to access to support with special educational needs and mental ill-health.”

The vast majority of suspensions – nine in 10 – occurred at secondary schools, with Year 9 having the highest rate.

But primary-age suspensions rose too, up 24% on the previous year.

The vast majority (88%) of pupils who were excluded at primary school were getting support for special educational needs, compared with 46% of excluded secondary school pupils.

Research from charity Chance UK, which supports families of excluded children in London, suggests that 90% of children who are excluded at primary school fail to pass GCSE English and maths.

Sophie Schmal, the charity’s director, said Thursday’s figures revealed a “very concerning picture” – particularly the rise in primary school suspensions.

“Early intervention has to mean early. We can’t wait until these children are teenagers to tackle this.”

Sarah – not her real name – is a mum of one in London. Her six-year-old son was suspended several times within his first few weeks at primary school for hitting other pupils and throwing things in class.

She said that even after school staff agreed that her son showed signs of autism, he continued to be sent out of class regularly and suspended, which made him feel “isolated”.

“Since he was three years old, my son has been labelled as the naughty and difficult kid when all he really needed was help,” she said.

“I sought help as soon as I recognised that he needed additional support. But rather than helping me immediately, they waited until it was an emergency.”

Sarah eventually managed to move her son to a different mainstream school where he is getting more support, she said.

Responding to the figures, early education minister Stephen Morgan said the Labour government had “wasted no time in tackling the root causes of poor behaviour”, including offering mental health support in every school and expanding free school meals.

He pointed to its new attendance and behaviour hubs, which will directly support the 500 schools that “need the most help”.

“We’re also continuing to listen to parents as we reform the SEND system, while already putting in place better and earlier support for speech and language needs, ADHD and autism,” Morgan added.



Pasco schools have a new AI program. It may help personalize lessons.

When Lacoochee Elementary School resumes classes in August, principal Latoya Jordan wants teachers to focus more attention on each student’s individual academic needs.

She’s looking at artificial intelligence as a tool they can use to personalize lessons.

“I’m interested to see how it can help,” Jordan said.

Lacoochee is exploring whether to become part of the Pasco County school district’s new AI initiative being offered to 30 campuses in the fall. It’s a test run that two groups — Scholar Education and Khanmigo — have offered the district free of charge to see whether the schools find a longer-term fit for their classes.

Scholar, a state-funded startup that made its debut last year at Pepin Academy and Dayspring Academy, will go into selected elementary schools. Khanmigo, a national model recently highlighted on 60 Minutes, is set for use in some middle and high schools.

“Schools ultimately will decide how they want to use it,” said Monica Ilse, deputy superintendent for academics. “I want to get feedback from teachers and leaders for the future.”

Ilse said she expected the programs might free teachers from some of the more mundane aspects of their jobs, so they can pay closer attention to their students. A recent Gallup poll found teachers who regularly use AI said it saves them about six hours of work weekly, in areas such as writing quizzes and completing paperwork.

Marlee Strawn, cofounder of Scholar Education, introduced her system to the principals of 19 schools during a June 30 video call. The model is tied to Florida’s academic standards, Strawn said, and includes dozens of lessons that teachers can use.

It also allows teachers to craft their own assignments, tapping into the growing body of material being uploaded. The more specific the request, the more fine-tuned the exercises can be. If a student has a strong interest in baseball or ballet, for instance, the AI programming can help develop standards-based tasks on those subjects, she explained.

Perhaps most useful, Strawn told the principals, is the system’s ability to support teachers as they analyze student performance data. It identifies such things as the types of questions students asked and the items they struggled with, and can make suggestions about how to respond.

“The data analytics has been the most helpful for our teachers so far,” she said.

She stressed that Scholar Education protects student data privacy, a common concern among parents and educators, noting the system got a top rating from Common Sense.

School board member Jessica Wright brought up criticisms that AI has proven notoriously error-prone in math.

Strawn said the system has proven helpful when teachers seek to provide real-life examples for math concepts. She did not delve into details about the reliability of AI in calculations and formulas.

Lacoochee principal Jordan wanted to know how well the AI system would interface with other technologies, such as iReady, that schools already use.

“If it works with some of our current systems, that’s an easier way to ease into it, so for teachers it doesn’t become one more thing that you have to do,” Jordan said.

Strawn said the automated bot is a supplement that teachers can integrate with data from other tools to help them identify classroom needs and create the types of differentiated instruction that Jordan and others are looking for.

The middle and high school model, Khanmigo, will focus more on student tutoring, Ilse wrote in an email to principals. It’s designed to “guide students to a deeper understanding of the content and skills mastery,” she explained in the email. As with Scholar, teachers can monitor students’ interactions and step in with one-on-one support as needed, in addition to developing lesson plans and standards-aligned quizzes.

Superintendent John Legg said teachers and schools would not be required to use AI. Legg said he simply wanted to provide options that might help teachers in their jobs. After a year, the district will evaluate whether to continue, most likely with paid services.

While an administrator at Dayspring Academy before his election, Legg wrote a letter of support for Scholar Education’s bid for a $1 million state startup grant, and he also received campaign contributions from some of the group’s leaders. He said he had no personal stake in the organization and was backing a project that might improve education, just as he previously supported Algebra Nation, the University of Florida’s online math tutoring program launched in 2013.


