

3 Arguments Against AI in the Classroom



Generative artificial intelligence is here to stay, and K-12 schools need to find ways to use the technology for the benefit of teaching and learning. That’s what many educators, technology companies, and AI advocates say.

In response, more states and districts are releasing guidance and policies around AI use in the classroom. Educators are increasingly experimenting with the technology, with some saying that it has been a big time saver and has made the job more manageable.

But not everyone agrees. There are educators who are concerned that districts are buying into the AI hype too quickly and without enough skepticism.

A nationally representative EdWeek Research Center survey of 559 K-12 educators conducted during the summer found that they are split on whether AI platforms will have a negative or positive impact on teaching and learning in the next five years: 47% say AI’s impact will be negative, while 43% say it will be positive.

Education Week talked to three veteran teachers who are not using generative AI regularly in their work and are concerned about the potential negative effects the technology will have on teaching and learning.

Here’s what they think about using generative AI in K-12.

AI provides ‘shortcuts’ that are not conducive for learning

Dylan Kane, a middle school math teacher at Lake County High School in Leadville, Colo., isn’t “categorically against AI,” he said.

He has experimented with the technology personally, using it to help him improve his Spanish-language skills. AI is a “half decent” Spanish tutor if you understand its limitations, he said. For his teaching job, Kane, like many other teachers, has experimented with AI tools to generate student materials, but he said it takes too many rounds of prompting to produce something he would actually put in front of his classes.

“I will do a better job just doing it myself and probably take less time to do so,” said Kane, who is in his 14th year of teaching. Creating student materials himself means he can be “more intentional” about the questions he asks, how they’re sequenced, how they fit together, how they build on each other, and what students already know.

His biggest concern is how generative AI will affect educators’ and students’ critical-thinking skills. Too often, people are using these tools to take “shortcuts,” he said.

“If I want students to learn something, I need them to be thinking about it and not finding shortcuts to avoid thinking,” Kane said.

The best way to prepare students for an AI-powered future is to “give them a broad and deep collection of knowledge about the world and skills in literacy, math, history and civics, and science,” so they’ll have the knowledge they need to understand if an AI tool is providing them with a helpful answer, he said.

That’s true for teachers, too, Kane said. The reason he can evaluate whether AI-generated material is accurate and helpful is because of his years of experience in education.

“One of my hesitations about using large language models is that I won’t be developing skills as a teacher and thinking really hard about what things I put in front of students and what I want them to be learning,” Kane said. “I worry that if I start leaning heavily on large language models, that it will stunt my growth as a teacher.”

And the fact that teachers have to use generative AI tools to create student materials “points to larger issues in the teaching profession” around the curricula and classroom resources teachers are given, Kane said. AI is not “an ideal solution. That’s a Band-Aid for a larger problem.”

Kane’s open to using AI tools. For instance, he said he finds generative AI technology helpful for writing word problems. But educators should “approach these things with a ton of skepticism and really ask ourselves: ‘Is this better than what we should be doing?’”

Experts and leaders haven’t provided good justifications for AI use in K-12

Jed Williams, a high school math and science teacher in Belmont, Mass., said he hasn’t heard any good justifications for why generative AI should be implemented in schools.

The way AI is being presented to teachers tends to be “largely uncritical,” said Williams, who teaches computer science, physics, and robotics at Belmont High School. Often, professional development opportunities about AI don’t provide a “critical analysis” of the technology and just “check the box” by mentioning that AI tools have downsides, he said.

For instance, one professional development session he attended spent only “a few seconds” on the downsides of AI tools, Williams said. The session covered the issue of overreliance on AI, but he criticized it for not talking about “labor exploitation, overuse of resources, sacrificing the privacy of students and faculty.”

“We have a responsibility to be skeptical about technologies that we bring into the classroom,” Williams said, especially because there’s a long history of ed-tech adoption failures.

Williams, who has been teaching since 2006, is also concerned that AI tools could decrease students’ cognitive abilities.

“So much of learning is being put into a situation that is cognitively challenging,” he said. “These tools, fundamentally, are built on relieving the burden of cognitive challenge.

“Especially in introductory courses, where students aren’t familiar with programming and you want them to try new things and experiment and explore, why would you give them this tool that completely removes those aspects that are fundamental to learning?” Williams said.

Williams is also worried that a rushed implementation of AI tools would sacrifice students’ and teachers’ privacy and use them as “experimental subjects in developing technologies for tech companies.”

Education leaders “have a tough job,” Williams said. He understands the pressure they feel around implementing AI, but he hopes they give it “critical thought.”

Decisionmakers need to be clear about what technology is being proposed, how they anticipate teachers and students using it, what the goal of its use is, and why they think it’s a good technology to teach students how to use, Williams said.

“If somebody has a good answer for that, I’m very happy to hear proposals on how to incorporate these things in a healthy, safe way,” he said.

Educators shouldn’t fall for the ‘fallacy’ that AI is inevitable

Elizabeth Bacon, a middle school computer science teacher in California, hasn’t found any use cases with generative AI tools that she feels will be beneficial for her work.

“I would rather do my own lesson plan,” said Bacon, who has been teaching for more than 20 years. “I have an idea of what I want the students to learn, of what’s interesting to them, and where they are and the entry points for them to engage in it.”

Teachers are under a lot of pressure to do more with less, which is why Bacon said she doesn’t judge other teachers who use AI to get the job done. It’s “a systemic problem,” she said, but teaching and learning shouldn’t be replaced by machines.

Bacon believes it’s “particularly dangerous” for middle school students to be using “a machine emulating a person.” Students are still developing their character, their empathy, their ability to socialize with peers and work collectively toward a goal, she said, and a chatbot would undermine that.

She can foresee using generative AI tools to show her students what large language models are. It’s important for them to learn about generative AI, Bacon said: that it’s a statistical model predicting the next likely word based on the data it’s been trained on, and that there’s no meaning [or feelings] behind it.
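To make that point concrete, a teacher could walk a class through a toy next-word predictor. The sketch below is a hypothetical classroom illustration only, a simple bigram counter over a made-up corpus rather than how any production chatbot is built, but it shows the core idea Bacon describes: choosing the statistically likely next word from training data.

```python
# Toy next-word predictor: a bigram counter over a tiny made-up corpus.
# Purely illustrative -- real large language models are vastly more complex.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the rug".split()

# Count which words follow each word in the tiny training text.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the corpus."""
    counts = following.get(word)
    if not counts:
        return None  # no statistics for words the model never saw
    return counts.most_common(1)[0][0]

print(predict_next("the"))  # "cat" -- it follows "the" twice; "mat" and "rug" once each
print(predict_next("cat"))  # "sat" -- ties are broken by first appearance in the text
```

The predictor has no understanding of cats or rugs; it only reports which word was statistically most likely to come next in its training data, which is the distinction Bacon wants students to grasp.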

Last school year, she asked her high school students what they wanted to know about AI. Their answers: the technology’s social and environmental impacts.

Bacon doesn’t think educators should fall for the “fallacy” that AI is the inevitable future, she said, because the technology companies saying so have an incentive to say it.

“Educators have basically been told, in a lot of ways, ‘don’t trust your own instincts about what’s right for your students, because [technology companies are] going to come in and tell you what’s going to be good for your students,’” she said.

It’s discouraging to see that a lot of the AI-related professional development events she’s attended have “essentially been AI evangelism” and “product marketing,” she said. There should be more thought about why this technology is necessary in K-12, she said.

Technology experts have talked up AI’s potential to increase productivity and efficiency. But as an educator, “efficiency is not one of my values,” Bacon said.

“My value is supporting students, meeting them where they are, taking the time it takes to connect with these students, taking the time that it takes to understand their needs,” she said. “As a society, we have to take a hard look: Do we value education? Do we value doing our own thinking?”







Databricks at a crossroads: Can its AI strategy prevail without Naveen Rao?



“Databricks is in a tricky spot with Naveen Rao stepping back. He was not just a figurehead, but deeply involved in shaping their AI vision, particularly after MosaicML,” said Robert Kramer, principal analyst at Moor Insights & Strategy.

“Rao’s absence may slow the pace of new innovation slightly, at least until leadership stabilizes. Internal teams can keep projects on track, but vision-driven leaps, like identifying the ‘next MosaicML’, may be harder without someone like Rao at the helm,” Kramer added.

Rao joined Databricks in 2023 after the data lakehouse provider acquired MosaicML, a company he co-founded, for $1.3 billion. During his tenure, Rao was instrumental in leading research for many Databricks products, including Dolly, DBRX, and Agent Bricks.





NFL player props, odds: Week 2, 2025 NFL picks, SportsLine Machine Learning Model AI predictions, SGP



The Under went 12-4 in Week 1, indicating that not only were there fewer points scored than expected, but there were also fewer yards gained. Backing the Under with NFL prop bets was likely profitable for the opening slate of games, but will that trend hold for Week 2 NFL props? Interestingly, four of the five highest-scoring games last week were the primetime games, so if that pattern continues, the Overs for this week’s night games could be attractive Week 2 NFL player props.

There’s a Monday Night Football doubleheader featuring star pass catchers like Nico Collins, Mike Evans and Brock Bowers. The games also feature promising rookies such as Ashton Jeanty, Omarion Hampton and Emeka Egbuka. Prop lines are usually all over the place early in the season as sportsbooks attempt to establish a player’s potential, and you could take advantage of this with the right NFL picks. If you are looking for NFL prop bets or NFL parlays for Week 2, SportsLine has you covered with the top Week 2 player props from its Machine Learning Model AI.

Built using cutting-edge artificial intelligence and machine learning techniques by SportsLine’s Data Science team, AI Predictions and AI Ratings are generated for each player prop. 

Now, with the Week 2 NFL schedule quickly approaching, SportsLine’s Machine Learning Model AI has identified the top NFL props from the biggest Week 2 games.

Week 2 NFL props for Sunday’s main slate

After analyzing the NFL props from Sunday’s main slate and examining the dozens of NFL player prop markets, SportsLine’s Machine Learning Model AI says Lions receiver Amon-Ra St. Brown goes Over 63.5 receiving yards (-114) versus the Bears at 1 p.m. ET. Detroit will host this contest, which is notable as St. Brown has averaged 114 receiving yards over his last six home games. He had at least 70 receiving yards in both matchups versus the Bears a year ago.

Chicago allowed 12 receivers to go Over 63.5 receiving yards last season, as the Bears’ pass defense is adept at keeping opponents out of the end zone but not as good at preventing yardage. Chicago allowed the highest yards per attempt and the second-highest yards per completion in 2024. While St. Brown had just 45 yards in the opener, the last time he was held under 50 receiving yards, he had 193 yards the following week. The SportsLine Machine Learning Model projects 82.5 yards for St. Brown in a 4.5-star pick. See more Week 2 NFL props here.
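Each of these picks comes down to the same comparison: the model’s projected yardage versus the posted line at its price. The Python sketch below is purely illustrative of that comparison; the normal-distribution assumption and the 35-yard spread are guesses for the sake of the example, not SportsLine’s actual methodology. The line (63.5), price (-114), and projection (82.5) are the figures cited above.

```python
# Illustrative sketch only: comparing a yardage projection to a prop line.
# The distribution and spread are assumptions, not SportsLine's model.
from statistics import NormalDist

def implied_probability(american_odds):
    """Convert an American price to the sportsbook's break-even probability."""
    if american_odds < 0:
        return -american_odds / (-american_odds + 100)
    return 100 / (american_odds + 100)

line = 63.5            # posted receiving-yards total (from the article)
price = -114           # price on the Over (from the article)
projection = 82.5      # model's projected yards (from the article)
yardage_spread = 35.0  # assumed game-to-game standard deviation -- a guess for illustration

# Probability the player clears the line if outcomes are roughly normal
# around the projection (a simplifying assumption).
p_over = 1 - NormalDist(mu=projection, sigma=yardage_spread).cdf(line)
break_even = implied_probability(price)

print(f"estimated P(over) = {p_over:.2f}, break-even at -114 = {break_even:.2f}")
# A projection whose estimated probability sits comfortably above the
# break-even number is what makes a prop look attractive in this toy framing.
```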

Week 2 NFL props for Vikings vs. Falcons on Sunday Night Football

After analyzing Falcons vs. Vikings props and examining the dozens of NFL player prop markets, SportsLine’s Machine Learning Model AI says Falcons running back Bijan Robinson goes Over 65.5 rushing yards (-114). Robinson ran for 92 yards and a touchdown in Week 14 of last season versus Minnesota, despite the Vikings having the league’s No. 2 run defense a year ago. The SportsLine Machine Learning Model projects Robinson to average 81.8 yards in a 4.5-star prop pick. See more NFL props for Vikings vs. Falcons here.

You can make NFL prop bets on Robinson, Justin Jefferson and others with the Underdog Fantasy promo code CBSSPORTS2. Pick at Underdog Fantasy and get $50 in bonus funds after making a $5 wager:

Week 2 NFL props for Buccaneers vs. Texans on Monday Night Football

After analyzing Texans vs. Buccaneers props and examining the dozens of NFL player prop markets, SportsLine’s Machine Learning Model AI says Bucs quarterback Baker Mayfield goes Under 235.5 passing yards (-114). While Houston has questions regarding its offense, there’s little worry about the team’s pass defense. In 2024, Houston had the second-most interceptions and the fourth-most sacks, and it allowed the fourth-worst passer rating. Since the start of last year, and including the playoffs, the Texans have held opposing QBs under 235.5 yards in 13 of 20 games. The SportsLine Machine Learning Model forecasts Mayfield to finish with just 200.1 passing yards, making the Under a 4-star NFL prop. See more NFL props for Buccaneers vs. Texans here.

You can also use the latest FanDuel promo code to get $300 in bonus bets instantly:

Week 2 NFL props for Chargers vs. Raiders on Monday Night Football

After analyzing Raiders vs. Chargers props and examining the dozens of NFL player prop markets, SportsLine’s Machine Learning Model AI says Chargers quarterback Justin Herbert goes Under 254.5 passing yards (-114). The Raiders’ defense was underrated at preventing big passing plays a year ago, ranking third in the NFL in average depth of target allowed. It forced QBs to dink and dunk their way down the field, which doesn’t lead to big passing yardage, and L.A. generally prefers not to throw the ball anyway. Just four teams attempted fewer passes last season than the Chargers, and with L.A. running for 156.5 yards versus Vegas last season, Herbert shouldn’t be overly active on Monday night. He’s forecast to have 221.1 passing yards in a 4.5-star NFL prop bet. See more NFL props for Chargers vs. Raiders here.

How to make Week 2 NFL prop picks

SportsLine’s Machine Learning Model has identified another star who sails past his total and has dozens of NFL props rated 4 stars or better. You need to see the Machine Learning Model analysis before making any Week 2 NFL prop bets.

Which NFL prop picks should you target for Week 2, and which quarterback has multiple 5-star rated picks? Visit SportsLine to see the latest NFL player props from SportsLine’s Machine Learning Model that uses cutting-edge artificial intelligence to make its projections.







In the News: Thomas Feeney on AI in Higher Education – Newsroom



“I had an interesting experience over the summer teaching an AI ethics class. You know plagiarism would be an interesting question in an AI ethics class … They had permission to use AI for the first written assignment. And it was clear that many of them had just fed in the prompt, gotten back the paper and uploaded that. But rather than initiate a sort of disciplinary oppositional setting, I tried to show them, look, what you’ve produced is kind of generic … and this gave the students a chance to recognize that they weren’t there in their own work. This opened the floodgates,” Feeney said.

“I think the focus should be less on learning how to work with the interfaces we have right now and more on just graduate with a story about how you did something with AI that you couldn’t have done without it. And then, crucially, how you shared it with someone else,” he continued.



