AI Insights

Film Schools Are Embracing AI. Should They?

Jake Panek, a 20-year-old film major, says he’s had a great time at DePaul University in Chicago, and a very positive experience with the school’s cinema program. However, a recent email alerting students to a new course in “AI screenwriting” tapped a wellspring of rage in him.

The email, which was circulated last week, offered undergrads the opportunity to examine “the rapidly evolving role of artificial intelligence in the screenwriting process” and to help students “explore how AI can support and enhance creativity in writing for film and television.” Panek wasn’t having it.

Not long after he received the email, the young filmmaker was so angry that he took to Instagram to express his thoughts: “seeing this email made me embarrassed to be a depaul film student,” he wrote, tagging the school and its film program. “I think the professor that’ll be ‘teaching’ this course, every student who enrolls in this course, and everyone who is allowing this course to happen should seriously reconsider it—THIS COURSE SHOULD NOT BE A THING.”

When Panek talked to me about the program, his disdain for the class didn’t seem to have waned much. “I think it’s bullshit,” he told me. “I’m just so angry at the very existence of it.”

DePaul’s School of Cinematic Arts is considered one of the top film programs in the country, and it has often distinguished itself by allowing the student body to access cutting-edge equipment and software. Recently, however, officials at the school have become interested in AI. In May, the film program held an “AI in the Arts” symposium, designed to explore “the transformative role of Artificial Intelligence in the arts.” Even before this event, certain people within the administration had been pushing the film program to further explore integrating AI into its curriculum, said Matthew Quinn, the DePaul professor who has been tasked with teaching the new screenwriting course.

“Our school, the School of Cinematic Arts, is in the College of Computing and Digital Media,” said Quinn. “Our dean is from the School of Computing, so they’re of course very invested in AI.” DePaul also has an AI task force, he added, noting that there was a university-wide effort to study and integrate AI into the curriculum.

What does an “AI screenwriting” course entail? Quinn said that the course was very similar to other screenwriting courses that he’d taught, with the key difference being that generative AI was used to help create and shape the material. “So, like when it comes to generating log lines and then working on character bios and character development, and then ultimately culminating with a step outline [a step outline is a summary of a script’s scenes],” Quinn said that students in the class would “collaborate” with ChatGPT. Later, students would participate in a workshop where their assignments were discussed, Quinn said. Students would talk about their creative process, discuss their collaborations with the chatbots, and explain what was “helpful” and what wasn’t. The class was designed to replicate “the process of developing a script,” he said.

Quinn also noted that DePaul’s film program currently has a policy requiring students to acknowledge any use of AI in screenwriting. If students use it, they have to explain why and how, he said. Whether students can use AI in that way at all is at each faculty member’s discretion, he added.

DePaul isn’t the only film school that has begun to offer AI-related courses. The University of Southern California recently launched an AI for Media and Storytelling studio, which is designed to explore how the tech can be integrated into the film, media, and journalism industries. UCLA Extension recently launched a new course called “Creative Process in the Age of AI,” and even the American Film Institute has dipped its toes into the space, having debuted a three-day seminar series on “Storytelling and AI” earlier this year.

Holly Willis, the co-director of USC’s AI studio, said the school got interested in developing a new AI program in 2023, not long after the release of ChatGPT and the groundswell of cultural interest in generative AI. “It was kind of around that same time,” Willis said. “I think at that moment, it was such a huge thing, we realized ‘Oh, this is a really important shift’,” she said.

Willis, who has now taught multiple courses examining potential creative applications of AI, describes herself as “deeply critical” of the technology but also said that she was “very excited about [the] new forms of storytelling” that the tech could provide. “I think there are definite problems with generative AI and how it’s been introduced to filmmakers and artists, and how, right now, much of the ownership of the tools is within a corporate context,” she said. “But at the same time, the work that I’m seeing artists do is really exciting.”

In an article about AI’s use in the arts, Willis highlights the work of Souki Mansoor, a former documentarian who “stumbled into the AI filmmaking rabbit hole” and now works in the tech industry. Mansoor, who served as a guest speaker in one of Willis’ AI-themed classes, currently works for OpenAI as “Sora Artist Program Lead,” according to her LinkedIn profile. It’s unclear what that really means, but Mansoor, who describes herself as a “recovering filmmaker,” has produced some visual pieces using platforms like the ones OpenAI is currently marketing. Indeed, in 2023, she “generated” a short film dubbed An AI Dreams of Dogfish, using prompts entered into RunwayML’s Gen-2.

While Willis expresses excitement for works of the sort that Mansoor has produced, she notes that some of her students seem a little concerned about the infusion of AI into the arts. “I would say that students are very nervous,” she said. “The first class I taught when we started this initiative, students were very wary…like, ‘Why are we paying for this education when anyone can now create these images so easily?’” They didn’t realize that you still needed “skills and storytelling,” she said.

As for DePaul’s AI screenwriting course, Quinn said he hasn’t seen a ton of pushback from students, but there doesn’t appear to be a whole lot of interest either. “Right now, there’s not even that many students enrolled in it,” Quinn told me. “It might not even run.” He further clarified that the course wasn’t about mindlessly embracing AI. Instead, he described it as a workshop designed to expose students to different perspectives on the “current state-of-play” of the technology and what it could potentially offer creatives. Quinn admitted that he, himself, was “conflicted” about AI’s use in the creative arts. “It’s not like I’m a huge proponent of AI and love AI,” he said. “It’s more like, as an educator, I feel like I’m doing a disservice to students if I’m not exposing them to this or pretending like it’s not happening.” Quinn wants students to make an informed decision on whether they want to engage with AI or not, and to do that, they need to understand it.

For students like Panek, however, the whole thing seems like a huge betrayal of the fundamentals of the creative process. “I understand the desire, as an artist, to take a shortcut,” Panek offered. After all, making movies is really difficult, and it can often feel like the world is against you. But Panek said that he and his fellow students find their own ways to solve problems—that’s part of making movies. “Taking the shortcut of generative AI” ultimately “doesn’t do anything for anyone,” he said. “You’re not gaining anything by typing something into a computer and having it spit something back at you,” he added.

“Filmmaking is hard,” Panek said, while noting that if “you’re not willing to…find your own solutions to things, and your first thought is just, ‘Oh, well generative AI exists, let’s just use that’” it’s hard to really call yourself an artist.







AI Can Generate Code. Is That a Threat to Computer Science Education?

Some of Julie York’s high school computer science students are worried about what generative artificial intelligence will mean for future careers in the tech industry. If generative AI can code, then what is left for them to do? Will those jobs they are working toward still be available by the time they graduate? Is it still worth it to learn to code?

They are “worried about not being necessary anymore,” said York, who teaches at South Portland High School in South Portland, Maine. “The biggest fear is, if the computer can do this, then what can I do?”

The anxieties are fueled by the current landscape of the industry: Many technology companies are laying off employees, with some linking the layoffs to the rise of AI. CEOs are embracing AI tools, making public statements that people don’t need to learn to code anymore and that AI tools can replace lower or mid-level software engineers.

However, many computer science education experts disagree with the idea that AI will make learning to code obsolete.

Technology CEOs “have an economic interest in making that argument,” said Philip Colligan, the chief executive officer of the Raspberry Pi Foundation, a U.K.-based global nonprofit focused on computer science education. “But I do think that argument is not only wrong, but it’s also dangerous.”

While computer science education experts acknowledged the uncertainty of the current job market, they argued it’s still valuable for students to learn to code along with foundational computer science principles, because those are the skills that will help them better navigate an AI-powered world.

Why teaching and learning coding is still important, even if AI can spit out code

The Raspberry Pi Foundation published a position paper in June outlining five arguments why kids still need to learn to code in the age of AI. In an interview with Education Week, Colligan described them briefly:

  1. We need skilled human programmers who can guide, control, and critically evaluate AI outputs.
  2. Learning to code is an essential part of learning to program. “It is through the hard work of learning to code that [students] develop computational thinking skills,” Colligan said.
  3. Learning to code will open up more opportunities in the age of AI. It’s likely that as AI seeps into other industries, it will lead to more demand for computer science and coding skills, Colligan said.
  4. Coding is a literacy that helps young people have agency in a digital world. “Lots of the decisions that affect our lives are already being taken by AI systems,” Colligan said, and with computer science literacy, people have “the ability to challenge those automated decisions.”
  5. The kids who learn to code will shape the future. They’ll get to decide what technologies to build and how to build them, Colligan said.

Hadi Partovi, the CEO and founder of Code.org, agreed that the value of computer science isn’t just economic. It’s also about “equipping students with the foundation to navigate an increasingly digital world,” he wrote in a LinkedIn blog post. These skills, he said, matter even for students who don’t pursue tech careers.

“Computer science teaches problem-solving, data literacy, ethical decision-making and how to design complex systems,” Partovi wrote. “It empowers students not just to use technology but to understand and shape it.”

With her worried students, York said it’s her job as a teacher to reassure them that their foundational skills are still necessary, that AI can’t do anything on its own, that they still need to guide the tools.

“By teaching those foundational things, you’re able to use the tools better,” York said.

Computer science education should evolve with emerging technologies

If foundational computer science skills are even more valuable in a world increasingly powered by AI, then does the way teachers teach them need to change? Yes, according to experts.

“There is a new paradigm of computing in the world, which is this probabilistic, data-driven model, and that needs to be integrated into computer science classes,” said Colligan.

The Computer Science Teachers Association this year released its AI learning priorities, asserting that:

  1. All students should understand how AI technologies work and where they might be used.
  2. Students should be able to use and critically evaluate AI systems, including their societal impacts and ethical considerations.
  3. Students should be able to create, and not just consume, AI technologies responsibly.
  4. Students should be innovative and persistent in solving problems with AI.

Some computer science teachers are already teaching about and modeling AI use with their students. York, for instance, allows her students to use large language models for brainstorming, to troubleshoot bugs in their code, or to help them get unstuck in a problem.

“It replaced the coding ducks,” York said. “It’s a method in computer science classes where you put a rubber duck in front of the student, and they talk through their problem to the duck. The intention is that, when you talk to a duck and you explain your problem, you kind of figure out what you want to say and what you want to do.”

The rise of generative AI in K-12 could also mean that educators need to rethink their assignments and assessments, said Allen Antoine, the director of computer science education strategy for the Texas Advanced Computing Center at the University of Texas at Austin.

“You need to do small tweaks of your lesson design,” Antoine said. “You can’t just roll out the same lesson you’ve been doing in CS for the last 20 years. Keep the same learning objective. Understand that the students need to learn this thing when they walk out. But let’s add some AI to have that discussion, to get them hooked into the assignment but also to help them think about how that assignment has changed now that they have access to these 21st century tools.”

But computer science education and AI literacy shouldn’t just be confined to computer science classes, experts said.

“All young people need to be introduced to what AI systems are, how they’re built, their potential, limitations and so on,” Colligan said. “The advent of AI technologies is opening up many more opportunities across the economy for kids who understand computers and computer science to be able to change the world for the better.”

What educators need in order to prepare students for what’s next

The challenge in making AI literacy and computer science cross-curricular is not new in education: Districts need more funding to provide teachers with the resources they need to teach AI literacy and other computer science skills, and educators need dedicated time to attend professional development opportunities, experts said.

“There are a lot of smart people across the nation who are developing different projects, different teacher professional development ideas,” Antoine said. “But there has to be some kind of a commitment from the top down to say that it’s important.”

The Trump administration has made AI in education a focus area: President Donald Trump, in April, signed an executive order that called for infusing AI throughout K-12 education. The U.S. Department of Education, in July, added advancing the use of AI in education as one of its proposed priorities for discretionary grant programs. And in August, first lady Melania Trump launched the Presidential AI Challenge for students and teachers to solve problems in their schools and communities with the help of AI.

The Trump administration’s AI push comes amid its substantial cuts to K-12 education and research.

Still, Antoine said he’s “optimistic that really good things are going to come from the new focus on AI.”








Google’s top AI scientist says ‘learning how to learn’ will be next generation’s most needed skill

ATHENS, Greece — A top Google scientist and 2024 Nobel laureate said Friday that the most important skill for the next generation will be “learning how to learn” to keep pace with change as artificial intelligence transforms education and the workplace.

Speaking at an ancient Roman theater at the foot of the Acropolis in Athens, Demis Hassabis, CEO of Google’s DeepMind, said rapid technological change demands a new approach to learning and skill development.

“It’s very hard to predict the future, like 10 years from now, in normal cases. It’s even harder today, given how fast AI is changing, even week by week,” Hassabis told the audience. “The only thing you can say for certain is that huge change is coming.”

The neuroscientist and former chess prodigy said artificial general intelligence — a futuristic vision of machines that are as broadly smart as humans or at least can do many things as well as people can — could arrive within a decade. This, he said, will bring dramatic advances and a possible future of “radical abundance” despite acknowledged risks.

Hassabis emphasized the need for “meta-skills,” such as understanding how to learn and optimizing one’s approach to new subjects, alongside traditional disciplines like math, science and humanities.

“One thing we’ll know for sure is you’re going to have to continually learn … throughout your career,” he said.

The DeepMind co-founder, who established the London-based research lab in 2010 before Google acquired it four years later, shared the 2024 Nobel Prize in chemistry for developing AI systems that accurately predict protein folding — a breakthrough for medicine and drug discovery.

Photo: Greece’s Prime Minister Kyriakos Mitsotakis, left, and Demis Hassabis, CEO of Google’s artificial intelligence research company DeepMind, discuss the future of AI, ethics and democracy during an event at the Odeon of Herodes Atticus, in Athens, Greece, Friday, Sept. 12, 2025. Credit: AP/Thanassis Stavrakis

Greek Prime Minister Kyriakos Mitsotakis joined Hassabis at the Athens event after discussing ways to expand AI use in government services. Mitsotakis warned that the continued growth of huge tech companies could create great global financial inequality.

“Unless people actually see benefits, personal benefits, to this (AI) revolution, they will tend to become very skeptical,” he said. “And if they see … obscene wealth being created within very few companies, this is a recipe for significant social unrest.”

Mitsotakis thanked Hassabis, whose father is Greek Cypriot, for rescheduling the presentation to avoid conflicting with the European basketball championship semifinal between Greece and Turkey. Greece later lost the game 94-68.







Artificial Intelligence Cheating – The Quad-City Times





