Education
AI, Irreality and the Liberal Educational Project (opinion)

I work at Marquette University. As a Roman Catholic, Jesuit university, we’re called to be an academic community that, as Pope John Paul II wrote, “scrutinize[s] reality with the methods proper to each academic discipline.” That’s a tall order, and I remain in the academy, for all its problems, because I find that job description to be the best one on offer, particularly as we have the honor of practicing this scrutinizing along with ever-renewing groups of students.
This bedrock assumption of what a university is continues to give me hope for the liberal educational project despite the ongoing neoliberalization of higher education and some administrators’ and educators’ willingness to either look the other way regarding or uncritically celebrate the generative software (commonly referred to as “generative artificial intelligence”) explosion over the last two years.
In the time since my last essay in Inside Higher Ed, and as Marquette’s director of academic integrity, I’ve had plenty of time to think about this and to observe praxis. In contrast to the earlier essay, which was more philosophical, let’s get more practical here about how access to generative software is impacting higher education and our students and what we might do differently.
At the academic integrity office, we recently had a case in which a student “found an academic article” by prompting ChatGPT to find one for them. The chat bot obeyed, as mechanisms do, and generated a couple pages of text with a title. This was not from any actual example of academic writing but instead was a statistically probable string of text having no basis in the real world of knowledge and experience. The student made a short summary of that text and submitted it. They were, in the end, not found in violation of Marquette’s honor code, since what they submitted was not plagiarized. It was a complex situation to analyze and interpret, done by thoughtful people who care about the integrity of our academic community: The system works.
In some ways, though, such activity is more concerning than plagiarism, for, at least when students plagiarize, they tend to know the ways they are contravening social and professional codes of conduct—the formalizations of our principles of working together honestly. In this case, the student didn’t see the difference between a peer-reviewed essay published by an academic journal and a string of probabilistically generated text in a chat bot’s dialogue box. To not see the difference between these two things—or to not care about that difference—is more disconcerting to me than straightforward breaches of an honor code, however harmful and sad such breaches are.
I already hear folks saying: “That’s why we need AI literacy!” We do need to educate our students (and our colleagues) on what generative software is and is not. But that’s not enough. Because one also needs to want to understand and, as is central to the Ignatian Pedagogical Paradigm that we draw upon at Marquette, one must understand in context.
Another case this spring term involved a student whom I had spent several months last fall teaching in a writing course that took “critical AI” as its subject matter. Yet this spring the student still used a chat bot to “find a quote in a YouTube video” for an assignment and then commented briefly on that quote. The problem was that the quote used in the assignment does not appear in the selected video. It was a simulacrum of a quote; it was a string of probabilistically generated text, which is all generative software can produce. It did not accurately reflect reality, and the student did not cite the chat bot they’d copied and pasted from, so they were found in violation of the honor code.
Another student last term in the Critical AI class prompted Microsoft Copilot to give them quotations from an essay, which it mechanically and probabilistically did. They proceeded to base their three-page argument on these quotations, none of which said anything like what the author in question actually said (not even the same topic); their argument was based in irreality. We cannot scrutinize reality together if we cannot see reality. And many of our students (and colleagues) are, at least at times, not seeing reality right now. They’re seeing probabilistic text as “good enough” as, or conflated with, reality.
Let me point more precisely to the problem I’m trying to put my finger on. The student who had a chat bot “find” a quote from a video sent an email to me, which I take to be completely in earnest and much of which I appreciated. They ended the email by letting me know that they still think that “AI” is a really powerful and helpful tool, especially as it “continues to improve.” The cognitive dissonance between the situation and the student’s assertion took me aback.
Again: the problem with the “We just need AI literacy” argument. People tend not to learn what they do not want to learn. If our students (and people generally) do not particularly want to do work, and they have been conditioned by the use of computing and their society’s habits to see computing as an intrinsic good, “AI” must be a powerful and helpful tool. It must be able to do all the things that all the rich and powerful people say it does. It must not need discipline or critical acumen to employ, because it will “supercharge” your productivity or give you “10x efficiency” (whatever that actually means). And if that’s the case, all these educators telling you not to offload your cognition must be behind the curve, or reactionaries. At the moment, we can teach at least some people all about “AI literacy” and it will not matter, because such knowledge refuses to jibe with the mythology concerning digital technology so pervasive in our society right now.
If we still believe in the value of humanistic, liberal education, we cannot be quiet about these larger social systems and problems that shape our pupils, our selves and our institutions. We cannot be quiet about these limits of vision and questioning. Because not only do universities exist for the scrutinizing of reality with the various methods of the disciplines as noted at the outset of this essay, but liberal education also assumes a view of the human person that does not see education as instrumental but as formative.
The long tradition of liberal education, for all its complicity in social stratification down the centuries, assumes that our highest calling is not to make money, to live in comfort, to be entertained. (All three are all right in their place, though we must be aware of how our moneymaking, comfort and entertainment derive from the exploitation of the most vulnerable humans and the other creatures with whom we share the earth, and how they impact our own spiritual health.)
We are called to growth and wisdom, to caring for the common good of the societies in which we live—which at this juncture certainly involves caring for our common home, the Earth, and the other creatures living with us on it. As Antiqua et nova, the note released from the Vatican’s Dicastery for Culture and Education earlier this year (cited commendingly by secular ed-tech critics like Audrey Watters) reiterates, education plays its role in this by contributing “to the person’s holistic formation in its various aspects (intellectual, cultural, spiritual, etc.) … in keeping with the nature and dignity of the human person.”
These objectives of education are not being served by students using generative software to satisfy their instructors’ prompts. And no amount of “literacy” is going to ameliorate the situation on its own. People have to want to change, or to see through the neoliberal, machine-obsessed myth, for literacy to matter.
I do believe that the students I’ve referred to are generally striving for the good as best they know how. On a practical level, I am confident they’ll go on to lead modestly successful lives as our society defines that term with regard to material well-being. I assume their motivation is not to cause harm or dupe their instructors; they’re taking part in “hustle” culture, “doing school” and are possibly overwhelmed by all their commitments. Even if all this is indeed the case, liberal education calls us to more, and it’s the role of instructors and administrators to invite our students into that larger vision again and again.
If we refuse to give up on humanistic, liberal education, then what do we do? The answer is becoming clearer by the day, with plenty of folks all over the internet weighing in, though it is one many of us do not really want to hear. Because at least one major part of the answer is that we need to make an education genuinely oriented toward our students. A human-scale education, not an industrial-scale education (let’s recall over and over that computers are industrial technology). The grand irony of the generative software moment for education in neoliberal, late-capitalist society is that it is revealing so many of the limits we’ve been putting on education in the first place.
If we can’t “AI literacy” our educational problems away, we have to change our pedagogy. We have to change the ways we interact with our students inside the classroom and out: to cultivate personal relationships with them whenever possible, to model the intellectual life as something that is indeed lived out with the whole person in a many-partied dialogue stretching over millennia, decidedly not as the mere ability to move information around. This is not a time for dismay or defeat but an incitement to do the experimenting, questioning, joyful intellectual work many of us have likely wanted to do all along but have not had a reason to go off script for.
This probably means getting creative. Part of getting creative in our day probably means de-computing (as Dan McQuillan at the University of London labels it). To de-compute is to ask ourselves—given our ambient maximalist computing habits of the last couple decades—what is of value in this situation? What is important here? And then: Does a computer add value to this that it is not detracting from in some other way? Computers may help educators collect assignments neatly and read them clearly, but if that convenience is outweighed by constantly having to wonder if a student has simply copied and pasted or patch-written text with generative software, is the value of the convenience worth the problems?
Likewise, getting creative in our day probably means looking at the forms of our assessments. If the highly structured student essay makes it easier for instructors to assess because of its regularity and predictability, yet that very regularity and predictability make it a form that chat bots can produce fairly readily, well: 1) the value for assessing may not be worth the problems of teeing up chat bot–ifiable assignments and 2) maybe that wasn’t the best form for inviting genuinely insightful and exciting intellectual engagement with our disciplines’ materials in the first place.
I’ve experimented with research journals rather than papers, with oral exams as structured conversations, with essays that focus intently on one detail of a text and do not need introductions and conclusions and that privilege the student’s own voice, and other in-person, handmade, leaving-the-classroom kinds of assessments over the last academic year. Not everything succeeded the way I wanted, but it was a lively, interactive year. A convivial year. A year in which mostly I did not have to worry about whether students were automating their educations.
We have a chance as educators to rethink everything in light of what we want for our societies and for our students; let’s not miss it because it’s hard to redesign assignments and courses. (And it is hard.) Let’s experiment, for our own sakes and for our students’ sakes. Let’s experiment for the sakes of our institutions that, though they are often scoffed at in our popular discourse, I hope we believe in as vibrant communities in which we have the immense privilege of scrutinizing reality together.
Education
Harnessing AI thoughtfully will be critical to prepare for tomorrow’s workforce, says US Education Secretary Linda McMahon: Here’s what students must learn

The American classroom is standing at a historic crossroads. For decades, the defining debates in education were about funding formulas, standardized tests, and curriculum design. Now, as artificial intelligence (AI) reshapes industries at a pace unseen since the Industrial Revolution, the urgent question is not whether schools should adopt technology—but how they should wield it responsibly to prepare the workforce of tomorrow.

That was the core message delivered yesterday by U.S. Secretary of Education Linda McMahon during her visit to Austin, Texas, a stop on her Returning Education to the States Tour. Touring Alpha School—a private K–8 institution that has embedded AI into its instructional model—McMahon witnessed firsthand how algorithms can tailor learning to a child’s individual pace and strengths. Later, at the University of Austin, she sat down with President Carlos Carvalho to explore how higher education is cultivating innovation in the age of machine intelligence.

“Harnessing AI thoughtfully will be critical to expanding opportunity and preparing students for tomorrow’s workforce,” McMahon said in a press release. “During my visit to Austin, I saw how AI can open doors, but also how it is curiosity, critical thinking, and open debate that drive learning. As we return education to the states, it’s vitally important that innovation is guided by the fundamentals, so that the classroom remains a true marketplace of ideas.”

Her words carry weight at a moment when both the promises and perils of AI in education are becoming more visible. Proponents see AI as a force multiplier, personalizing instruction, automating routine tasks for teachers, and exposing students to digital tools they will inevitably encounter in future careers. Critics, however, warn of algorithmic bias, privacy concerns, and the risk of reducing education to data-driven efficiency rather than human inquiry.

McMahon’s choice of Austin as a showcase is telling.
The city is fast becoming a national hub for tech-driven education reform. Alpha School has built its reputation on leveraging AI platforms that adapt in real time to student performance, freeing teachers to focus on mentorship rather than rote instruction. The University of Austin, still young but already carving out a niche as an innovation-focused institution, is experimenting with ways to integrate ethics and humanities into technical training—reminding students that technological progress must be anchored in human values.

Texas Education Commissioner Mike Morath, who joined the Secretary during her visit, underscored the point: “I am grateful to have joined Secretary McMahon at today’s visit and appreciated the thoughtful discussion on the responsible use of innovation in schools. At the Texas Education Agency, we are always searching for transformational best practices that can be shared with Texas public schools to help them best meet the needs of students and help our dedicated educators focus on what they do best – teaching,” as reported in a press release.
What students must learn in the AI era
McMahon’s remarks touch on a critical recalibration. The workforce of tomorrow will require a blend of digital fluency and human-centered skills. Experts increasingly argue that AI literacy should sit alongside reading, writing, and mathematics as a foundational competency. Yet knowing how to “use” AI is only the starting point.

Students must also cultivate:
- Critical Thinking: The ability to question outputs generated by algorithms rather than passively accept them.
- Ethical Reasoning: An understanding of privacy, fairness, and bias in digital systems.
- Adaptability: Comfort in navigating a labor market where roles may evolve or disappear as automation deepens.
- Collaboration: Working with machines and with each other to solve problems that no single tool—or person—can resolve alone.
- Curiosity: As McMahon emphasized, genuine learning emerges not from pre-programmed answers but from asking better questions.
The larger picture

By anchoring AI adoption in “curiosity, critical thinking, and open debate,” McMahon is pushing against a reductive narrative of education as a race to master tools. Instead, she appears to be calling for a model where innovation complements, but does not eclipse, the timeless goals of schooling: fostering independent minds and cultivating democratic discourse.

As the Returning Education to the States Tour continues, the message from Austin reverberates well beyond Texas. States, districts, and schools will soon face decisions about how aggressively to integrate AI into classrooms. Those choices will determine not just the contours of learning, but the character of the next generation of workers and citizens.

In the end, the challenge is not whether schools can keep pace with technology—it is whether they can ensure that technology serves learning, not the other way around.
Education
Empowering learners with AI from classrooms to career
Image credit: Adobe Stock/Comeback Images.
As generative AI continues to reshape education, Adobe sees AI not as a replacement for thinking, but as a catalyst — for accelerating ideation, enhancing creativity, and fostering deeper engagement in learning. We’re energized by the White House’s Pledge to America’s Youth to invest in AI education as a critical step in putting essential AI and creative skills in the hands of the next generation of learners and we are proud to contribute to this critical area.
Adobe is providing the more than 50 million K-12 students and teachers across the United States with free access to Adobe Express for Education — Adobe’s all-in-one creativity app with generative AI tools designed for the classroom. Adobe is also providing professional development and training for all U.S. educators to help them better equip their students with AI skills.
The world of work is changing fast, and AI skills are no longer a nice-to-have; they’re a must-have. That’s why Adobe is committed to preparing learners of all ages for the AI-driven world we live in now.
Why AI skills matter more than ever
Early access to AI skills is essential to ensure students aren’t left behind in a rapidly evolving workforce. Embedding these skills now builds a more innovative future. The data is compelling: According to Lightcast, AI-skilled roles offer a 28 percent salary premium, with demand growing across industries — including 800 percent growth in generative AI roles in non-tech industries and a 200 percent increase in education-related roles since 2022.
At Adobe, we see firsthand how AI is already revolutionizing the creative process. Adobe Firefly is supercharging creativity and productivity with features such as Generative Fill and Generative Extend. Acrobat Studio is revolutionizing documents for the AI era, turning static files into conversational knowledge hubs, with a personalized AI Assistant for deeper insights. And Adobe Express is bringing AI-charged ideation and creation to everyone. AI can help consumers and business professionals work more efficiently and raise the bar of what they create.
Adobe is committed to empowering learners at every stage
For decades, Adobe has been dedicated to supporting creativity and digital literacy in education. Adobe’s commitments as part of the White House Pledge to America’s Youth are part of our broader mission to empower learners of all ages and in all stages of learning to ideate, create and collaborate with AI.
Through programs like Adobe Creative Campuses, we partner with universities to bring creative education and design thinking into higher-ed curricula. For specialists investing in their careers, the Adobe Certified Professionals program offers formal credentialing and certification in Adobe tools like Photoshop, Illustrator, Firefly and more. The Adobe Digital Academy is preparing learners with creative and technical skills and is focused on AI literacy, content creation, and digital marketing. The program aims to reach 30 million next-generation learners and educators by 2030.
Last year, Adobe invested $100 million to expand access through product donations, scholarships, and partnerships with schools, nonprofits, and platforms like Coursera. Learners gain hands-on experience with Adobe Express, Acrobat, and Creative Cloud, developing in-demand skills that help them stand out in an increasingly digital world.
I was honored to take part in the White House Task Force meeting on the AI Education Pledge, which underscored the urgency of equipping every learner with the knowledge and tools to thrive in this new era. And as part of the Pledge to America’s Youth, students and educators interested in participating in the Presidential AI Challenge will also be able to use Adobe’s AI tools, including Adobe Express, in their submission.
At the White House Task Force Meeting on Artificial Intelligence Education.
Demystifying AI in the classroom
The positive impact of bringing AI into classrooms is already being felt as Adobe’s AI-powered creative tools make learning more engaging and collaborative. Using Adobe’s AI tools, which are designed with safety and user control at their core, teachers are seeing deeper learning and increased motivation. For example, teachers using Adobe Express to integrate generative AI into hands-on projects found that the curriculum boosted student creativity, prompted ethical discussions about the application of AI, and sparked teamwork and engagement beyond traditional assignments.
Source: Leanlab Education x Adobe educator survey, spring 2025.
Building together
To fully realize the potential of AI in education, we’re committed to working alongside governments and education leaders to empower both teachers and students with AI skills and capabilities and help train an innovative, future-ready workforce that can thrive in the digital age.
Educators can learn more about using free Adobe tools and curriculum resources to unlock creative potential in your classroom.
We urge policymakers to invest in early AI skilling and equitable access for all students. Learn more about the commitments Adobe and others have made to the White House AI Education Pledge and join us in supporting a future where every learner is prepared for the opportunities ahead.
And we urge everyone to share stories of AI in the classroom. Let’s inspire each other and build a brighter future together.
Education
New push for AI as Education Minister Erica Stanford announces curriculum changes

Education Minister Erica Stanford
Photo: RNZ / Samuel Rillstone
The government has announced a number of new secondary school subjects and a new emphasis on artificial intelligence, which it says will help prepare young people for the jobs of the future.
Education Minister Erica Stanford said those working on the changes will investigate having a new Year 13 subject on Generative AI “for later development”.
“With the rapid development of AI, students will also be able to learn about and use generative AI in a range of subjects. This may include learning about how digital systems work, machine learning, cybersecurity, and digital ethics.”
Stanford said the new subjects, being developed for the Years 11 to 13 curriculum, reflect the growing importance of science, technology, engineering and mathematics, often referred to as STEM.
The subjects include: automotive engineering, building and construction, and infrastructure engineering.
“Students will be able to specialise in areas such as earth and space science, statistics and data science, and electronics and mechatronics. There will also be a range of new specialist maths subjects including further maths.
“When our young people leave school, we want doors to open for them whether they’re going to tertiary study, learning a trade, or heading straight into work. These refreshed subjects will provide students with choice, purposeful pathways and opportunities for specialisation that set them up for success,” Stanford said in a statement.
It was vital students had access to “innovative and dynamic subjects” that would help the country’s future, she said.
Other new subjects include: civics, politics and philosophy, Pacific studies, Te Mātai i te Ao Māori and music technology.
Te Marautanga o Aotearoa will be resourced with a first ever detailed curriculum in te reo Māori as well as new subjects including new Tātai Arorangi (Māori traditional systems of Earth and Sky), Te Ao Whakairo (Māori carving) and Te Ao Māori subjects.
The subjects are planned to be phased in from 2028.