
AI and the Future of Higher Education

In 2016, I created the first Coursera course for our university. It was based on my book, The Bilingual Brain. A few years later, they asked me to come and speak to a group of faculty who were thinking about creating their own courses. Before I spoke, Jeff Morgan, the Associate Provost for Education Innovation and Technology, came to introduce the session on Coursera. The first thing he said was that Coursera was not going to replace universities. The idea that someone would learn the same thing on their own did not fit with history. Students already had that available to them. They could just pick up a book. The fact that a book had not replaced universities was evidence of their value. His view was that there was something about people gathering to learn together that was irreplaceable.

Today, people have begun to ask whether AI will replace higher education. The question is posed most starkly in a recent article by James Walsh in New York Magazine entitled “Everyone Is Cheating Their Way Through College: ChatGPT has unraveled the entire academic project.” And once again, I turn to the point that Jeff Morgan made more than 10 years ago. If higher education were just about learning on your own, then books would have done the job long ago.

AI can now write articles entirely on its own. Students are using it, and faculty are using it too. I have experimented with using it, after extended Q&A sessions, to generate texts that offer opinions roughly in my voice. It is remarkable, and as a tool it can open up new ways of exploring ideas that I would not have explored otherwise. But after extensive use, I am sure that it is not a replacement for what I can do on my own. In the end, the thinking still has to come from me. And as many have pointed out, writing is a form of thinking.

It is in the loss of the writing process that people fear shortcuts will short-circuit our ability to create on our own. As educators, we have the ability to enhance thinking and writing by controlling the amount of technology used in the classroom. Personally, I have moved to in-person written exams. I have asked students to present in class. In the fall, I will ask them to write in class and then assemble their own writings into some form of an in-person handwritten final. Will they write less? Most likely. Could they have written more with the help of MS Word? Definitely. But I am okay with that. Last semester, some of my students came to talk to me, worried about their final presentations. They felt they were not good enough. And I assured them that they were. One in particular did not like a video she made. I told her that if I wanted a cleaned-up version of a video, I would just watch Netflix or YouTube. What I wanted was to understand the world from their perspective.

To paraphrase Jeff Morgan again, the point is that universities are here to bring people together to learn something new. Yes, AI might change that, but it will not replace it. It is up to us as instructors to control the classroom. That is why I think it is only a matter of time before we go back to what has worked for at least 100 years: paper and pencil.

However developed and sophisticated the world becomes, the classroom belongs to us as humans. We no longer need to bring technology into the classroom to give students access to it; they have it readily available every day. What sets higher education apart is being together in the service of learning something new. As educators, we should work to keep it that way.



Anthropic Continues The Push For AI In Education

Let’s be honest. AI has already taken a seat in the classroom. Google, Microsoft, OpenAI, and Anthropic have all been pushing hard. Today brings more announcements from Anthropic, the company behind the AI chatbot Claude, adding even more momentum. The shift isn’t subtle anymore. It’s fast, loud, and it’s happening whether schools are ready or not.

It’s not only big tech. The U.S. government is also driving efforts to integrate AI into education.

The Balance of Innovation and Safety

There’s real concern, and for good reason. Sure, the benefits are hard to ignore. AI tutoring, lighter workloads for teachers, more personalized learning paths for students. It all sounds great. But there’s a flip side. Missteps here could make existing education gaps worse. And once the damage is done, it’s tough to undo.

Many policymakers are stepping in early. They’re drafting ethical guardrails, pushing for equitable access, and starting to fund research into what responsible use of AI in education really looks like. Not as a PR move, but because the stakes are very real.

Meanwhile, the tech companies are sprinting. Google is handing out AI tools for schools at no cost, clearly aiming for reach. The strategy is simple: remove barriers and get in early. Just yesterday, Microsoft, OpenAI, and Anthropic teamed up to build a national AI academy for teachers. It’s an acknowledgment that it’s not the tools, but the people using them, that determine success. Teachers aren’t optional in this equation. They’re central.

Claude’s New Education Efforts

Claude for Education’s recent moves highlight what effective integration could look like. Its Canvas integration means students don’t need to log into another platform or juggle windows. Claude just works inside what they’re already using. That kind of invisible tech could be the kind that sticks.

Then there’s the Panopto partnership. Students can now access lecture transcripts directly in their Claude conversations. Ask a question about a concept from class and Claude can pull the relevant sections right away. No need to rewatch an entire lecture or scrub through timestamps. It’s like giving every student their own research assistant.

And they’ve gone further. Through Wiley, Claude can now pull from a massive library of peer-reviewed academic sources. That’s huge. AI tools are often criticized for producing shaky or misleading information. But with access to vetted, high-quality content, Claude’s answers become more trustworthy. In a world overflowing with misinformation, that matters more than ever.

Josh Jarrett, senior vice president of AI growth at Wiley, emphasized this: “The future of research depends on keeping high-quality, peer-reviewed content central to AI-powered discovery. This partnership sets the standard for integrating trusted scientific content with AI platforms.”

Claude for Education is building a grassroots movement on campuses, too. Its student ambassador program is growing fast, and new Claude Builder Clubs are popping up at universities around the world. Rather than coding bootcamps or formal classes, they’re open spaces where students explore what they can actually make with AI: workshops, demo nights, and group builds.

These clubs are for everyone. Not just computer science majors. Claude’s tools are accessible enough that students in any field, from philosophy to marketing, can start building. That kind of openness helps make AI feel less like elite tech and more like something anyone can use creatively.

Privacy is a big theme here, too. Claude seems to be doing things right. Conversations are private, they’re not used for model training, and any data-sharing with schools requires formal approvals. Students need to feel safe using AI tools. Without that trust, none of this works long term.

At the University of San Francisco School of Law, students are working with Claude to analyze legal arguments, map evidence, and prep for trial scenarios. This is critical training for the jobs they’ll have after graduation. In the UK, Northumbria University is also leaning in. Their focus is on equity, digital access, and preparing students for a workplace that’s already being shaped by AI.

Graham Wynn, vice-chancellor for education at Northumbria University, puts the ethical side of AI front and center: “The availability of secure and ethical AI tools is a significant consideration for our applicants, and our investment in Claude for Education will position Northumbria as a forward-thinking leader in ethical AI innovation.”

They see tools like Claude not just as educational add-ons, but as part of a broader strategy to drive social mobility and reduce digital poverty. If you’re serious about AI in education, that’s the level of thinking it takes.

Avoiding Complexity and Closing Gaps

The core truth here is simple. AI’s role in education is growing whether we plan for it or not. The technology is getting more capable. The infrastructure is being built. But what still needs to grow is a culture of responsible use. The challenge for education isn’t chasing an even smarter tool, but ensuring the tools we have serve all students equally.

That means listening to educators. It means designing for inclusion from the ground up. It means making sure AI becomes something that empowers students, not just another layer of complexity.

The next few years will shape everything. If we get this right, AI could help close long-standing gaps in education. If we don’t, we risk deepening them in ways we’ll regret later.

This is more than a tech story. It’s a human one. And the decisions being made today will echo for a long time.




AI can access your school courses

Using genAI software like ChatGPT for school makes perfect sense, considering how sophisticated the software has become. It’s not about cheating on exams or having the AI do your homework, though some people will use it that way. It’s about having an AI tutor that understands natural language and can guide you while you learn.

It’s like taking your professors home with you to explain the topics you’re still struggling with. Combined with human teachers, AI tools can make a real difference in education.

OpenAI is already working on a ChatGPT Study Together feature that will act as an AI tutor, but you don’t have to wait for that product to launch. Anthropic is already ahead, having released a Claude for Education product back in April.

The AI firm is now ready to give Claude for Education a major upgrade. Anthropic on Wednesday announced new tools for Claude that let the AI access school courses and materials more easily, along with new university partnerships that will bring Claude to even more students.

Canvas, Panopto, and Wiley support

The current Learning Mode experience in Claude for Education involves turning the AI into a teacher-like persona. Instead of providing direct answers or solutions, Claude uses Socratic questioning to help students find the answers on their own.

“How would you approach this problem?” or “What evidence supports your conclusions?” are examples of questions Claude will ask in this mode.

Using Claude for Education with Canvas. Image source: Anthropic

The July update will let users give Claude more context by connecting it to three student-friendly data sources: Canvas, Panopto, and Wiley.

Claude will use MCP servers to gather information from Panopto and Wiley. Panopto offers lecture transcripts, while Wiley provides access to peer-reviewed content that can support learning with Claude.

Canvas contains course materials. Claude will also support Canvas LTI (Learning Tools Interoperability), letting students use the AI directly within their Canvas courses.

New partnerships

Anthropic also announced two new partnerships with “forward-thinking institutions” that want to give students access to AI tools built for education. These schools are the University of San Francisco School of Law and Northumbria University.

The former is especially notable in a world where some lawyers have used AI in legal matters, only for it to fumble legal citations. Future lawyers need to learn how AI can be used effectively and where its limits are.

Dean Johanna Kalb explained how Claude will be used at the University of San Francisco School of Law to actually help students:

We’re excited to introduce students to the practical use of LLMs in litigation. One way we’re doing this is through our Evidence course, where this fall, students will gain direct experience applying LLMs to analyze claims and defenses, map evidence to elements of each cause of action, identify evidentiary gaps to inform discovery, and develop strategies for admission and exclusion of evidence at trial.

That’s certainly better than having genAI write your legal documents and risking that it hallucinates key details.

Finally, Anthropic is expanding its student ambassador program, giving more passionate students the chance to contribute to the Claude community. Claude Builder Clubs will launch on campuses around the world, offering hackathons, workshops, and demo nights for students interested in AI.




HBK trustee Harsh Kapadia shares vision for AI in education

New Delhi [India], July 9: Harsh Kapadia, Trustee of The HB Kapadia New High School, represented the institution at the prestigious Economic Times Annual Education Summit 2025 in New Delhi. The summit, themed “Fuelling the Education Economy with AI: The India Story”, brought together some of the country’s most influential voices in education, technology, and policymaking.

Sharing the stage with national leaders such as Sanjay Jain, Head of Google for Education, India; Aanchal Chopra, Regional Head, North, LinkedIn; Shishir Jaipuria, Chairman of Jaipuria Group of Schools; and Shantanu Prakash, Founder of Millennium Schools, Mr. Kapadia highlighted the critical role of Artificial Intelligence in shaping the future of Indian education.

In his remarks, Mr. Kapadia emphasised the urgent need to integrate AI into mainstream schooling. He also said that this will begin not with advanced algorithms but with teachers.

“AI does not begin with algorithms. It begins with empowered educators,” he said, calling for schools to prioritise teacher readiness alongside technological upgrades.

He elaborated on HBK’s progressive steps under its FuturEdge Program, a future-readiness initiative that integrates academics with emerging technologies and life skills.

“Artificial Intelligence will soon be as essential to education as electricity and the internet,” he said, emphasising that while AI is a powerful technological tool, its greatest impact lies in how teachers and students use it collaboratively. He noted that AI won’t replace teachers, but teachers who use AI will replace those who don’t.

His recommendations included weekly AI training periods for teachers, AI-infused school curriculum, infrastructure upgrades, and cross-industry collaborations to expose students to real-world applications of AI.

Mr. Kapadia shared that HBK has already begun incorporating AI into its school assemblies and is planning to introduce a dedicated “AI Period” in the academic calendar. The school is also conceptualising an annual “AI Fest” for students, where innovation and problem-solving will take centre stage. In terms of infrastructure, the school is actively upgrading classrooms with AI-enabled digital panels and computer labs designed for hands-on learning.

Calling for greater collaboration between schools and industry, Mr. Kapadia also proposed regular expert-led sessions with professionals from Google, LinkedIn, IBM, and AI startups.

Concluding his address, he reaffirmed HBK’s commitment to pioneering responsible and human-centred use of technology in education, saying, “AI is not a separate subject. It is a way of thinking, creating, and teaching. If we want future-ready students, we must begin with future-ready schools.”

 


