
Education

AI is now allowed in IITs and IIMs, has the ethics debate reached its end?

In IITs, IIMs, and universities across the country, the use of AI sits in a grey zone. Earlier this year, IIM Kozhikode Director Prof Debashis Chatterjee said that there was no harm in using ChatGPT to write research papers. What started as a whisper has now become a larger question: not whether AI can be used, but how it should be.

Students and professors alike are now open to using it. Many already do, but without clear guidelines. The real issue now isn’t intent, but the absence of defined boundaries.

Across India’s top institutions, including IITs, IIMs, and others, the debate is no longer theoretical. It’s practical, real, and urgent. From IIT Delhi to IIM Sambalpur, from classrooms to coding labs, students and faculty are confronting the same reality: AI is not just here. It’s working. And it’s working fast.

“There’s no denying AI is here to stay, and the real question is not if it should be used, but how. Students are already using it to support their learning, so it’s vital they understand both its strengths and its limits, including ethical concerns and the cognitive cost of over-reliance,” said Prof Srikanth Sugavanam of IIT Mandi, responding to a question from India Today Digital.

“Institutions shouldn’t restrict AI use, but they must set clear guardrails so that both teachers and students can navigate it responsibly,” he added.

INITIATIVE BY IIT DELHI

In a measured but firm step, IIT Delhi has issued guidelines for the ethical use of AI by students and faculty. The institute conducted an internal survey before framing them. What it found was striking.

Over 80 percent of students admitted to using tools like ChatGPT, GitHub Copilot, Perplexity AI, Claude, and other chatbots.

Meanwhile, more than half the faculty members said they too were using AI: some for drafting, some for coding, some for academic prep.

The new rules are not about banning AI. They are about drawing a line that says: use it, but don’t outsource your thinking.

ON CAMPUS, A SHIFT IS UNDERWAY

At IIM Jammu, students say the policy is strict: no more than 10 percent AI use is allowed in any assignment.

One student put it simply: “We’re juggling lectures, committees, and eight assignments in three months. Every day feels like a new ball added to the juggling act. In that heat, AI feels like a bit of rain.”

They’re not exaggerating. There are tools now that can read PDFs aloud, prepare slide decks, even draft ideas. The moment you’re stuck, you can ‘chat’ your way out. The tools are easy, accessible, and, for many, essential.

But here’s the other side: some students now build their entire workflow around AI. They use AI to write, AI to humanise, AI to bypass AI detectors.

“We use plagiarism detection tools, like Turnitin, which claim to detect Gen-AI content. However, with Gen-AI evolving so fast, it is difficult for these tools to keep up with its pace. We don’t have a detailed policy framework to clearly distinguish between the ethical and lazy use of Gen-AI,” said Prof Dr Indu Joshi, IIT Mandi.

NOT WHAT AI DOES, BUT WHAT IT REPLACES

At IIM Sambalpur, the administration isn’t trying to hold back AI. They’re embracing it. The institute divides AI use into three pillars:

  • Cognitive automation – for tasks like writing and coding
  • Cognitive insight – for performance assessment
  • Cognitive engagement – for interaction and feedback

Students are encouraged to use AI tools, but with one condition: transparency. They must declare their sources. If AI is used, it must be cited. Unacknowledged use is academic fraud.

“At IIM Sambalpur, we do not prohibit AI tools for research, writing, or coding. We encourage students to use technology as much as possible to enhance their performance. AI is intended to help enhance, not shortcut,” IIM Sambalpur Director Professor Mahadeo Jaiswal told India Today.

But even as tools evolve, a deeper issue is emerging: Are students losing the ability to think for themselves?

MIT’s recent research suggests yes: too much dependence on AI weakens critical thinking.

It slows down the brain’s ability to analyse, compare, question, and argue. And these are the very skills institutions are supposed to build.

“AI has levelled the field. Earlier, students in small towns didn’t have mentors or exposure. Now, they can train for interviews, get feedback, build skills, all online. But it depends how you use it,” said Samarth Bhardwaj, an IIM Jammu student.

TEACHERS ARE UNDER PRESSURE TOO

Faculty are not immune either. AI now acts as a mentor, performing tasks that even teachers cannot. With AI around, teaching methods must change.

The old model of assign, submit, grade no longer works. Now, there’s a shift toward ‘guide on the side’ teaching.

Less lecture, more interaction. Instead of essays, group discussions. Instead of theory, hackathons.

It is all about creating real-world learning environments where students must think, talk, solve, and explain why they did what they did. AI can assist, but not answer for them.

SO, WHERE IS THE LINE?

There’s no clear national rule yet. But the broad consensus across IITs and IIMs is this:

  • AI should help, not replace.

  • Declare what you used.

  • Learn, don’t just complete.

Experts like John J Kennedy, former dean at Christ University, say India needs a forward-looking framework.

Not one that fears AI, but one that defines boundaries, teaches ethics, and rewards original thinking.

Today’s students know they can’t ignore AI. Not in tier-1 cities. Not in tier-2 towns either.

Institutions will keep debating policies. Tools will keep evolving. But for students and teachers, the real test will be one of discipline, not access; of intent, not ability.

Because AI can do a lot. But it cannot ask the questions that matter.


Published By: Rishab Chauhan

Published On: Jul 9, 2025




FIORENTINO: STATE SYSTEM WILL TACKLE AI EDUCATION

In his new blog, State System of Higher Education Chancellor Christopher Fiorentino highlights an agreement signed with Google last week “to help students develop the AI competencies they will need in their future careers.”  Writing that the State System can’t fool itself into thinking artificial intelligence is a “trend” or a “passing fancy,” Fiorentino pledges to enter the future with “eyes wide open.”  He says perhaps the best contribution the state-owned universities can make is to ensure graduates know “what AI tools they should be prepared to use” as they start their careers.

The agreement to expand the State System’s partnership with Google includes IUP, Cheyney, East Stroudsburg, PennWest, and Millersville.

THE CHANCELLOR’S BLOG:

https://chancellorfiorentino.blogspot.com/2025/09/ai-is-not-fad.html







AI in healthcare education: The future of learning explained

Artificial Intelligence (AI) is no longer just a tool for diagnostics; it is transforming the way healthcare professionals learn, train, and prepare for real-world practice. From medical students to practicing clinicians, AI-powered platforms are redefining education with immersive simulations, adaptive learning, and real-time feedback. Industry leaders weigh in on how AI is shaping the future of medical training.

AI AS A PERSONALISED LEARNING COMPANION

According to Ankit Modi, Founding Member & Chief Product Officer at Qure.ai: “Artificial Intelligence is redefining healthcare education by enabling immersive, data-driven, and personalised learning experiences. AI-powered tools like virtual simulations, adaptive learning platforms, and predictive analytics are bridging the gap between theoretical knowledge and real-world clinical skills.”

He highlights that AI allows students to practice procedures in safe, simulated environments while improving decision-making and knowledge retention.

MAKING CLINICAL TRAINING IMMERSIVE AND ACCESSIBLE

Mr. Tejasvi Rao Veerapalli, CEO of Apollo Hospitals, Hyderabad, stresses the role of AI in scaling high-quality training:

“AI is revolutionising healthcare education by making clinical training more immersive, personalised, and efficient. Through intelligent simulation tools, real-time feedback, and adaptive learning platforms, students and professionals can now build skills in risk-free environments that mirror real-world scenarios.”

He adds that AI ensures deeper knowledge retention and bridges gaps in training access and expertise.

AI AS A MENTOR FOR FUTURE CLINICIANS

For Jeevan Kasara, Director & CEO of Steris Healthcare, AI is evolving beyond diagnostics to act as a learning mentor:

“Artificial Intelligence is revolutionising healthcare education, evolving from a diagnostic aid to a personalised mentor for future clinicians. Virtual patients strengthen diagnostic reasoning, adaptive assessment tailors learning to individual needs, and natural language tools make information instantly accessible.”

He emphasises that this transformation equips professionals with sharper skills, long-term retention, and empathetic patient care.

MOVING BEYOND TEXTBOOKS TO DYNAMIC LEARNING

Rustom Lawyer, Co-Founder and CEO of Augnito, sees AI as a bridge between static theory and interactive practice:

“Artificial Intelligence is ushering in a new era for healthcare education, firmly bridging the gap between theory and practice. By offering realistic simulations, personalised learning experiences, and real-time feedback, these technologies empower students and professionals to develop skills with greater confidence and accuracy.”

He notes that this shift accelerates training while making knowledge retention stronger and more practical.

THE FUTURE OF MEDICAL EDUCATION WITH AI

Across perspectives, one theme stands out: AI is not just a technological upgrade, but a pedagogical revolution. It makes education:

  • More immersive through simulations.

  • More personalised with adaptive learning.

  • More impactful with real-time feedback and recall.

By tailoring education to individual needs while keeping pace with advancing medicine, AI ensures the next generation of healthcare professionals is better equipped, more confident, and more empathetic from day one.


Published By: Chaitanya Dhawan

Published On: Sep 14, 2025




How to use ChatGPT at university without cheating: ‘Now it’s more like a study partner’ | University guide

For many students, ChatGPT has become as standard a tool as a notebook or a calculator.

Whether it’s tidying up grammar, organising revision notes, or generating flashcards, AI is fast becoming a go-to companion in university life. But as campuses scramble to keep pace with the technology, a line is being quietly drawn. Using it to understand? Fine. Using it to write your assignments? Not allowed.

According to a recent report from the Higher Education Policy Institute, almost 92% of students are now using generative AI in some form, a jump from 66% the previous year.

“Honestly, everyone is using it,” says Magan Chin, a master’s student in technology policy at Cambridge, who shares her favourite AI study hacks on TikTok, where tips range from chat-based study sessions to clever note-sifting prompts.

“It’s evolved. At first, people saw ChatGPT as cheating and [thought] that it was damaging our critical thinking skills. But now, it’s more like a study partner and a conversational tool to help us improve.”

It has even picked up a nickname: “People just call it ‘Chat’,” she says.

Used wisely, it can be a powerful self-study tool. Chin recommends giving it class notes and asking it to generate practice exam questions.

“You can have a verbal conversation like you would with a professor and you can interact with it,” she points out, adding that it can also make diagrams and summarise difficult topics.

Jayna Devani, the international education lead at ChatGPT’s US-based developer, OpenAI, recommends this kind of interaction. “You can upload course slides and ask for multiple-choice questions,” she says. “It helps you break down complex tasks into key steps and clarify concepts.”
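For readers who want to script the workflow Devani describes, a minimal sketch is possible with the OpenAI Python SDK. This assumes the `openai` package is installed and an `OPENAI_API_KEY` is set in the environment; the model name and prompt wording below are illustrative choices, not a setup prescribed by OpenAI.

```python
# Sketch: turn class notes into practice multiple-choice questions via an
# LLM API. Assumes the `openai` Python package and an OPENAI_API_KEY
# environment variable; the model name is illustrative.

def build_prompt(notes: str, num_questions: int = 5) -> str:
    """Compose an instruction asking for practice MCQs from class notes."""
    return (
        f"From the lecture notes below, write {num_questions} "
        "multiple-choice practice questions, each with four options "
        "and the correct answer marked.\n\n"
        f"Notes:\n{notes}"
    )

def generate_questions(notes: str) -> str:
    """Send the prompt to the model and return its reply text."""
    from openai import OpenAI  # local import keeps the prompt helper dependency-free
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": build_prompt(notes)}],
    )
    return response.choices[0].message.content
```

Calling `generate_questions` needs network access and a valid key; the prompt-building step works offline, which is where most of the study value (framing a good question) lives anyway.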

Still, there is a risk of overreliance. Chin and her peers practise what they call the “pushback method”.

“When ChatGPT gives you an answer, think about what someone else might say in response,” she says. “Use it as an alternative perspective, but remember it’s just one voice among many.” She recommends asking how others might approach this differently.

That kind of positive use is often welcomed by universities. But academic communities are grappling with the issue of AI misuse and many lecturers have expressed grave concerns about the impact on the university experience.

Graham Wynn, pro-vice-chancellor for education at Northumbria University, says using it to support and structure assessments is permitted, but students should not rely on the knowledge and content of AI. “Students can quickly find themselves running into trouble with hallucinations, made-up references and fictitious content.”

Northumbria, like many universities, has AI detectors in place and can flag submissions showing potential overreliance. At University of the Arts London (UAL), students are required to keep a log of their AI use to situate it in their individual creative process.

As with most emerging technologies, things are moving quickly. The AI tools students are using today are already common in the workplaces they will be entering tomorrow. But university is not just about the result; it is about the process. And the message from educators is clear: let AI assist your learning, not replace it.

“AI literacy is a core skill for students,” says a UAL spokesperson, before adding: “Approach it with both curiosity and awareness.”


