
Artificial intelligence (AI) is all over TikTok right now, pumping out videos of glass versions of fruit being sliced open to reveal little galaxies glowing inside. It’s mesmerizing when you’re doom-scrolling at 2:00 a.m., but when the same tech starts showing up in your assignments and lecture halls, the shine wears off fast.
AI has stormed into the university classroom in ways that seemed unthinkable only a few years ago. Once, the biggest concern for professors was students copying from Wikipedia or buying essays online. Now, the challenge is facing a tool that can generate entire assignments, write polished prose, and even mimic citation styles with the click of a button. The arrival of AI tools has been met with a mix of optimism and dread. Some see them as revolutionary study aids, while others worry they signal the collapse of traditional learning. Both views carry truth, but the negatives may now outweigh the positives — especially for students who should be learning to think for themselves.
AI does, undeniably, make life easier. Students juggling multiple assignments, part-time jobs, and extracurriculars know the value of any tool that saves time. Programs like ChatGPT can draft an essay outline in seconds. Grammarly and QuillBot can clean up grammar and suggest stronger phrasing. AI citation generators can build bibliographies without the endless formatting struggles that plague undergraduates. For international students, AI can translate difficult passages, summarize readings, and provide scaffolding when English is not their first language. For students with disabilities, AI can make classes more accessible through captioning, adaptive feedback, or text simplification. In these contexts, AI can level the playing field and give students more tools to succeed.
The problem is not that AI can help with planning and accessibility. The problem is that it can also replace the act of learning altogether. Universities are not meant to be factories that produce essays, but spaces where students build habits of thought. Reading, writing, and revising are slow, sometimes frustrating processes, but they force students to grapple with ideas, wrestle with evidence, and develop an authentic voice. When an algorithm takes over those steps, the student may save time, but they lose the core of their education. The danger is a shortcut culture that prioritizes speed over substance and convenience over curiosity.
Professors are already seeing this play out in classrooms. Assignments that used to test critical thinking are being completed in minutes. Discussion posts that once showed signs of struggle now read like polished blog entries. Some professors are relying on oral exams, in-class writing sessions, or highly personalized projects in an attempt to stay ahead of AI. But this arms race between teachers and students undermines trust. Universities may create environments where professors treat every essay with suspicion, rely on flawed detection software, and leave students fearing false accusations of cheating. Academic integrity policies were already tough to enforce. AI has blurred the line between help and plagiarism to the point of collapse.
This collapse feeds into a broader problem that educators have been warning about for years: the literacy crisis. Across Canada, teachers have reported alarming declines in reading comprehension, writing skills, and sustained attention. A 2024 report from Ontario’s Education Quality and Accountability Office (EQAO) found significant drops in literacy scores among high school students. In Alberta, teachers have sounded similar alarms. These findings warn that many students may arrive at university unprepared for the demands of academic writing. Students are reading less, writing less, and practicing fewer of the skills that universities rely on as entry-level foundations. If AI becomes a crutch, these gaps will only deepen.
Literacy is not a relic of the past. Reading long texts, analyzing arguments, and writing clearly are still core competencies for almost every career. Employers do not just want graduates who can plug facts into prompts; they want graduates who can craft an email, write a report, or defend a point of view. Civic life, too, depends on these skills. A society where fewer people can evaluate a news article or write a persuasive letter is a society where democratic participation weakens. Universities are supposed to counter this trend, but AI risks accelerating it.
There is also the question of student identity. University is the time when young people experiment with ideas, styles, and voices. An essay is never just about marks — it is a chance to find out how you sound on paper, how you frame an argument, how you bring your personality into prose. With AI churning out smooth sentences at the click of a button, many students may never take that risk. They may hand in grammatically perfect papers that say nothing of themselves. When they graduate, they could leave with degrees in hand but without ever having discovered their own intellectual voice.
Even students who resist the temptation to rely on AI are facing new pressures. Detection software has started flagging certain writing characteristics as “AI-generated.” Systems can mark the use of the em dash, short sentences, or highly regular phrasing as suspicious. This means students are not only competing with AI but also altering their natural style to avoid being misidentified. The result is less creativity, less experimentation, and less trust between student and teacher. Ironically, in their efforts to prevent AI abuse, universities are inadvertently creating conditions that stifle authentic human writing.
Inequality only compounds these issues. Many of the best AI tools come with steep subscription fees. Students with money can buy premium access that offers stronger text generation, better summarization, and more advanced features. Students without that money are left behind, forced to use weaker free versions or to forgo AI entirely. This splits classrooms along economic lines, deepening divides that already exist through tuition, housing costs, and access to technology. Instead of democratizing learning, AI may reinforce privilege.
Universities themselves face a credibility test. If they allow AI to dominate, degrees risk losing value in the eyes of employers. A diploma certifies that a student can think, write, and solve problems. If employers believe that graduates have relied too heavily on AI, that trust will erode. Graduates may find themselves overlooked for jobs, their education discounted by the suspicion that their skills belong to an algorithm rather than themselves. In a tight labour market, that perception could make a major difference.
AI has undeniable uses, but those positives do not erase the larger risks. Every shortcut taken through AI comes at the cost of practice, growth, and identity. The literacy crisis is real, and AI threatens to worsen it. Detection systems deepen an already strained trust between professors and students. Inequality among students is likely to grow, as those who can afford better tools race ahead while others fall behind. Universities are graduating students who look qualified on paper but lack the very skills higher education is meant to instill.
At its best, AI can help students brainstorm, manage workloads, and access support that might otherwise be out of reach. At its worst, it hollows out the university experience, turning it into a performance of learning rather than the real thing. Students deserve more than that. The stakes are nothing less than the value of a degree — and the future of how we define education itself.