Artificial Intelligence
What Is Lost When Students Turn to AI
Personal Perspective: Students—and others—who rely on AI are selling themselves short.
Posted May 11, 2025 | Reviewed by Devon Frye
Key points
- Recent reports suggest that ChatGPT and similar AI tools are generating less accurate content.
- Creators can't explain why or when AI will hallucinate.
- Over-reliance on this new technology is undermining students' skills in higher education.
- There are benefits to writing beyond just the product that students produce.
In many cases, users have embraced technological innovations with a use-now-pay-later approach. In other words, innovations haven't been met with the skepticism they perhaps deserve. AI is a prime example.
On top of other urgent concerns for faculty in higher education—already a high-stress profession right now, given the political and cultural climate—many of us also find ourselves contending with what to do when our students' written work has been outsourced to AI. For instance, last year, an article in Wired magazine noted that Turnitin, a plagiarism-detection company, had found that students submitted more than 22 million papers containing AI-generated content.
Yet turning to AI isn't reserved for students in higher education. The Wired piece noted, for instance, that "traces of chatbots have been found in peer-reviewed, published academic writing." The numbers may be even higher than what's been reported, on both the student and researcher sides.
Setting aside the ethical and intellectual considerations of relying on these tools to write papers (more on that shortly), an equally compelling concern is that AI tools are being readily embraced by the masses even though the content they produce is oftentimes wildly inaccurate. In fact, just this week, The New York Times reported that newer AI systems, rather than becoming more accurate as they become more widespread and presumably more refined, are actually making information up at rates as high as 79 percent.
"Hallucination" is the term AI researchers commonly use for this phenomenon. The Wired article states that "generative A.I. has been known to hallucinate, creating its own facts and citing academic references that don't actually exist."
Just as distressing as these numbers is the fact that the creators of these tools can't actually explain why they are producing hallucinations at higher rates. It appears that many students are relying on these tools to complete work they are meant to be doing themselves in their courses, without considering that what they submit, in addition to not reflecting original work, is lacking in accuracy.
Of course, as alluded to earlier, lack of accuracy is only one concern when confronting students’ overreliance on these tools. The clear ethical concern is the extent to which using AI-generated content constitutes a form of cheating.
It is also concerning that so many undergraduate (and likely graduate) students are willing to turn to these tools, which further suggests they are outcome-oriented rather than process-oriented. There is intrinsic value in working through the process of cultivating one's writing skills and learning how to express oneself on paper in an academic environment. That skill is clearly not being valued when students outsource the process to AI.
I see this as part of a larger trend in which technology is replacing many human skills, fundamentally altering what it means to be human and how we engage with the world. In my experience, administrators often encourage faculty to "embrace" these tools, touting how they may make our jobs easier by assisting with tasks such as grading. However, such promotion fails to recognize the extent to which these tools undermine the very purpose of being in an academic environment.
It is often through the process of writing that students develop their critical-thinking skills. They are confronted with the challenge of organizing their thoughts, figuring out what they want to express, and presenting their ideas to the reader in a cogent, compelling, and engaging way. Writing is a process that continually requires editing, rethinking, and refinement.
In other words, as with many processes, it requires dedication and patience. When students outsource a written assignment to AI, they aren't just taking a shortcut; they are selling their own abilities short, denying themselves the chance to figure out how they think and feel about a given topic and the opportunity to think more deeply about whatever they were tasked to write about.
Psychology has long recognized the potential healing power of expressive writing. Given that interpreting one's experiences is one of the ways writing can be therapeutic, it stands to reason that such interpretation is beneficial even in more traditional academic writing. Thus, the purpose of assigning a paper in an academic environment is not just the final product the student produces, but also the process that enables the student to put that submission together.
From my perspective on the other side of this process, while it can be extremely frustrating to grade papers written by students who are still not fully skilled in academic writing, it is also an opportunity to guide our students, offer them the feedback they need, and learn more about their inner lives. In fact, some of the most transformative experiences I have had in dialogue with my students have been over the course of grading a written work of theirs.
Perhaps relying on AI would make me a "faster" grader, but I trust the feedback and editing I can offer far more than that of a bot. I also imagine that students would feel the same unease on discovering their work had been "graded" solely by AI rather than by their professor that we faculty feel when uncovering AI-generated content in their submissions.
These are the less obvious casualties of the move-fast-and-break-things ethos that has permeated Big Tech from its inception and has now penetrated virtually every trusted institution in the larger culture. And on top of that, the AI-generated content students are submitting likely doesn't even accurately reflect the topic at hand.
There has to be a metaphor in there somewhere about the precarious state of our increasingly technology-dependent culture.
Copyright Azadeh Aalai 2025
References
Hoover, A. (2024, April 9). Students are likely writing millions of papers with AI. Wired. Retrieved May 10, 2025, from https://www.wired.com/story/student-papers-generative-ai-turnitin/
Metz, C., & Weise, K. (2025, May 9). A.I. is more powerful, but its hallucinations are getting worse. The New York Times: Business. [Print]
Murray, B. (2002, June). Writing to heal. Monitor on Psychology, 33(6), 54. Retrieved May 10, 2025, from https://www.apa.org/monitor/jun02/writing