Artificial Intelligence
How Task Design Transforms AI Interactions in the Classroom
Are AI learning tools breaking barriers or building dependency?
Posted September 8, 2025 | Reviewed by Jessica Schrader
Key points
- Individual writing tasks often lead to problematic copy-paste behavior and AI over-reliance in students.
- Team debates with restricted AI access fostered critical evaluation and real-time synthesis skills.
- Public accountability and time pressure transformed students from passive consumers to active synthesizers.
Co-authored by Hannah Farrell, Xiaoyan Dong, and Michael Hogan.
As generative AI tools like ChatGPT become more embedded in education, a critical question emerges: how do we design learning activities that promote critical engagement rather than passive dependence? While recent research has documented concerning patterns of student over-reliance on AI, many universities' AI guidelines remain vague, lacking concrete strategies to support deep, reflective engagement. The key lies in examining how specific features of task design shape the way students interact with AI tools.
Contrasting Approaches to AI Integration
Luther et al.'s (2024) individual writing study, which we discussed in our previous post, exemplified exactly these concerning patterns, with many students copying AI-generated text with minimal revision, treating ChatGPT as a ghostwriter rather than a collaborator. But might it be possible to change our task design to alter these dynamics? In a recent study, Zhang, Sun, and An (2025) analysed over 1,000 minutes of recorded debate sessions and interview transcripts totaling 370,551 words to examine a fascinating GenAI use case. They observed 22 students working together iteratively over four weeks, in teams of five, in fast-paced, time-constrained classroom debates. Each team could leverage support from GenAI, but only one student had access to the "sixth team member," ChatGPT. This scenario required real-time collaboration, rapid decision-making, and strategic use of AI-generated responses under time pressure—10 minutes of preparation, five minutes of initial arguments, 25 minutes of cross-examination—which created conditions for clearly differentiated team roles to emerge. By restricting AI usage to a single device, the researchers observed how students developed unique collaborative strategies in which one person became the "AI user" mediating between the group and ChatGPT, while others emerged as information gatherers, content evaluators, and ad-hoc taskers.
The study found that this emergent team role structure and division of labor created natural accountability systems in which students did not treat the AI as an unquestionable authority. Instead, they filtered, questioned, rephrased, and even challenged ChatGPT’s outputs. The AI user role became particularly critical: these students integrated multiple arguments into their statements and presented the information in their own words, transforming from mere conduits into active synthesizers. As one participant remarked, “We didn’t treat AI as an expert—we critiqued its responses and used them to spark further discussion.” The study findings indicate that students engaged in higher-order thinking through processes of information synthesis, evaluation, and collaborative negotiation. Rather than relying solely on ChatGPT, the design constraints fostered critical engagement in which students collaborated and drew on their own knowledge to strengthen their arguments.
Is Debate the Difference? How Task Design Shapes Engagement
Time pressure alone didn’t drive critical engagement; it was the specific structure of the debate task that prompted deeper interaction with AI. In Zhang et al.’s study, only one team member had access to ChatGPT, which required students to verbalise their needs and actively collaborate. This setup not only deepened engagement with the content but also led to teams adopting defined roles—AI user, information gatherer, content evaluator, and ad-hoc tasker—that ultimately distributed cognitive responsibility and enhanced group performance. The public nature of debating, with its emphasis on rebuttal and persuasion, further introduced social accountability. Students had to critically interpret AI responses, adapt them in real time, and reach team consensus. These conditions fostered intersubjective understanding and reflection, encouraging students to think more deeply about both the debate topic and the AI’s reliability.
However, critical engagement wasn't guaranteed. Some students admitted to cognitive dependency and a failure to engage in higher-order thinking, with one noting, “AI gives so many answers at once that our team skips the thinking phase.” These admissions indicate that task design alone cannot prevent disengaged behaviour if students are not supported in how to use AI. To counteract this, students should be taught simple prompting and interaction strategies that encourage deeper thinking. For example, simply adding a phrase like "explain your reasoning" at the end of a prompt can nudge the AI toward more reflective responses and model the kind of critical engagement educators aim to cultivate. Modeling and monitoring information synthesis, evaluation, and collaborative negotiation strategies may also help students sustain these behaviours in their interactions with AI.
Designing for Better Human–AI Teamwork
What determines whether AI promotes deep learning or passive consumption is how we ask students to engage with it. Task designs that prioritise interaction, collaboration, and reflection are fundamental in promoting purposeful use of AI tools. These insights point to the need for redesigning educational approaches to ensure that students remain active participants in their own learning. Zhang et al. (2025) offer a hopeful vision of how well-structured collaboration can amplify student thinking. In contrast, Luther et al. (2024) highlight the risks of poorly scaffolded AI use, where tools like ChatGPT become ghostwriters rather than learning partners. The challenge for educators and policymakers is not whether to allow AI integration into the classroom, but how to design tasks that ensure AI enhances—not replaces—human thought and creativity.
References
Connect with the authors: Hannah Farrell, Xiaoyan Dong, Michael Hogan.