ChatGPT and Artificial Intelligence: How Schools Can Respond

Douglas Reeves*

May 4, 2023

 

There was a time when educators feared that handheld calculators would encourage students to cheat on math homework and erode their ability to do mental math.  Similarly dire predictions were made about spell-check functions in word processors and, later, about programs that corrected grammar and usage errors in student essays.

The latest concern over how technology might impair student learning is ChatGPT, the artificial intelligence program that is writing student essays and responding to test questions in a broad array of subjects.  In January 2023, a professor at the Wharton School of the University of Pennsylvania reported that ChatGPT's responses to a final exam in a business course were not perfect, but were good enough to pass the test.  This article offers three potential responses for educators who face the inevitable use of ChatGPT and other AI systems to complete assignments.

           

Practice in Class, Not at Home

 

We assign homework because we know that students need practice.  The problem is this: effective practice requires feedback, a response to that feedback, and application of the feedback.  That does not happen at home unless the teacher is making house calls.  Without feedback, students often turn to readily available technology that will solve their math and science problems and write their essays for social studies and language arts.  When teachers give up lecture time and have students get out of their chairs and practice during class, they witness authentic student work and can provide immediate feedback to improve student performance.  It's a real person doing the work, not a robot.

 

Require Evidence of Understanding

 

It's one thing to present a paper to the teacher.  It's quite another to defend the reasoning, claims, and evidence in that paper.  Because critical thinking is such an important skill at every grade level and in every subject, requiring students to defend their reasoning in an oral presentation, either in class or in a short video, allows teachers not only to raise the cognitive challenge of an assignment but also to distinguish papers that are the authentic work of students from those generated by AI.

 

Require the Use of AI as a First Draft, Then Require Editing and Revision

 

Too many assignments are "one and done," a process that elevates getting the assignment right the first time over the more demanding work of absorbing teacher feedback and revising the first draft to improve everything from lab reports to essays to math solutions.  Rather than fight AI and the many easily available homework-helper programs, consider allowing students to use ChatGPT and other technology assists, but require that they submit both the computer-generated work product and an account of how they revised and improved it.  This sends two essential messages to our students.  First, an essential human skill in the 21st century is improving on the work of computers, not passively accepting the premise that bots are better than people.  Second, it reinforces the critical thinking skills inherent in revision and improvement.

 

In my interviews with college professors and employers, one of the most consistent findings was that they want students and employees who can accept and apply feedback.  They rail against the "get it right the first time" mentality, in which feedback is irrelevant.  However sophisticated the technology may be, our students can and must add value to computerized solutions.

 

            Against the Luddites

 

In 1811, Ned Ludd and his followers set out to destroy textile machinery, certain that if they could halt the advance of technology, they would save their jobs.  By 1817, their rebellion had been crushed, in many cases with military force.  While those who wish that AI did not exist need not fear the fate of the Luddites, they can draw a cautionary lesson from it.  AI has left Pandora's box, and attempting to stop it in schools is as futile as expecting students to return to the dress codes of 1953.  Educators, leaders, and policy makers can instead use AI to improve, rather than diminish, students' critical thinking abilities.  We need not oppose AI.  Rather, we can use it to help our students meet the critical thinking challenges of today.

 

*Douglas Reeves is the founder of Creative Leadership Solutions and the author, most recently, of Fearless Schools.  Contact Doug at Douglas.Reeves@CreativeLeadership.net or 781.710.9633.
