The AI Classroom — A Teacher’s Reflection
As a proud millennial, I often find myself straddling the two worlds I grew up in: the world before, when you had AOL and dial-up and were lucky to have one desktop computer in the house that everyone shared, and the world that came after. I remember the long wait to get online and the screechy siren sound before you were told, “You’ve got mail”; this new frontier was filled with poorly constructed websites and chat rooms. I talked to my friends on Instant Messenger and wrote about my daily struggles on my LiveJournal. Then came Google and faster internet speeds. We no longer needed to tie up the phone line to get online; now we could do it with a few clicks and a router. In high school and college, we navigated MySpace, Facebook, and Tumblr. But in class, we were still turning in handwritten assignments and laboriously researching lab reports. We spent time in the library organizing our research on index cards and deciding where to place the comma versus the period as we flipped through the newest MLA or APA style guide. The internet was there, but we hadn’t been consumed by it. We had computer classes and typing tests. Plagiarism was a serious offense, the boogeyman of academia.
Little did we know that, in less than two decades, every student would have not one but multiple devices, constantly connected to the internet. Our phones are no longer tethered to the walls of our homes; they live in our pockets and palms. Elementary school students are given iPads and Chromebooks. Almost all work is submitted online through Seesaw, Canvas, Google Classroom, or ManageBac. Computer classes are no longer deemed essential because the new generations, Gen Z and Gen Alpha, were born with the entire world at their fingertips. The majority of social interactions have moved from the playground, hallways, and cafeteria to social media. They Snap, Tweet, and put it all on Instagram.
And then we were introduced to something that had previously existed only in science fiction: artificial intelligence. ChatGPT and other large language models (LLMs) seemingly arrived overnight. They could process information at lightning speed and were growing “smarter” by the day. Before we knew what was happening, AI began replacing our previous methods of learning, processing, and remembering. And it’s only continuing to expand.
When I was first introduced to ChatGPT, I was appalled. Here was another way for students to bypass the writing process. Here was another thing I would have to try to manage. While early iterations of AI were clunky and easily identifiable, the latest models demonstrate that AI is growing and adapting at an alarming rate. I have seen bots that can “humanize” writing and slip past AI detection software. Meanwhile, detection software remains unreliable and is not keeping pace with the newest models.
As teachers, we are limited in our options. We can do one of three things: 1) police AI, trying to track down misuse in every assignment students complete; 2) give in to AI and assess whatever we are given, regardless of whether the work is original; or 3) find a balance, teaching our students how to learn with AI while still appreciating the messy, sometimes strenuous process of critical thinking.
I don’t claim to be an AI expert. Like many of you, I am learning more each day about how I feel about these new tools, how comfortable I am using them in my own practice, and how I plan, above all, to maintain my philosophy that learning to think analytically is the real business of education. I didn’t sign up to teach students how to get robots to do their work, but I can teach them how the robots work and how to keep human ingenuity at the forefront of what we do.
What I offer you in this post is not a guide to determining whether students are using AI. They are, even when they are not trying to; it seems embedded in every system they come into contact with. This is more of an exploration of what I’m thinking about, what I’ve tried, and the small successes I’ve encountered on this new journey of AI in education.
Know what’s available to students.
At the beginning of this school year, a notice went out to our students that they now had access to Google Gemini Pro. Our school also has Grammarly Plus for all students. I teach a creative nonfiction writing course, and my teaching partner and I walked students through the importance of voice, demonstrating how AI strips their writing of uniqueness.
We want to see the messy parts, the comma splices, fragments, and run-ons. Some of those “mistakes” make for beautiful writing. So we taught them how to turn off these features and remove the temptation to second-guess themselves. We gave them confidence by sharing our own works in progress. And we showed them how imperfections help us craft great writing.
Some students were excited by this. Some students were scared. Many of our students are not native English speakers, and they have been using Grammarly since middle school. Many of them had no idea what their authentic voice sounded like. They had to trust us enough to turn off Grammarly's and Gemini's generative AI features to learn who they were as writers and thinkers. All this to say, we must be aware of what is available to our students. We can’t address a problem unless we know what the problem is.
Talk with students about their use of and reliance on AI.
This one seems simple, but I find that many teachers talk to students rather than with them when it comes to AI. One of our design teachers, Linus Velez, uses the analogy of a brain when speaking with students about AI use. He will often begin by asking, “Which brain is growing right now, yours or the AI’s?” I find this to be an easy way to get students to discuss how they use AI.
One time, my co-teacher and I saw that a student had been using AI in every single one of his classes, multiple times, even when trying to understand content vocabulary. His conversations with the AI were among the most advanced I have encountered. He was training AI to replicate his work in every way. At the heart of it was that the student did not believe he could learn the material as quickly as his classmates. He didn’t want to be perceived as unintelligent, and he feared that if he stopped using AI, the knowledge gap between himself and his classmates would be mortifyingly obvious. When we began working with him on lessening his reliance on AI, it was not without tears. This young man’s entire academic identity had been camouflaged by AI, and without it, he felt exposed. I share this anecdote as an example of brain growth. This student’s “AI brain” has grown immensely over the past few years. Unfortunately, he now has to be re-taught how to struggle, how to find the answers he seeks, and how to speak up and ask for help. He has to build neuroplasticity in his own brain, not the AI’s.
If we come from a place of curiosity, asking, “I’m curious how you’re using AI” or “Can you show me how you’re using AI?”, students tend to be open rather than defensive. They are not in trouble, and I always make this clear from the outset. I’ve asked students if they’d be willing to share their ChatGPT chat logs, and most have agreed. I want to work with them to build confidence in not relying on AI the way they have been: to shift from having AI do their thinking to having AI deepen their thinking.
Here is an example from a unit reflection where students used a writing coach I made for them in FLINT AI. Simple surveys like this help me keep a pulse on how reliant they are on AI tools, even when I am the one providing them.
Know what’s in your locus of control.
As I’ve already stated, AI is everywhere. Trying to completely eliminate it from student use is a losing battle. I’ve heard of colleagues who have gone “back to pencil and paper,” only to learn that students have prepped themselves using AI at home or will remotely access AI on their smartwatches.
Trying to police AI use is exhausting. My GOA teaching partner and I started out trying to combat AI misuse in our online course, but we soon realized we were fighting an overwhelming, losing battle. If I was still going to enjoy teaching, I had to shift my thinking and focus on what was in my locus of control. I can control how I monitor the process. Students now submit all documents to me in Google Docs and grant me editing access (something Google Classroom does automatically). This lets me see their entire workflow in a document; in my school, we use the document tabs feature to organize every component of their work. If students are not working digitally, they are working in a journal and can talk me through their writing process and thinking. I can also control the medium that students work with.
Humans Versus Machines
Dr. Jared Cooney Horvath, a former teacher, has spoken about the decline in students’ cognitive development following the introduction of 1-to-1 technology in schools. One statement of his stuck with me: “We are redefining education to better suit the tool. That’s not progress, that is surrender.” As I reflect on my twenty years in education, I know that surrendering wholly to AI means surrendering not only our students’ cognitive development but their affective development as well. With that in mind, my next post will address how I think we can keep the human at the center of our instruction while navigating AI in the classroom.