Why I’m Banning Student AI Use This Year
Chanea Bond will ban AI this year to give her high school English students the opportunity to develop foundational skills that she believes the tech hinders.
While many high school English language arts teachers plan to let their students lean on ChatGPT and other generative AI tools to help them brainstorm or to provide feedback on the grammar, vocabulary, and structure of their drafts, Chanea Bond has other plans this school year.
Recently, the Texas teacher, who is starting her 10th year in the classroom, declared on X to her nearly 60,000 followers that she will enforce a strict no-AI-use policy in her classes. Students who break the policy will receive a zero on assignments, “and there will be no exceptions,” Bond said.
The post was viewed over 100,000 times and caused a bit of a stir.
“AI can do a lot more than write for your students. Brainstorm, outline, grammar, spelling, vocabulary, tone, feedback,” wrote one educator in response. “Are you proposing none of these use cases be allowed & = ZERO score?”
“Yes,” Bond replied. “All of the things you mentioned are skills my students are supposed to develop in school.”
Bond told me that her policy isn’t about asking students to bury their heads in the proverbial sand. She’s more concerned with what her students are learning—or more often, not learning—by leaning on AI to help them formulate and write their assignments.
Bond believes that allowing students to outsource their ideas and rough-draft thinking to AI doesn’t help them and in fact devalues vital literacy skills like originality, creativity, analysis, and synthesis.
“The original ideas are the most important component in a student’s writing,” Bond told me. “You can polish everything else. But how are you going to polish an idea that you didn’t originally have, that you didn’t originally think of, and that you don’t really have any investment in?”
I spoke to Bond recently about her work as an English teacher, what prompted her to develop her AI ban, and what she makes of the discussion in education circles around her choice.
ANDREW BORYGA: How long have you been teaching high school English?
CHANEA BOND: For the last nine years, I’ve taught English to mostly freshmen and sophomores, but this year I’ll be teaching college composition, American literature, and AP Lit to upperclassmen.
BORYGA: What do you love most about your job?
BOND: I know it’s so corny to say, but it’s the kids. When I first meet students—and especially Black and Brown students—and they say things like “I haven’t read a book since fourth grade” or “I don’t like to write, I don’t like to read,” I love creating these moments where they either connect with a text or they start writing these sentences that become these really big ideas, and I can turn to them and say, “See, you’re a writer, you’re a reader.”
BORYGA: What are some specific skills you’re trying to teach your students, particularly when it comes to writing and expressing their own ideas?
BOND: One big thing is the importance of getting their whole idea down so that they can critically engage with it, before sharing it with others. Things move so fast now that we don’t even have time to think about the totality of our ideas before they’re expressed.
Clarity is key, too. Students need to be able to present their arguments concisely but also think to themselves: Are these words, in this order, what I really want to say? Are there better words I can use that more clearly articulate what I want to say?
BORYGA: That need for students to articulate their ideas brings me to your AI ban. When did you begin to understand that AI might not be helpful to your goals in the classroom?
BOND: In the fall of last year, I wrote a paper with my dual-credit American literature students utilizing AI. I was reading all these articles about how AI is inevitable, so I wanted to give it a try in the classroom, with my students.
I had them take a poem, read it critically, and annotate it, and then I gave them the option of writing their own literary analysis thesis statements or feeding their notes to AI and asking it to write a thesis statement for them. Those that chose the AI option had to use those thesis statements to write their papers—and the papers they wrote were really, really bad.
It was after that experience, and spending the rest of the year reading these AI-written short answer responses, papers, and presentations that students were turning in, that I realized there was a problem—my students don’t have the skills necessary to be able to take something they get from AI and make it into something worth reading. I also realized that they’re not using AI to enhance their work. They are using AI instead of using the skills they’re supposed to be practicing. So, I decided we’re not going to use it in the classroom.
BORYGA: Can you say more about why those papers were so bad?
BOND: When I modeled the process for them, I knew the poems well enough to be able to look at the responses AI spat out and say, “No, the poem isn’t about this at all.” But my students didn’t know the poems they were feeding into AI well enough to tell right from wrong. And when they wrote their papers, they didn’t know the poem well enough, or their literary devices well enough, to take what they got from AI and make it their own.
I envisioned that AI would provide them with the skeleton of a paper based on their own ideas. But they didn’t have the ideas—the analysis component was completely absent. And it makes sense: To analyze your ideas, they must be your ideas in the first place.
The students who didn’t use AI on that assignment—mostly because they didn’t want to spend the time figuring it out—wrote papers that also needed work, but their ideas were phenomenal. I could tell that they had critically engaged with the text. Seeing the difference between the two sets of papers was a huge moment for me.
BORYGA: You write in your new policy that for you to do your job, you must read student writing and know your students’ individual voices. How is AI a threat to the development of a young writer’s voice?
BOND: In order to be the most authentic version of yourself, you have to know what you want to say. You have to know how you want to say it.
One of my favorite assignments is to have students compare an author and their work from pre-1865 America to someone who is living. Recently, one student chose an older author and a manga author. It was a unique contrast, but when it came time for his presentation, it was clear that the words on all the slides came from AI. I asked why, and he said he was trying to sound “professional” and “academic.”
I had him get out his outline and deliver the presentation like he was talking about his favorite thing in the world. And it was the best presentation we had all year. He laughed and joked and got really excited. I looked at him and said, “That was you.” I had to explain to him that what he did—analyzing these two texts and pulling out very engaging insights for us in his own voice—was the real “academic” work.
BORYGA: Many of the educators who disagreed with your policy argued that by not allowing students to engage with AI, you’re robbing them of an opportunity to use a tool that will shape their lives and careers.
BOND: There are a lot of things we don’t teach kids how to do that they end up using in their careers. That’s not my job. My job is to help kids develop foundational skills. Using AI at this point in time is not a foundational skill. If they need it, they will learn it on the job, in a job-specific way—just like we are doing right now.
BORYGA: In the replies of your tweet, you mentioned that you plan to go as far as not allowing kids to use tools like Grammarly. Can you explain that decision?
BOND: If you’ve taught high school students, or know teachers who have, you’ll know that many students struggle with basic things like capitalization or writing in complete sentences. And why is that? I think it’s because of autocorrect. And I understand. Sometimes, I don’t even write out an idea completely, or focus on my grammar, because I know it’s going to be fixed. However, the difference is, I know when a sentence is supposed to be fixed—and I know how it’s supposed to be fixed—because I have learned those skills.
I’ve seen Grammarly tweak a sentence a student has written, and the output will be grammatically correct but different from what the student intended to say. I need them to be able to write a sentence, know what they want to say, and be able to use their brains to fix it. If they can’t do that on their own, then using a tool that does it for them—especially when they don’t have the discernment necessary to evaluate the output of that tool—isn’t useful.
BORYGA: You wrote that to root out the use of AI this school year, you’ll use an “inquiry process” you’ve developed. Can you share more about what that process entails?
BOND: I have composition notebooks where students are going to draft everything in class and reflect on their process before I allow them to use a word-processing tool like Google Docs to write their final papers. I’m going to be writing in class alongside them, too.
The idea is that all of their initial writing, their outlines, every component that goes into their final drafts, will be written down first. This allows both of us to see and evaluate their ideas first, then focus on getting those ideas into the final product, and then step back and reflect on the differences: How did you get from point A to point B?
BORYGA: What will this process look like for, say, the first big essay kids turn in?
BOND: In my composition course, the first thing kids will write is a personal narrative. We will read three personal narratives, and then the kids will start out by writing about something they know. It can be a memory, an event—I just want them to write.
They’ll highlight the parts of their writing they think are important and then think about what’s missing. We’ll discuss structure and how to put loose ideas together. We’ll also focus on individual sentences. My favorite things in the world are really beautiful, meaningful sentences, and I want them to find those in their own work. What are your anchor sentences, and how do you build on those? What is a sentence doing in relation to others, and how does that create the paragraph?
It’s going to be tedious, exhausting, and hard. But it’s also going to be awesome. Eventually, they’ll create an outline, and after that they can transfer the essay to their computers. In the meantime, I’ll have a chance to get eyes on everything they’re up to before the paper is written, so nothing is ever really surprising—unless it is, and that’s when I get some red flags.
BORYGA: What do you recommend to other teachers out there who are also skeptical of AI and want to limit students using it in their classrooms?
BOND: I think they need to go back and look at their standards. We need to think about what we’re responsible for helping our students learn. And then we need to think about how AI hinders students’ ability to do that learning.
I’ve studied the standards for my district and the state of Texas in depth, and there are very few where AI should be used—especially if students haven’t completely mastered a skill. If the standards say kids are supposed to be thinking critically, and they’re supposed to be drafting and revising, then AI shouldn’t be doing that for them. So if you’re a teacher and you’d like to do something like this and you’re getting some pushback, I say go back to the standards.
This interview has been edited for brevity, clarity, and flow.