Technology Integration

How Forward-Thinking Schools Are Shaping the Future of AI in Education

Districts across the country are creating new leadership roles, identifying best practices, and writing flexible policies to prepare students and teachers for an AI-driven world.

November 7, 2025


Early in his tenure as chief information officer for Wichita Public Schools, Robert Dickson reviewed the district’s technology every five years. The pandemic cut that cycle to three. Then, in late 2022, a little-known company called OpenAI unleashed a flood of almost magical tools that could write papers, draft lesson plans, and solve math problems in seconds. Overnight, students had access to them, and the company committed to regular improvements with the stated goal of sharpening the tools until they became “superintelligent” and reshaped the knowledge economy as we know it. 

Dickson’s technology strategy needed to change again—and fast. 

“I don’t feel like I can even plan where things are going a year out,” Dickson said. “I have to constantly be active and ready to pivot.” 

Since ChatGPT’s public debut, school leaders across the country have scrambled to figure out how to handle AI in schools. What began with questions like “should we even allow this?” has evolved as the tools have improved and gained traction with both educators and students. 

Today, ignoring AI feels as impractical as banning computers or calculators.

“AI isn’t something you can block anymore—it’s reality,” said Dyane Smokorowski, coordinator of digital literacy at Wichita Public Schools. She noted that every major search engine students use now comes with built-in large language model features. “It can write papers, answer questions—everything ChatGPT can do—and you can’t really stop it.”

In the face of a revolutionary change that many educators believe will alter the career trajectories of school-aged kids and the instructional methods of their schools, a handful of districts across the country, from Gwinnett County Public Schools in Georgia to Canyons School District in Utah, have responded with bold measures to meet the moment. 

In Gwinnett, a new high school’s computer science pathway is teaching students how to build AI tools to tackle real-world problems, and in Utah and Wichita, new executive-level roles have been created to test, refine, and disseminate AI best practices across a wide range of disciplines, from science to ELA to history. Meanwhile, administrators in Missouri are deploying their own chatbots to help teachers write everything from lesson plans to IEPs, with the goal of rolling out discipline- and grade-level-specific chatbots for student use next. 

“We have to think about what the world will look like in the future for today’s kindergartners,” said Dr. Kevin Carl, the superintendent of the Hancock Place School District in Missouri. “For us to not be responsive at the same speed at which the technology is evolving is really a disservice to our students.” 

PROACTIVE EARLY EFFORTS 

The districts leaning into AI tend to share one trait: Where others see risk in AI tools, they see possibilities. For at least one district, that vision arrived far earlier than most. 

Lisa Watkins, executive director of instructional technology and innovation at Gwinnett County Public Schools in Georgia, said her district’s AI preparation dates back to 2017, when her superintendent read a prescient McKinsey Global Institute report about the future of work in a world powered by advanced machine learning, AI assistants, and other technologies capable of performing increasingly difficult cognitive tasks. Watkins said the report made one thing clear: “AI is going to be a thing, and we’re going to need to lean into it.”  

In 2019, years before ChatGPT was a household name, the district developed a cluster of schools feeding from kindergarten to high school with the express goal of integrating “both discrete and embedded AI learning experiences” into their curriculum.

As part of the sequenced program, which runs alongside traditional coursework, first graders start with the basics: describing the steps required to assemble a peanut butter and jelly sandwich, an early attempt to mimic the sequential, logical precision required in coding. From there, students build toward more sophisticated problem-solving skills, such as programming Lego robots in third grade, Logan Malm, director of elementary science, told the Atlanta Journal-Constitution. By middle school, students are taking part in drone and robotics clubs and using Python to program rudimentary charts that use people’s genetic codes to determine the possible genotypes and phenotypes of future children. 
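The genetics exercise maps naturally onto a classic Punnett square. As a rough illustration of what such a middle school Python program might look like (the function names, the single-gene cross, and the use of a dominant “A” allele here are illustrative assumptions, not the district’s actual assignment):

```python
from itertools import product

def punnett_square(parent1, parent2):
    """Cross two single-gene genotypes (e.g., 'Aa' x 'Aa') by pairing
    one allele from each parent; returns every possible child genotype."""
    # sorted() puts the uppercase (dominant) allele first, e.g. 'Aa' not 'aA'
    return [''.join(sorted(pair)) for pair in product(parent1, parent2)]

def phenotype_counts(genotypes, dominant='A'):
    """Tally how many children show the dominant vs. recessive phenotype."""
    dom = sum(1 for g in genotypes if dominant in g)
    return {'dominant': dom, 'recessive': len(genotypes) - dom}

# Classic monohybrid cross of two heterozygous parents
cross = punnett_square('Aa', 'Aa')
print(cross)                   # ['AA', 'Aa', 'Aa', 'aa']
print(phenotype_counts(cross)) # {'dominant': 3, 'recessive': 1}
```

The exercise rewards exactly the sequential thinking the earlier grades practice: enumerate the allele pairings, normalize each genotype, then summarize the outcomes.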

In turn, those cluster schools feed into Seckinger High, a 2,000-student school that opened in 2022 and offers an advanced, three-course AI pathway that Watkins said provides “rigorous technical learning” for students interested in careers developing AI tools. The sequence begins with an introduction to the foundations of AI, which delves into the basics of computer programming, data science, and mathematical reasoning, followed by two courses that build on those foundations and push students to design and apply AI solutions to real-world problems using machine learning and professional software development tools. 

These schools are not operating in isolation. Watkins said her district is using them to guide how it approaches embedding AI literacy and proficiency into the curriculum of the nearly 200,000 students who make up the district. A framework spells out the skills district leaders believe future graduates will need to be successful, such as modeling and visualizing data, computational thinking, design thinking, and logical reasoning, and is used to help district teachers integrate AI concepts into their instruction in ways that align with their subject matter. Meanwhile, the district’s evolving AI policy makes it clear that this integration is expected, noting that students and teachers “must be future-ready by understanding AI and demonstrating responsible and ethical use of AI,” and clearly stating appropriate student use cases, such as using AI to support research, foreign language development, or brainstorming.

Together the policies and framework, Watkins said, are meant to push the district to produce graduates who, at a base level, have the ability to understand and use AI technology proficiently and ethically. They’re also designed to offer a smaller group of students the opportunity to go deeper and learn how to develop AI tools and apply them to real-world problems. 

GIVING AI A SEAT AT THE TABLE

While most districts aren’t running as far ahead as Gwinnett County, many are making AI a priority by creating entirely new positions, often with executive authority, to help them analyze emerging research, craft new policies, and helm their district-wide approach.  

In Utah’s Canyons School District, “We realized pretty quickly that we needed someone to be able to steer and guide what AI integration would look like in our district, and to help create the documentation and systems to support that work, too,” said Emma Moss, the district’s AI lead. Prior to 2023, Moss was a digital teaching and learning specialist, but due to her acute understanding of emerging AI tools thanks to a background in neuroscience and designing large computer networks, she was elevated to lead the 33,000-student district’s plan to integrate AI into its classrooms. 

Moss said her skill set gave her the foresight to understand early on that generative AI would be far more pervasive than just one app, and would become a “general purpose technology that would be integrated into many things and would eventually become unavoidable.” While many neighboring districts in Utah deployed rigid policies banning nascent AI tools on their servers, Moss steered Canyons to create a policy that, like Gwinnett County’s, viewed AI as an “innovative and implementable educational tool” that could prepare students for future careers and spelled out guiding principles for how students and teachers could partner with AI to brainstorm approaches to learning tasks, create and scaffold lesson plans and assessments, and “clarify understanding by asking for explanations or examples.” 

The policy also addresses ethical concerns related to cheating and plagiarism, and instructs students to ensure they’re using AI properly by asking themselves questions like: “Is how I am using AI helping support my learning or is it learning for me?” and “Does what I just learned help me create, collaborate, communicate, or be more creative? Or does it simply give me the answer?” 

AI leaders recently elevated to similar roles in other districts have taken comparable approaches. At Wichita Public Schools, AI specialist Katelyn Schoenhofer, a former technology coach, spends most of her time tracking the fast-moving field—new tools, model updates, research on student learning, and insights from teachers in the classroom—all of which inform the district’s AI policy. According to Schoenhofer, positions like hers, responsible for merging technical expertise and research with classroom needs and training, are necessary in this new AI-powered reality: “Without that person, or that team, the second order of change doesn’t happen. Things stay surface level.” 

AVOIDING THE INFLEXIBILITY TRAP

Eric Hudson, a former teacher and AI consultant who helps district leaders formulate smart AI policies, says the first idea he imparts to leaders is that their plans must grapple with how fundamentally AI tools differ from the technologies they’ve integrated in the past. “AI is not like a smart board. You can’t decide if you want to buy it or not. It’s already there.” 

His most valuable advice, he said, is to “shift away from super detailed, draconian policies and move more towards using a set of ethical guidelines to educate students and adults in effective use and decision-making.” It’s equally important, he added, that district leaders make peace with the fact that whatever approach seems reasonable today will keep changing, and constant iteration should be expected and pursued through feedback from all corners of a school community. “Given how fast AI evolves, I don’t know how you could even articulate a policy that captures every potential use case.”

Dickson, the CIO of Wichita Public Schools, said that a lot of the adjustments he’s made to his policies and plans come from insights that emerge in a monthly meeting with other district leaders, teachers, instructional coaches, IT staff, and parents. Together they discuss AI policies and use cases, which helps him ensure his approach to AI is both effective and appropriately paced. “I’m very tech forward,” Dickson admitted. “It’s great to have parents on this committee say to us, ‘Well, I don’t know if I’m ready for my kid to do this type of thing, or to be exposed to this.’ That friction helps us navigate in a more intentional way, and not get so far ahead of ourselves.” 

Feedback Dickson has received has helped sharpen the district’s approach to new AI experiments, such as using district-created chatbots to ease teacher workloads. At one environmental magnet school, for example, Schoenhofer built a custom lesson-planning assistant trained on both the school’s grade-level standards and its environmentally focused curriculum. Teachers run lessons through the bot to ensure they’re aligned with the school’s aims. Another chatbot assists special education instructors with the time-consuming drafting of IEPs, dramatically slashing administrative time. The district plans to learn from both trials, with the long-term goal of designing chatbots for different grade levels and subjects that students will use in the classroom, too.

In Utah, Moss said a lot of the best insights she learns about day-to-day use of AI in the classroom come from teachers. Her staff spends hours sifting through the usage logs of AI tools to identify teachers who are generating hundreds of prompts and asking them: “What are you doing? What’s working?” 

Her team visits classrooms, collects the most interesting use cases, and distributes them as “promising practices” for others to try, organized by subject, such as “AI in English,” or “AI in Music.” Examples they’ve highlighted in the past include using AI tools in a kindergarten class to create a fictional rap battle between Henry Ford and the Wright Brothers that not only gives students access to background knowledge about the history of world-changing innovations, but introduces students at an early age to AI’s fallibility. 

Highlighting these practices can help “lower the barrier of entry” for educators to try using AI tools in their own classroom, Moss said. “Teachers can look at an example and think, ‘Oh, I can do that.’” The strategy seems to be working: By the end of last year, Moss said that nearly 80 percent of teachers in her district were regularly using AI tools.

A COMMITMENT TO FEWER TOOLS 

With new AI tools debuting almost daily, school districts are exploring narrowing their scope and focusing on only a handful to vet, use, and master, rather than wasting valuable time and resources chasing new products. 

At Hancock Place School District in Missouri, for example, leaders spent a year visiting education conferences and listening to experts to survey the landscape of new tools, before committing to licenses for just three tools they believed met their criteria for being safe, controlled, and committed to regular refinements: Brisk, Snorkl, and School AI. They created high-quality opportunities for their teachers to learn how to use the tools, such as inviting influential author and edtech specialist Holly Clark to lead a hands-on, required staff training over the summer. 

Teachers were paid for their time and had to leave with a specific use case to try in the coming school year. “It couldn’t just be, ‘Oh, I’ll ask AI to help with an assignment,’” said Dirksen. “It had to elevate the learning experience for students and tie back to curriculum goals.” Teachers workshopped their ideas not only with Clark and school leaders but with representatives from the licensed tools, who helped refine applications in real time.

The goal, Carl explained, is to build momentum until AI use becomes routine. He estimates that within a year, 80 percent of his teachers will be using AI. “Because we’ve let them experiment and not put constraints on it, we’re letting them be creative. And when I as a teacher can be creative—that’s the game changer.”

Gwinnett County has taken a similar approach to tools, narrowing its focus to three: Magic School AI, Microsoft Copilot, and Google’s Gemini. Limiting the tools the district uses allows them to provide higher-quality professional development that delves deeper into practical application, Watkins said. Each school in the district has an instructional technology coach trained on approved tools to assist teachers experimenting with real use cases tied to their learning objectives. “The idea is for teachers to start with their goals and work backwards to see where AI might help,” Watkins explained. “At the end of the day, we’re still very tied to standards.”

KEEPING HUMANS AT THE CENTER

Even as many districts lean into AI, leaders say they’re careful not to lose sight of the fundamentals that make a school system successful: creating the conditions for excellent human teachers to thrive, and ensuring students feel challenged as they build timeless skills like problem solving, critical thinking, and creativity. 

Dickson stressed that for him, knowing where the line is often involves keeping in mind that struggle is essential for human learning, and that any tool that robs students of that experience isn’t worthwhile. “If [learning] is not a little miserable, it’s probably not memorable.” 

While many teachers remain resistant to the idea of deploying AI in the classroom, framing the tools as a supplement, keeping their use optional, and providing training, guidance, and peer examples to learn from can often shift mindsets, Hudson said. “Suddenly they’re playing around with it and realize, ‘Oh, this is actually pretty helpful. It’ll save me six hours a week, it’ll make me think differently about my lesson plan.’”

Creating proficient, perhaps even advanced users of a revolutionary technological tool is the real end goal, district leaders say. They’re not intent on chasing a new trend or even rethinking how school works, but they are concerned with preparing their students for what they see as an unavoidable future. Watkins, for example, pointed to a recent World Economic Forum report noting that employers are already seeking workers with AI skills, reorienting their business models around AI, planning to phase out roles that can be automated, and firing employees who don’t have specific AI skills or a desire to learn them. 

For her, the takeaway is simple: “AI is changing the job market, it’s changing how we work, and it’s changing the skills students will need,” she said. “All of that data is our why for figuring out how to get this right.”

Edutopia is an initiative of the George Lucas Educational Foundation.