How to Harness AI in the Classroom
Collaborators at the Deeper Learning New York Conference discuss how they’re using AI as a tool for teaching and learning
As more and more students use artificial intelligence (AI) for educational purposes, an “abstinence-based” policy is not a viable path forward for school and district leaders. Instead, we should focus on creating an atmosphere in which students and teachers use this powerful technology safely.
At the annual Deeper Learning New York (DLNY) conference, hosted by Ulster BOCES and focused on the theme “Leading for Deeper Learning,” we recently presented a Deep Dive session on using AI in the classroom. In this wide-ranging half-day session, we discussed how AI's ability to aggregate and analyze vast amounts of information allows teachers and students to customize teaching and learning—but only when it is used responsibly.
Here are some of the key takeaways from the insightful conversations we had with district leaders at DLNY.
Embracing AI as a Tool
Our presentation began with an interactive game show in which we asked participants to determine whether a given piece of work was created by AI or by a human. We called it “The Bot or Not Game Show.” The goal was to spark a conversation about AI and challenge common assumptions about the technology. For some educators, AI has a status similar to that of calculators when they were first introduced: it seems like a shortcut that bypasses actual learning. Of course, that attitude changed, and calculators are now widely seen as indispensable tools. We see AI as a tool for learning in the same way that probes are a tool in science classes: it changes the way students do the work, but it doesn’t do the work for them.
We discussed how AI is in a period similar to the early days of social networks. Some educators have adopted what we call “abstinence-based” policies, but our hope is that schools won’t miss the opportunity to embrace AI in the way that many of us missed the opportunity to use social media as a teaching tool. To do that, of course, teachers and students need guidance from school district leaders.
Our district has had many discussions about how best to support responsible use of AI. While we don't yet have strict, written guidelines in place, we remain focused on student data privacy and academic integrity. Our general rule for teachers is, “Unless the tech department has purchased the tool, don’t input any student data into it.”
Tools for Students and Teachers
While we urge our teachers to be cautious, we use multiple AI tools to generate creative work and encourage students to leverage AI to boost their creativity across various media. For example, in creative writing, using Grammarly frees students to focus on expressing their ideas rather than worrying about grammatical errors.
Tech & Learning Newsletter
Tools and ideas to transform education. Sign up below.
Another tool we use to support teaching and learning is School AI. One of its most notable features is its student-facing generative bot, which doesn’t just generate text; it asks guiding questions to help students develop and refine their own ideas and clarify misunderstandings. The goal is to enhance the learning process rather than do the work for students, creating a personalized learning experience that mimics one-on-one teaching. A great example of this technology in action is in special education, where teachers can input IEP goals, such as writing objectives, into the program, and the chatbot will act as a personalized tutor, guiding students toward those goals and providing individualized support along the way.
Another tool we use in our district is Ink Wire, which assists students in creating portfolios of their work in STEM programs. Here, AI takes a more supportive role, helping teachers with lesson planning and students with refining their writing for their portfolios. The purpose of these technologies is clear: while they handle some of the more tedious tasks, they do not replace the work students need to do to learn. Instead, these tools help personalize instruction and streamline processes, allowing students and teachers to focus on teaching and learning.
A Hippocratic Oath for AI
We introduced an intriguing concept during our session: an educator’s Hippocratic Oath for AI. While still in its early stages, the idea revolves around fostering open discussions about the responsible and ethical use of AI in the classroom, something we believe was missing during the rise of social media, when some educators were told, “Don't talk about it, don't let students use it,” an approach that left kids to make mistakes on social media on their own. As educators, it’s our job to teach students how and when to use AI, just as it’s our job to teach them how to be safe, how to be good people, and how to interact with others.
During the conference, as is the case in our district, the consensus was that an abstinence-based AI policy is not the most beneficial approach for teachers and students. Engaging in discussions about how to use AI, and keeping an open mind about reframing concepts such as creation, plagiarism, and cheating, will be more productive than simply saying “no.” Before the current school year began, our district held a two-day workshop with 40 teachers from a variety of disciplines, including K-12, special education, and reading. We discussed AI through the lens of student privacy.
As AI continues to evolve and educators learn to embrace it, we’re deeply excited about its potential. The teachers who joined the workshop felt the burden of responsibility to spread the word about AI. We look forward to seeing how we can shift teachers’ mindsets and help our students know when and how to use (and not to use) AI.
Dr. Andrew Taylor is the Director of Technology and Innovation for Chappaqua School District.
Dr. Ellen Moskowitz is the Director of Technology and Innovation at the Croton-Harmon Union Free School District.