As artificial intelligence reshapes the education landscape, raising issues of inequity alongside seemingly endless possibilities, the nation’s largest teachers’ union is seeking to address classroom use of the technology through policy action.
The roughly 6,000 delegates voted by voice to approve a policy statement on July 4, the first day of business at the union’s annual representative assembly, where delegates representing the organization’s roughly 3 million members vote on priorities, budget, and strategic vision for the upcoming year.
NEA’s policy statements, like its resolutions, remain in effect permanently and are updated every few years.
The statement broadly seeks to lay the foundation for what educators should advocate for in policies, practices, and professional development to use AI safely and equitably in their classrooms. A report released alongside the statement provides further guidance and recommendations for implementation.
The statement examines issues of equity, data protection, and environmental impact, and pushes for educators to be involved in discussions about classroom implementation. It emphasizes, however, that humans are central to teaching, arguing that AI shouldn’t be used to replace educators’ jobs or to supplant learning.
Roughly a dozen states have issued guidance on the technology, according to a report from the task force, which was convened at last year’s assembly to study AI. But there’s no consistency to the policies cropping up at the state and district levels, said Noel Candelaria, a special education teacher and former teacher aide from El Paso, Texas, and secretary-treasurer of the NEA.
The use of AI has grown rapidly in the past year: the Center for Democracy & Technology found that the share of teachers who reported using AI tools increased by 32 percentage points between the 2022-23 and 2023-24 school years. But even as its prevalence grows, 71 percent of educators have received no professional development on using AI in the classroom.
Many teachers have reported receiving scant guidance from school leadership about how to use the technology, with some embracing it as a “game changer” for administrative work and others more skeptical of its place in schools because of concerns about data privacy and cheating.
NEA leaders hope the policy will help conversations begin where no policies exist, or prompt evaluation of existing policies, Candelaria said.
“We’re positioning ourselves to make sure that our members have the tools that they need,” said Candelaria, who served as chairman of the task force. “So if their state is not having the conversation, but we know that at the district level they are, then how can we provide guidance for them to be able to make sure that they’re asking the right questions, to ensure that educators are at the table, not as an afterthought?”
The need for a policy statement from the NEA arose during the 2023 representative assembly. After the topic repeatedly came up, NEA President Becky Pringle convened the task force, which included educators from K-12 and higher education, a school psychologist, and others.
“Artificial intelligence has evolved into a permanent fixture in our communities and schools. Using these new tools equitably, fairly, and safely is essential for our nation’s educators to guide and inspire their students and classes,” Pringle said in a statement. “Utilizing this technology in a manner that supports invaluable face-to-face relationships between educators and students as well as effective pedagogy should always be a priority.”
NEA’s policy is composed of five key principles for educators: keeping students and teachers at the center of education; using evidence-based technology to enhance learning; developing AI ethically, with strong data protection practices; ensuring equitable access to and use of the technology; and providing ongoing education about AI.
Educators grapple with the challenges AI presents while embracing its potential
The policy statement contends with the pitfalls of AI, particularly its gaps in equity and access. Task force members point to the fact that developers tend to look a certain way: male, white, cisgender, heterosexual. That homogeneity creates bias in how AI functions, particularly for marginalized students. Inequity also exists in which students are able to benefit from the technology.
The policy calls for educators to be “intentional and proactive” in preventing bias from affecting how students use the technology. It also cautions against allowing students with disabilities and emergent multilingual learners “to be relegated to using AI only for rote memorization, standardized assessment, or answers to factual questions.”
“One of the things that I am most concerned about is exacerbating the digital divide,” said Wil Page, a task force member and Los Angeles Unified School District teacher. “We can’t have that.”
With policies being rolled out inconsistently from state to state and district to district, more economically advantaged districts could not only embrace the technology faster but also offer professional development to use it effectively.
“You could have kids who live in communities that are adjacent to each other, who end up going off to the same career or college pathway that have a massive difference in their artificial intelligence efficacy,” Page said.
But that’s where the potential comes in, too, said Candelaria. Educators are preparing a generation of future developers who will come from all backgrounds and abilities, he said. AI literacy is important so students can understand the technology’s shortcomings, its potential for misinformation, and simply how to use it.
“There’s a lot of work to do in that area, but I think we have a tremendous responsibility as educators to prepare the next generation of AI developers,” he said.
Rural communities, such as those in task force member Angie Powers’ native Kansas, also face gaps in access. Though broadband connectivity has improved since remote learning during COVID-19 forced the nation to contend with its deficiencies, Kansas still has a ways to go, she said.
But those rural communities also bear the environmental burden of AI, something the policy statement pushes educators to confront. Rural areas are often the sites of the data centers that house the computing and server power fueling AI, and the energy those centers consume drains local resources such as water, Powers said. Generating one image through AI uses the same amount of energy as fully charging a cell phone, according to the task force’s report.
“That’s changing not only our digital spaces, but has an impact on our physical spaces,” said Powers, a high school teacher in Kansas City. “And the students in our classroom care about this, because that’s going to impact them the rest of their lives.”
The policy states the NEA will advocate at the federal, state, and local levels “for the environmental impacts of AI to be considered in the decision-making processes around the development and application of AI tools.”
One day the technology might evolve to require less energy. But until then, Page added, “It’s going to be incumbent on educators to really talk about that aspect of it so our students, and our educators, are utilizing it as responsibly as possible.”
The task force also highlighted ways the technology could simplify a teacher’s day-to-day workload. For new teachers, who are juggling classroom management and pedagogy, AI could be a partner in developing engaging lesson plans or streamlining administrative tasks. Page thinks it could help with recruitment and retention as teachers face burnout and quit.
“If all of the sudden you could start to utilize tools to do administrative tasks and to really sort of help you really just be able to learn the craft of being an educator and from there grow and allow for your students to have fun and you to have fun, maybe we get more who are staying longer and who realize the value of this as part of the common good,” he said.
The policy statement asserts that educators “must be afforded high-quality, multifaceted, ongoing professional learning opportunities that help increase their AI literacy and understand what, how, and why specific AI is being used in their educational settings.”
The possibilities for it to assist students with disabilities also feel limitless, said Powers.
“It’s like that idea with the sidewalk cutouts,” she said. “We developed those for a specific reason, but parents with strollers benefit from it. So as we use AI to lift up our students with disabilities, I think we’re going to find surprising ways perhaps that AI can be used to help support students maybe with disabilities that haven’t been diagnosed, [or who] maybe just learn a little differently, to appeal to their interests a little bit more. We don’t even know what all the possibilities are yet, and that’s really, really exciting.”
The importance of a human behind the technology—and the teacher-student connection—remains vital, task force members said. They cautioned against schools using AI for disciplinary actions, determining grade-level promotion, diagnosing students, or evaluating educators’ job performance. The policy envisions “AI-enhanced technology as an aid to public educators and education, not as a replacement for meaningful and necessary human connection.” NEA also advised that it “should never be used for high-stakes or determinative decisions.”
“AI uses great amounts of data to spit out output. But that’s not the same as me knowing your story and knowing that maybe you’re hungry, and maybe that’s what you need to learn today,” Powers said. “AI can’t necessarily do that for you.”