Preparing Our Schools: AI Guardrails for State and School District Leaders to Consider

Explore how states and districts can responsibly harness AI in K‑12 education with clear guardrails that protect student privacy, support teachers and ensure safe, effective learning tools.


Artificial intelligence, or AI, is becoming a part of our everyday lives. It affects the way we travel, the way we document our lives and maybe even, in the not-so-distant future, the way we buy homes.

That means it’s also showing up in schools, where students are using it to receive personalized feedback and as a thought partner for brainstorming ideas or writing essays. 

Many headlines focus on the potential downsides of AI for the learning process: Students are using it to cheat. It’s become a shortcut around foundational skills. Is it making our kids dumber?

But with appropriate guardrails, AI also can be a tremendous asset in the classroom for students and educators alike. From personalized math tutors to adaptive reading assistants, there’s enormous potential to help students master difficult concepts and meet them where they are on their learner journeys. 

However, without clear rules of the road, we risk compromising student privacy and the overall integrity of the learning experience. 

That’s why ExcelinEd is releasing our “Model Policy: Guardrails for AI-Powered Educational Tools in K-12 Schools.” This policy provides a framework for state leaders and school districts to ensure that the benefits of AI in the education space don’t come at the cost of safety. 

It’s important to note that this model policy is just a first step. A starting point. As AI continues to evolve—and as schools discover new and better ways to use it—our policy will evolve, too. We expect these guardrails to grow and adapt alongside the technology so that states can continue empowering educators and students in a rapidly changing world.

How Students Are Using Artificial Intelligence Today 

New research from the College Board reveals a significant and rapid increase in the adoption of generative AI among U.S. high schoolers, with 84% of students reporting in May 2025 that they use these tools for schoolwork. 

Students primarily utilize AI for brainstorming ideas, editing essays and conducting research, but they remain deeply divided on whether the educational benefits outweigh risks like misinformation and overreliance.  

Roughly 60% of parents and a large majority of school administrators agree it’s better for students to use generative AI for schoolwork than not, but a healthy dose of more general skepticism and a critical gap in formal guidance persist. Nearly 40% of schools ban the technology entirely, and others have no formal policy in place.

As states, schools and classrooms adapt, our model policy makes recommendations in four key areas we believe are foundational to harnessing the power of AI in K-12 educational settings: 

Data Privacy: Putting Students First 

In the age of AI, data is currency. However, a student’s personal information should never be for sale or used to train a corporate algorithm. Our policy holds all AI providers to strict data-handling requirements that put students first.

Operator Accountability: Transparency as a Requirement 

We shouldn’t have to guess how an AI tool makes decisions. To build trust, our policy requires AI operators to maintain an open-book policy with schools and parents.

Software Design: Education, Not Addiction 

AI should be designed to spark curiosity, not compulsive use. We’ve already seen the harms from social media engagement loops; we cannot allow those same tactics in instructional software. Our model policy requires instructional tools to be designed for learning, not for maximizing engagement.

Defining the Roles of the State and School District 

We can’t say this often enough: Consistency is key when it comes to AI in education.  

Instead of 1,000 different districts trying to vet 1,000 different tools, the policy tasks the State Department of Education with maintaining a list of reviewed and approved tools. This streamlines the process for local schools while ensuring every piece of tech in a student’s hands has met a high bar for instructional alignment, privacy and accessibility. 

A Path Forward for AI in K-12 Education 

This policy isn’t about slowing down technology. It’s designed to make sure technology serves the student. By setting basic guardrails now, we can move toward a future where AI tools are trusted, effective and transparent partners in education. 

Our model policy is a starting point for a conversation at the state level that will span the years to come. We have intentionally left blank a section related to student learning impact, as we are still receiving input on the appropriate standard to set in this area. We welcome your feedback and look forward to collaborating.
