The Rule-Makers Have Arrived
After two years of “figure it out yourself,” states are writing actual AI laws for schools. Whether that’s progress depends on what they put in them.
There are now 53 AI-in-education bills moving through 25 state legislatures. Idaho just signed one into law. Arizona is close behind. Florida wants statewide K-12 AI standards by July.
Two years ago, most states offered guidelines. Suggestions. Frameworks with words like “encourage” and “consider.” That era is ending. The language now is “require,” “mandate,” and “prohibit.”
The rule-makers have arrived. The question is whether they’re building guardrails or just building walls.
From Guidance to Gavels
The shift is striking. South Carolina’s bill would require parental opt-in for any use of AI and explicitly ban AI from replacing licensed teachers in core instruction. Colorado wants mandatory AI professional development for every teacher in grades 6 through 12. Maryland’s proposed Artificial Intelligence Ready Schools Act would require every district to adopt an AI policy aligned with the state's annually updated guidance.
These aren’t advisory memos. They carry enforcement mechanisms, compliance timelines, and consequences.
Meanwhile, the FTC’s amended COPPA rule takes full effect on April 22. That’s two weeks from now. The updated rule adds biometric data to the protected list, bans indefinite data retention, and carries penalties of up to $51,744 per violation. Every edtech company serving kids just got a hard deadline.
The regulatory machinery is moving fast. But fast and smart are different things.
The Ground Tells a Different Story
While legislators draft one-size-fits-all rules, the institutions actually doing the work are going in five different directions at once.
Inside Higher Ed profiled five colleges and their AI strategies. Among them: Agnes Scott is making AI ethics mandatory for every first-year student. Bryn Mawr turned its libraries into AI sandboxes with librarians leading the way. Cornell developed a 75-minute critical-thinking module that 7,000 students have already completed. DeVry is embedding AI literacy into every single course by year’s end.
No two approaches look alike. That’s not a bug. It’s what happens when institutions actually respond to their students instead of waiting for permission from a statehouse.
And then there’s the other end of the spectrum. Alpha School is preparing to open a K-8 campus in Texas, built on a two-hour AI-driven learning model with no traditional teachers. Experts are already asking the obvious question: are these kids learning, or are they just completing tasks?
The range between Bryn Mawr’s librarian-led AI sandbox and Alpha School’s teacherless classroom is enormous. Good policy would account for that range. Most of these bills don’t.
India Went Big. And That’s Worth Watching.
India’s CBSE just launched a national AI and computational thinking curriculum for students in grades 3 through 8. Starting this school year. For millions of kids.
The approach is interesting because of what it prioritizes. Younger students learn through puzzles and games woven into existing subjects. Older students engage directly with foundational AI concepts. The emphasis is on reasoning and ethics, not just tool use.
It’s also interesting because India didn’t wait for perfect conditions. It built a framework aligned with its National Education Policy and shipped it. Compare that to the U.S., where 25 states are still debating what to require.
There’s a lesson in the contrast. Perfect policy that arrives too late is worse than good-enough policy that arrives on time. Perfect has always been the enemy of good.
What Good Legislation Would Do
ISTE’s Chief Learning Officer Joseph South wrote something last week in EdSurge that should be required reading for every legislator drafting an AI-in-education bill. He argued that AI should eliminate busywork, not the struggle of learning. The distinction between “desirable difficulty” and unproductive friction is exactly the kind of nuance that legislation tends to flatten.
Good AI policy should protect the struggle. That means ensuring schools can still assign hard work that builds thinking, even when AI makes shortcuts easy.
Good AI policy should stay flexible. The five-college profile shows that context matters enormously. A rural community college and a research university need different approaches, not the same compliance checklist.
Good AI policy should have a shelf life. Any law written about AI today will be partially obsolete within 18 months. The best bills build in mandatory review cycles and sunset clauses. The worst ones lock in 2026 assumptions as permanent requirements.
The Real Risk
Fifty-three bills across 25 states sounds like progress. And some of it is. Requiring districts to have an AI policy is better than hoping they’ll get around to it. Protecting children’s biometric data is important.
But the gap between legislation and learning has never been wider. The technology changes quarterly. The bills move annually. The students are making decisions right now.
The rule-makers have arrived. The question isn’t whether we needed them. We did. It’s whether they’re listening to the classrooms before they write the rules.
The Learning Edge publishes twice weekly on the intersection of AI and education. If this resonated, share it with a colleague who’s navigating these questions too.