America's AI Report Card — Key Findings | aiedge.live
9.05/10
Massachusetts — #1. Vermont #2 at 8.80. Only two states above 8.0.
14
U.S. states have published zero AI education guidance for students.
2.30
New Jersey — lowest of all 36 scored states. #36 of states, #37 of jurisdictions (which include Puerto Rico).
0
federal standards for AI in schools. Each state writes policy from scratch.

Seven findings that change the conversation.

Finding #1
The Effort Crisis Finding — Only 5 states earn a strong rating on student cognitive development
5 of 37 jurisdictions
Only 5 of 37 scored jurisdictions (Massachusetts, Vermont, New Mexico, Utah, and West Virginia) earn a strong rating (8 or higher out of 10) on the Student Effort / Cognitive Development criterion, which measures whether a state's guidance protects what students should still do themselves before turning to AI. Vermont's January 2026 guidance is the only document in the study to name cognitive offloading directly. Most other states touch on student effort in passing, but do not treat it as a content standard. 62% of scored jurisdictions land at 6 or below on this criterion, making it the second-lowest-scoring of all 7 rubric categories. The guidance landscape treats AI as a tool to be allowed, restricted, or taught, without distinguishing which cognitive tasks AI should not replace.
Criterion: Student Effort / Cognitive Development (15% weight). Strong rating defined as a score of 8.0 or higher. Adapted from UNESCO Human Agency competency and TeachAI Principle #7.
Finding #2
The Policy Gap — 33 of 36 states issued voluntary guidance. Districts can ignore it.
Only 3 states
Among the 36 scored states, only three have enacted laws requiring schools to act: Ohio (HB 96, districts must adopt AI policies by July 1, 2026), Tennessee (SB 1711, mandatory district policies since March 2024), and Utah (HB 218 + HB 273, mandating AI literacy in a required digital skills course and integrating AI into K-12 CS standards). A mandate does not guarantee quality: Ohio (#36) and Tennessee (#35) rank near the bottom of the study, and only Utah (#7) pairs its mandate with strong guidance.

Tennessee is the clearest illustration of the mandate paradox: the law compels every district to adopt a policy, but it does not specify what the policy must contain. The result is a compliance floor, not a content standard. Districts can meet the mandate by pasting boilerplate into a handbook without teaching AI literacy, protecting student thinking, or training a single teacher. Tennessee ranks near the bottom of the study precisely because the mandate drove adoption, not quality.

Every other state's guidance is advisory. There is no enforcement mechanism — a district in Colorado, Georgia, or Michigan can read the state's AI guidance document and do nothing. The U.S. has no federal AI education standard, no minimum hours requirement, and no national accountability structure.

† Idaho enacted SB 1227 (2026) mandating a statewide AI framework, but had not published guidance as of our scoring date — keeping it among the 14 unscored states. New Jersey's AI literacy bill (A4352/S2862) passed both chambers as of April 2026 but had not been signed into law.
Source: MultiState Insider legislative tracker, through April 9, 2026.
Finding #3
The Teacher Training Gap — 86% of early adopters offer real PD. Almost no one else does.
86% vs. <1.5%
Among the 183 publicly documented AI early-adopter districts in the CRPE Early Adopter Database, 86% offer sustained, multi-session professional development tied to implementation benchmarks. That sounds encouraging until you do the math: 183 districts is less than 1.5% of the approximately 13,000 U.S. school districts. The teacher training happening in the vanguard districts is not reaching the other 98.5%.
Source: CRPE 2025-26 Early Adopter Database. "Sustained, multi-session PD" defined per CRPE methodology (not one-time webinars).
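The coverage math above can be checked directly. A minimal sketch, using the report's own figures (the 13,000 total is the report's approximation of the U.S. district count, not an exact census):

```python
# Verify the Finding #3 coverage claim: 183 documented early adopters
# measured against roughly 13,000 U.S. school districts.
EARLY_ADOPTERS = 183
TOTAL_DISTRICTS = 13_000  # report's approximation, not an exact count

share = EARLY_ADOPTERS / TOTAL_DISTRICTS
print(f"Documented early adopters: {share:.2%} of all districts")  # → 1.41%
```

The 1.41% result is consistent with the report's "less than 1.5%" framing.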
Finding #4
The Student Gap — Districts track AI for staff. Rarely for students.
8% of districts
Within the CRPE deep-analysis subsample of 79 early adopters with full course and standards data, only 27% publicly share information about AI courses for students. Only 8% have adjusted learning standards to account for AI. Districts are building infrastructure and training teachers — but the evidence of that investment reaching students in structured, documented ways is thin.
Source: CRPE 2025-26 Early Adopter Database, subsample of 79 districts with full course/standards data.
Finding #5
The Surveillance Paradox — Schools use more AI to watch students than to teach them.
6M+ students monitored
AI-powered student surveillance is the dominant K-12 AI deployment in the United States — not instruction. Gaggle alone monitors roughly 6 million students across approximately 1,500 districts (Gaggle corporate data). GoGuardian was deployed by 7,000+ schools and districts as of 2021 (U.S. Senate Committee on the Judiciary investigation). Combined with Bark, Securly, and similar monitoring tools, AI-assisted student surveillance reaches tens of millions of students — a scale that dwarfs documented AI-for-instruction adoption. Schools are more comfortable deploying AI to watch students than to teach them.
Sources: Gaggle corporate disclosure; U.S. Senate Committee on the Judiciary, investigation into student surveillance tools, 2021.
Finding #6
The Bright Spots — Districts leading without waiting for their state.
183 documented
Iowa City Community School District built a required AI curriculum with no state guidance. Students in San Ramon Valley Unified (CA) built their own AI-powered study app. Students in Bullitt County (KY) built an anti-bullying AI tool. These 183 publicly documented early adopters show what's possible — but they also reveal how much depends on individual district leadership, budget, and initiative rather than systemic state policy.
Source: CRPE 2025-26 Early Adopter Database; district AI documentation verified through public sources.
Finding #7
The Stale Guidance Problem — Nearly half of states have not updated their primary AI guidance since 2024.
18 of 37 jurisdictions
The document dates in this study reflect when each state published its primary guidance document — not when we scored it. Our April 2026 verification sweep found that some states had issued meaningful updates (new versions, revised frameworks, supplementary toolkits) since original publication. After accounting for those updates: 4 jurisdictions have guidance from 2026 (Vermont, California, Kentucky, North Carolina). 15 have guidance from 2025. 17 still have primary documents unchanged since 2024. One state — Oregon — has not issued new guidance since 2023. That means 18 of 37 scored jurisdictions (49%) are operating on documents that predate GPT-4o, Claude 3.5, DeepSeek, and a full year of rapid AI adoption in classrooms. The technology kept moving. Nearly half the states did not.
Dates sourced from individual state DOE documents as catalogued by aiforeducation.io and verified through direct DOE site review (April 2026).
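The vintage counts in Finding #7 can be tallied from the report's own year breakdown; a quick sketch:

```python
# Tally Finding #7's guidance vintages: scored jurisdictions per
# publication year of their primary guidance document.
vintages = {2026: 4, 2025: 15, 2024: 17, 2023: 1}

total = sum(vintages.values())
stale = vintages[2024] + vintages[2023]  # primary document from 2024 or earlier

print(total)                          # → 37
print(stale, f"({stale/total:.0%})")  # → 18 (49%)
```

The four year-buckets sum to all 37 scored jurisdictions, and the 2024-or-earlier group yields the 18-of-37 (49%) stale-guidance figure cited above.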

Top 10 States — And the Bottom 10

All 37 jurisdictions scored on a 7-criterion weighted rubric (1–10 scale). Scores updated April 2026 following a full landscape sweep of all jurisdictions. Arrows indicate score changes from October 2025 baseline.

How to read the rankings: Rankings run #1–#37 because they include 36 U.S. states plus Puerto Rico (a U.S. territory). New Jersey is #36 among U.S. states and #37 among all 37 scored jurisdictions. The 14 U.S. states that published nothing carry no rank at all — they fall entirely outside the scoring system. Ranked last and unranked are two different things.

Top 10

#    State            Score   Change
1    Massachusetts    9.05    ↑ New
2    Vermont          8.80    ↓ 0.30
3    North Carolina   7.90
4    New Mexico       7.85
5    Maine            7.80
6    Washington       7.70
7    Utah             7.65
8    California       7.65
9    Oklahoma         7.35
10   Montana          7.20

Bottom 10 (of scored states)

#    State            Score   Change
37*  New Jersey       2.30    ↓ 1.80
36   Ohio             4.40    ↓ 2.40
35   Tennessee        4.95
34   Alabama          5.15
33   Indiana          5.70
32   Michigan         5.75
31   Oregon           5.85
30   Minnesota        5.90
29   Connecticut      5.90
28   Louisiana        6.00

* #36 among U.S. states; #37 among all scored jurisdictions, which include Puerto Rico.

Full alphabetical listing of all 37 scored jurisdictions available at aiedge.live/report-card. Downloadable dataset: Americas-AI-Report-Card-Data-Study.xlsx.

14 U.S. States Have Published Zero AI Education Guidance

These states were not scored because there is no document to score. Their absence is itself a finding — but not all absence looks the same.

No formal guidance published

Arkansas · Florida · Idaho* · Illinois · Iowa* · Kansas · Maryland · Nebraska · New Hampshire · New York · Pennsylvania† · South Carolina · South Dakota · Texas*

* Iowa, Texas, Idaho: Legislation passed requiring guidance — none published as of April 2026. Idaho's law requires guidance by July 2026.

† Pennsylvania: Passed AI safety legislation (deepfakes, minors). No guidance on how students should use AI as a learning tool. Protection without preparation.

Note: Puerto Rico is a scored U.S. territory (#18, score 6.90). The 14 states listed above are U.S. states only (out of 50). Ten of these 14 show emerging activity (task forces, coalition documents) but no formal DOE guidance.

How the U.S. Compares to Countries That Have Committed

200M+
China — Mandatory for every student. And escalating.
China made AI education mandatory for all 200+ million primary and secondary school students on September 1, 2025 — minimum 8 class hours per year, from age 6. On April 8, 2026, China's "AI+ Education" Action Plan (Ministry of Education + 4 agencies) mandated AI integration through lifelong education and added AI literacy to the national teacher qualification exam. Target: comprehensive AI literacy for all citizens by 2030. Source: People's Daily, April 2026.
27
EU member states — AI literacy required by law since February 2025.
Article 4 of the EU AI Act (effective February 2025) requires any organization deploying AI systems to ensure sufficient AI literacy of staff. This is a binding legal requirement across all 27 EU member states. Schools must demonstrate compliance. The U.S. has no federal equivalent and no binding standard at any level.
Age 9
Singapore — Structured AI literacy starting in primary school.
Singapore's Ministry of Education integrates AI literacy beginning at Primary 4 (age 9-10) as part of its EdTech Masterplan 2030. AI tools are intentionally withheld at lower primary grades to protect foundational development. Their AI-in-Education Ethics Framework is built on four principles: Fairness, Accountability, Transparency, and Safety. The U.S. has no national AI curriculum at any grade level.
Restarted
South Korea — Stalled, then restarted with new government pressure.
South Korea set an aggressive target of AI integrated across all K-12 schools by 2025. The plan stalled when Parliament reclassified AI textbooks as supplementary materials. As of 2026, the new government is pushing again: mandatory AI courses at national flagship universities, expanded K-12 programs, and significant infrastructure investment. South Korea has a national goal and a national timeline. The U.S. has neither.

How This Study Was Conducted

Scoring Framework

  • 37 jurisdictions scored (36 states + Puerto Rico) using publicly available state DOE AI guidance documents
  • 7-criterion weighted rubric (1–10 scale), adapted from TeachAI, UNESCO AI Competency Frameworks, and AI4K12
  • Multi-source verification: Each state's guidance was first identified through aiforeducation.io (primary aggregator), then cross-verified against the state's own Department of Education website (primary source), the TeachAI Policy Tracker, the FutureEd state legislative tracker, MultiState Insider, Ballotpedia, and the Center for Democracy & Technology (CDT). Where sources disagreed, the state DOE document served as the authoritative source.
  • Two-pass scoring: First pass catalogued all available guidance documents as of October 2025. Second pass (April 2026) searched for policy updates, new legislation, and programs published since. State document dates reflect when each state published its primary guidance — not when we scored it. After our April 2026 verification sweep: 17 states remain at their original 2024 document; 15 states have 2025-dated guidance; 4 jurisdictions have 2026 guidance (Vermont, California, Kentucky, North Carolina); 1 state (Oregon) has not updated since 2023.
  • 17 of 37 scored jurisdictions received score adjustments after the April 2026 sweep (new documents, legislation, or programs found)
  • Tiebreaker rule: When weighted scores are close, the Student Effort / Cognitive Development sub-score serves as the tiebreaker criterion. Under the Tightened Rubric v2, Massachusetts leads at 9.05 and Vermont follows at 8.80. Vermont is the only jurisdiction that explicitly names cognitive offloading in its guidance (Student Effort sub-score 9 vs. Massachusetts' 8), but Massachusetts' strength across all other criteria gives it the higher overall weighted score.
  • Scored vs. unscored status: New Jersey is scored (2.30) because its Department of Education published "AI in Education: Guidance and Considerations" in March 2024, which is sufficient to score against the rubric. Idaho is unscored because SB 1227 (2026) requires a statewide AI framework by July 2026 but no guidance document has been published yet. Published guidance is the threshold for being scored; legislation without a published document is not.
  • Two-rater scoring: Both Kevin J. Roberts, M.A. (author of The Effort Crisis) and Henry Dan, B.S. in Data Science (machine learning emphasis, Post University), scored independently against the pre-specified rubric. Discrepancies were reconciled by consulting the source document together. We flag the small rater panel as a known limitation; the Q3 2026 update will expand the panel and report inter-rater reliability (Cohen's kappa) by criterion. The full rubric is published for public audit and re-scoring.
  • Reweighting sensitivity: The top of the table holds under alternate weighting schemes. Dropping Student Effort (the criterion tied to The Effort Crisis) to zero weight does not move Massachusetts and Vermont out of the top 5. Equal-weighting all 7 criteria produces substantively identical top/bottom rankings. The rubric does not structurally advantage Kevin's thesis.
  • CRPE district sample disclosure: The CRPE Early Adopter Database (183 districts, with a 79-district deep-analysis subsample) is a self-nominated, non-random sample of districts that publicly identified themselves as AI early adopters. It is not representative of the roughly 13,000 U.S. school districts. The "less than 1.5% of districts" framing is a comparison of this documented-leader sample against the total district population, not a national prevalence study.
  • Data vintage: State documents through October 2025 (first pass) and April 2026 (verification sweep); legislative actions through April 9, 2026; district data from CRPE 2025-26 Early Adopter Database.
  • Funding and disclosure: This study is self-funded. No external funding, sponsorship, grants, or organizational ties. The only financial interest connected to the findings is the sale of Kevin J. Roberts' book The Effort Crisis (Amazon, April 2026).
  • Respond to the data: Scoring corrections, state updates, new guidance documents, and alternate assessments welcome. Email kevin@kevinjroberts.net. Corrections will be incorporated in the Q3 2026 update (July 2026).

The 7 Criteria (weighted): Teach vs. Ban (20%) · AI Literacy (15%) · Critical Thinking (15%) · Student Effort / Cognitive Development (15%) · Teacher Training (15%) · Assessment Adaptation (10%) · Equity (10%)
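As an illustration of how the published weights combine into a composite score, here is a minimal sketch of the rubric arithmetic. The weights are the ones listed above; the sub-scores are hypothetical and do not correspond to any scored jurisdiction. The equal-weight variant mirrors the reweighting sensitivity check described in the methodology.

```python
# Illustrative rubric arithmetic. Weights are the published ones;
# the example sub-scores below are hypothetical, not real state data.
WEIGHTS = {
    "Teach vs. Ban": 0.20,
    "AI Literacy": 0.15,
    "Critical Thinking": 0.15,
    "Student Effort / Cognitive Development": 0.15,
    "Teacher Training": 0.15,
    "Assessment Adaptation": 0.10,
    "Equity": 0.10,
}

def weighted_score(sub_scores: dict) -> float:
    """Weighted composite on the 1-10 scale, rounded to two decimals."""
    return round(sum(WEIGHTS[c] * s for c, s in sub_scores.items()), 2)

def equal_weight_score(sub_scores: dict) -> float:
    """Equal-weighting variant used in the sensitivity check."""
    return round(sum(sub_scores.values()) / len(sub_scores), 2)

# Hypothetical jurisdiction scoring 8 on every criterion except Equity (6):
example = {**dict.fromkeys(WEIGHTS, 8), "Equity": 6}
print(weighted_score(example))      # → 7.8
print(equal_weight_score(example))  # → 7.71
```

Because Equity carries only a 10% weight, lowering that one sub-score moves the weighted composite less than it moves the equal-weight average.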

Full rubric, crosswalk, and state-by-state data: Americas-AI-Report-Card-Data-Study.xlsx (free download)

Full methodology: aiedge.live/report-card (Methodology section)

Request the Full Dataset or a Press Interview

The complete Excel dataset (9 tabs, all 37 states, full rubric, scoring crosswalk, pitch angles, and district targets) is available to journalists and researchers upon request.

Press Inquiry or Dataset Request
View Full Report Card →