Education Policy Brief #208 | Steve Piazza | August 17, 2025
As the Trump Administration carries out its crusade to reduce the size of government, one of the targets has been the Department of Education (DOE). The DOE is made up of a number of agencies and offices that have been severely impacted by these actions, one of which is the Institute of Education Sciences (IES).
The IES, considered the evaluative and statistical arm of the DOE, is charged with, among other things, producing and reporting the results of the National Assessment of Educational Progress (NAEP, often referred to as the Nation’s Report Card). This enormous task is performed by one of the four centers under IES, the National Center for Education Statistics (NCES).
Such reductions have far-reaching implications for assessing the progress of students across the country and in comparison to the rest of the world. More than record keeping, these analyses inform the development of policy and decisions about funding to states and local education agencies nationwide. Relevant comparisons also help identify disparities so that support finds its way to where it’s needed most.
Not coincidentally, it appears that the NCES missed its deadline for reporting this past year’s results, most likely because its staff of 800 has reportedly been reduced to a handful by the administration’s actions. The DOE says the reports will be handled by another agency, but what has resulted for this round is a vague snapshot rather than the in-depth view that is such a valuable resource for states.
The effect this might have on student achievement is anyone’s guess, but having such statistics unavailable should alarm lawmakers and policymakers as they run head-on into the complexity of basing decisions on incomplete data about student performance.
Policy Analysis
The danger of diminishing or eliminating the important work of national data collection on student performance and graduation is best understood in terms of what that work does and does not provide.
The NAEP, administered periodically to grades 4, 8, and 12, provides comparative data on how students in the country perform as a whole. For example, the reporting shows that the average 2024 scores for all students in Grades 4 and 8 in Math went up or stayed the same relative to the previous testing cycle (2022), while Reading scores decreased over that period. All of these remain below pre-Covid levels.
Drilling down further reveals that individual states are not always consistent across grade levels and subjects. Again, using the most recent reporting year (2024), a state such as Massachusetts leads the country in every grade tested in Math and Reading, whereas Florida is near the top in 4th Grade Reading but below the national average in the remaining categories.
It will be interesting to see whether these patterns hold the next time the NAEP is administered in 2026. There is already some confusion on the NCES site: it states that Math and Reading assessments for 4th and 8th grade are scheduled for January 2026, but then goes on to say it is also piloting new assessments for 4th, 8th, and 12th grades in public schools as it updates its testing frameworks. The reason for omitting private schools from the pilot is not provided, but the omission can only degrade the integrity of the results.
The tracking of scores around high school graduation is more complicated, since fewer secondary students take the NAEP because it is voluntary. Still, the comparison is available and can offer some insight. It should be noted that IES does not report on graduation exams, because states are charged with setting standards and administering all such tests, and states may differ drastically. At present, only six states have graduation exit exams: Florida, Louisiana, New Jersey, Ohio, Texas, and Virginia. Massachusetts and New York have eliminated them or are in the process of doing so. Overall, this is a reduction of 18 states since 2013.
On a global scale, the Program for International Student Assessment (PISA) provides valuable statistics on how U.S. students stack up against the rest of the world. For example, the last time the PISA was administered (2022), the U.S. ranked 31st in Math and Reading, according to World Population Review.
NCES oversees U.S. involvement with the PISA. Since each nation takes a different approach to educational governance, negotiating multiple international bureaucracies with reduced personnel is complicated enough. The pandemic, not the Trump Administration, caused the past test cycle’s one-year delay; but given the unsteadiness of the current round of testing (already delayed from its scheduled Spring 2025 administration) and the recent decision to lengthen the interval between future tests from three years to four, the recent chaos created by the Executive Branch is the most likely cause.
The future of PISA might be unclear, but what is clear is that without it we cannot see whether the U.S. progresses beyond where it stood in 2022. Reducing the amount of information available could artificially improve that ranking, but playing the statistics game is not the same as teaching students to read or solve math problems. It is vital the country knows that the U.S. has been slipping, and without these results it will be hard to measure where it stands.
Overall, if these comparisons did not exist, or if results were incomplete or tampered with, states might find themselves going it alone; managing funds would become an exercise in arbitrary decision-making that could only lead to grumblings of even more political inequities than already exist. Many students could be left with an unrealistic understanding of where they rank, particularly when they have an eye toward college.
There’s a reason independent agencies like these exist. Simply put, large amounts of data, however useful, are cumbersome. Without impartial yet scientific analysis to contain and interpret it, that data becomes meaningless and subject to arbitrary conclusions that only muddle students’ standing, at home and abroad.
We have seen what happens when politics plays into the reduction and manipulation of available data. This should sound an alarm: students may grow up into adults unable even to understand how the weakened system has failed them.
Engagement Resources
FairTest provides information on testing best practices and works toward improving the benefits that student assessment can provide.
The Education Commission of the States provides a good overview of student testing here.