For the nation, this year’s NAEP results have garnered a lot of attention, shining a light on low performance, minimal progress, and large achievement gaps. However, while NAEP has brought renewed attention to the severity of the state of public education, it still hasn’t shown us why large financial investments and policy innovations have seemingly yielded no returns. Conversations about “why” are more complicated, as ideology and politics can influence one’s interpretation of why a problem exists. Yet it is more important than ever to identify the root causes of these problems so that we can begin to see movement on the Nation’s Report Card. For more information on what NAEP showed us, read on below.
The National Assessment of Educational Progress (NAEP) is a congressionally mandated test administered across all states by the National Center for Education Statistics (NCES), part of the U.S. Department of Education. First administered in 1969, the test is given every two years to similar groups of students across the country, allowing direct comparisons across states and the monitoring of progress over time. It is the only mechanism for stakeholders to compare results across all 50 states and a selection of large districts, including Denver, CO.
NAEP vs. CMAS
NAEP is a nationally administered test that allows for comparisons across states of student groups and academic progress. Only a statistically representative sample of schools and randomly selected students take NAEP.
CMAS is specific to Colorado and is intended to measure how students are performing against the specific content-area standards that Colorado policymakers have deemed necessary for success after high school.
CMAS aims to measure how all students are doing relative to these standards, and CDE encourages all students to participate in order to provide a complete picture of how Colorado students are performing. Additionally, when all students take the assessment year over year, families and students can better track their progress, and the state can measure student growth. CMAS scores, however, don’t allow us to compare our progress to that of other states.
What happened?
Across the country, most states and large districts saw either no change or a statistically significant decrease in scores. Experts and pundits have many different theories about why this may be, often shaped by the frame they bring to the education policy debate: residual impacts from the Great Recession, federal policy issues, a combination of the two, or other reasons entirely. This drop is disheartening, but not wholly different from a decade of largely stagnant achievement.
Why does it matter?
This stagnant or worsening progress is especially alarming given how low proficiency was to begin with. In Colorado, 44% of 4th graders met standards in Math and 40% met standards in Reading. Those proficiency rates were lower for 8th graders. Not even half of our students are meeting expectations, a problem exacerbated by the fact that which students fall short is not random. While Colorado’s achievement gaps were narrower than national achievement gaps, there remain significant gaps by race and ethnicity, income, and English language learner status.
What does this mean for Colorado students?
CDE’s press release stresses that Colorado students performed a few percentage points higher than the national averages across several groups. However, as the release states, “Colorado’s average scores have not changed significantly in about 10 years across all NAEP grades and subjects.” This trend of slow progress, with far too few students meeting standards, is illustrated in much of our own research on schools in Denver, Aurora, and statewide.
It is more important than ever to learn from the places that are seeing improvement and to seriously pursue evidence-driven policies to drive change. Grounding conversations about improvement in research is critical, as people from all over the country approach this topic from very different perspectives.