

The Mis/Non-measure of Schools, Time to Re-think the SPF?

A quality school is the most important factor, after family, in determining a person's economic and social development. The research is clear: schools can have a huge impact not only on academics but also on a person's character and social skills. We see this in the impact that schools like KIPP have had in successfully getting students through college, or in the value added by programs like Advanced Placement and International Baccalaureate. We can also look at what nations like South Korea, China, India, and Finland have done over recent decades to see the impact of investing in improving schooling.

Schools matter enormously, yet most of us remain blind to whether schools are delivering results. Here's how the latest push to provide measures of school quality falls short.

The Single Measure

I have long advocated for schools, districts, and the state to define and communicate school quality to families and communities. We are fortunate that Denver Public Schools has led the state and nation for over a decade in data transparency, providing families information about the quality of schools through the School Performance Framework (SPF).

While I think it's time for other districts and the Colorado Department of Education to learn from Denver's work, we need a series of discussions about how to provide more school performance data to families and how, or whether, that information should be rolled up into a single measure of "school quality."

We Are Missing the Data Behind the Rating

First, in the case of summary measures, the whole is only as good as the sum of its parts. The current SPFs (in Denver and elsewhere), particularly at the elementary and middle school levels, are primarily a combination of metrics derived from the state summative assessment, which is now PARCC. The metrics included at the secondary level are slightly broader. This underlying data is foundational, and yet it is becoming increasingly difficult to access.

For example, the most recent release of the Denver SPF included no publicly available file with the data that fed into the SPF. Communities had no way to know why they were “green” on status (absolute academic achievement measures), and “yellow” on the equity indicator.  Which groups of kids didn’t reach the same expectations as their peers? And in which subjects? DPS does require schools to have conversations with families about the SPF results, but why are there not other avenues for families to access the underlying information about how students in the school are performing?

And this is in a district that is a leader in providing information about schools to the broader community. Denver has done a remarkable job compared to most school districts by providing a simple color guide, from Blue (excellent), Green (good), Yellow (progressing), and Orange (poor) to Red (failing), that incorporates dozens of student performance indicators. The district should be applauded for putting the ratings in the school enrollment guides and using them to drive decision-making.

At the other extreme, most Colorado districts share next to nothing with families. Few districts other than Denver have made any substantive effort to communicate or translate this information for families or the general public. This seems to be particularly true for some of the districts with the lowest performing schools in the state. Aurora, Adams 14, Westminster 50, JeffCo, DougCo, and others do little beyond sharing the state ratings with families. It is rare to find school performance data on district or school websites. Often the best you can find is a link to the CDE website or SchoolView, which requires a bit of skill to navigate.

Further, the Colorado Department of Education seems to be actively working to obscure important school performance information, as data continues to disappear. For example, the 2016 PARCC result release masked hundreds of additional data points due to new suppression rules, gave little information about the variance of student performance within schools or districts, and gave no information about the achievement levels of different student groups.

An Overemphasis on Student Growth

Second, I am concerned that in developing a summary measure, we are overemphasizing student growth. DPS, for example, weights growth over status at a ratio of 3 to 1. This is considerably higher than the state's weighting of growth over status (1.5 to 1 at elementary and middle school, and 1.3 to 1 in high school).

A+ has raised concerns a number of times that Denver's SPF model weights growth too heavily. We do deeply value growth, the measure of how students are progressing from year to year regardless of whether they meet grade-level standards. But growth is only helpful if kids end up where they need to be, which is why we have advocated for a better balance between growth and proficiency on the SPF. Additionally, growth is a fairly volatile metric: it can change significantly from year to year. Because such a variable measurement is a major component of the overall rating, the SPF is apt to move around a lot as well.
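To make the weighting concern concrete, here is a toy calculation. The numbers and the simple weighted-average formula are hypothetical illustrations, not the actual DPS or state SPF methodology, but they show how a school with very low proficiency and decent growth comes out noticeably higher under a 3-to-1 weighting than under 1.5-to-1:

```python
# Illustrative only: a toy composite score showing how the growth-to-status
# weighting changes the outcome. The numbers, weights, and formula below are
# hypothetical; they are NOT the actual DPS or CDE SPF calculations.

def composite(status_pct, growth_pct, growth_to_status_ratio):
    """Weighted average of a status score and a growth score (both on a 0-100 scale)."""
    growth_weight = growth_to_status_ratio / (1 + growth_to_status_ratio)
    status_weight = 1 - growth_weight
    return status_weight * status_pct + growth_weight * growth_pct

# A hypothetical school: very low proficiency, above-average growth.
status = 20   # e.g., roughly 20% of students meeting grade-level expectations
growth = 60   # e.g., a median growth percentile near 60

for label, ratio in [("3:1 (DPS-style)", 3.0), ("1.5:1 (state-style)", 1.5)]:
    print(f"{label}: composite score = {composite(status, growth, ratio):.1f}")

# 3:1 (DPS-style): composite score = 50.0
# 1.5:1 (state-style): composite score = 44.0
```

Nothing about the hypothetical school's proficiency changes between those two lines; only the weighting does, and that alone moves the score by six points.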

Setting the Right Expectations

Third, I am convinced that the bar must be raised for school quality ratings.

DPS held a press conference to unveil the latest school ratings at Trevista in Northwest Denver. Trevista has been struggling for years: pass rates on state tests in 2013, 2014, and 2015 put the school in the bottom five percent of schools in Colorado. The most recent state test scores from PARCC in spring 2016 indicate improvement in pass rates in some grades, but results from other grades were suppressed due to the low numbers of students meeting grade-level standards. This signals improvement, but Trevista was a "Red" school in 2013 and 2014; because its most recent median growth scores were near the 60th percentile (good, but not close to what these students need), the school is now "Green." Doesn't it make sense that Trevista should be "Yellow" or "Orange" for improvement, rather than "Green," which implies the school has arrived?

More than one school rated "Green" on the new Denver SPF falls within the bottom 10 percent of schools in the state in terms of student achievement. A few of these schools may have made some improvement in helping more students reach grade-level expectations. But when you peel back the data (a challenge, because much of it is masked), you'll see that at many of these schools fewer than 15 or 20 percent of students can read or do math at the level needed to be on track for college or career.

There are far too many low "Green" schools that the district is claiming are good, yet are not. I suspect most folks would not call a school good if a student there has less than a 1 in 6 chance of being prepared for success in life. Trevista deserves to be "Yellow" or "Orange." Improvement is critical, but few parents I know would send their own kids to a school like Trevista because of its poor performance against important academic standards.

Similarly, 70% of Colorado's schools have been assigned a "Performance Plan" by CDE, i.e., labeled as good. This includes a number of truly good schools, but it also includes far too many schools where the vast majority of students are not reading, writing, or doing math at grade level, and where there is little evidence they ever will.

Additionally, while for nearly twenty years Colorado has had a school accountability system focused on supporting students to achieve a predetermined standard (a criterion-based system), CDE just decided that all of the state school ratings will be based on average scale scores. In other words, the state will no longer base a school's rating on how many students reach the standard; it will base it on how the average student scores. Last I checked, Colorado's average was still far below where most kids need to be in order to be college or career ready.
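To see why that shift matters, consider a toy comparison using made-up scale scores and a hypothetical cut score (not real PARCC data). A school with a handful of very high scorers can post a higher average scale score than a school where far more students actually clear the bar:

```python
# Illustrative only: made-up scale scores for two small hypothetical schools
# and a made-up "meets expectations" cut score of 750. Not real PARCC data.

CUT_SCORE = 750

school_a = [700, 705, 710, 745, 748, 752, 755, 900, 910, 925]  # a few very high scorers
school_b = [748, 749, 751, 752, 753, 754, 755, 756, 757, 758]  # clustered near the cut

def average(scores):
    return sum(scores) / len(scores)

def pct_meeting(scores, cut=CUT_SCORE):
    # Share of students at or above the cut score, as a percentage.
    return 100 * sum(s >= cut for s in scores) / len(scores)

for name, scores in [("School A", school_a), ("School B", school_b)]:
    print(f"{name}: average scale score = {average(scores):.0f}, "
          f"meeting the standard = {pct_meeting(scores):.0f}%")

# School A: average scale score = 785, meeting the standard = 50%
# School B: average scale score = 753, meeting the standard = 80%
```

Under an average-scale-score rating, School A looks better; under a criterion-based rating, School B does, because far more of its students actually reach the standard.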

Conflating Quality and Performance

Fourth, I'm just not convinced these amalgamated measures capture school "quality." Much like an IQ or SAT score should not be the sole measure of a person, the School Performance Framework (SPF) rating should not be the only measure of a school. Far too many important indicators get lost when you average them into a single measure. Is it possible to have a school that does very well in writing but not in math, or a high school with average ACT scores that gets large numbers of kids into great colleges? I worry that important aspects of a school, such as whether all kids are growing, by how much, and whether kids are reading at grade level, get lost in the quest to create the perfect overall measure.

Further, these ratings were originally designed as school performance measures to help guide the state and districts in managing schools, not as school quality indicators for families and the public.

The SPF helps the district understand which schools are passing a litmus test for the academic outcomes we expect for all of our students. Specifically, are students reading, writing, and doing math at grade level? What the SPF does not answer, and does not intend to answer, are the bigger questions about school quality that we all think about when we choose a school for our kids. The SPF cannot tell us whether kids feel supported to explore new things, whether they have access to rich arts programming, whether there are opportunities for play and physical activity, whether their unique learning needs are supported, whether families feel welcomed into the building, or whether there is a culture that fits the students and families.

The SPF, in both Denver and at the state level, is really about identifying which 5-10% of schools are farthest from meeting students' basic academic needs. This information does, and should, directly affect how the state and districts manage schools. And while it also provides critical information to families about schools, there is additional information about school "quality" that is not captured.

And so we face a conundrum with the SPF: it rolls too many individual metrics into a single rating, and yet we use it as a proxy for indicators that are simply not included.

The Bigger Picture

The education community has taken significant steps in the past decade to measure the quality of our schools. And now we have entered an age of increasing focus on education data and on assigning that data to schools as an indication of quality. It reminds me a bit of science in the late 19th century.

One of my favorite thinkers, Stephen Jay Gould, often wrote about the power and misuse of science in understanding our world. His book The Mismeasure of Man is a history and critique of the idea that a person's "worth can be assigned to individuals and groups by measuring intelligence as a single quantity."

The book's cover begins with a quote from The Voyage of the Beagle by Charles Darwin: "If the misery of our poor be caused not by the laws of nature but by our institutions, great is our sin." Gould shows that the statistics, measurements, and arguments used to construct an absolute, singular measure of intelligence, along with biological determinism, are not only wrong but also a great danger to society.

With the continued push from the state and the feds to have single ratings of school quality, are we headed down the wrong path?

A Better Way Forward?

The FDA appropriately requires food and drugs to be accurately labeled with ingredients and side effects, yet there are few requirements, and no enforcement, that schools or districts share academic performance with communities or families. We have far more public information about consumer products than we do about schools. I find it both mind-boggling and maddening.

We know that to improve schools we must empower people with actionable information about what is working and what is not. Parents need to know what makes a good school and whether their school is working. The public and policymakers must know which schools are working so that we can begin the work of improving public education. It should not be left to A+ or Colorado School Grades to help parents understand school quality; this should be a primary job for the state and districts.

Districts like Denver have made significant progress; now it is time for Denver to continue to improve while the rest of the state catches up. Families in Aurora, JeffCo, and across the state deserve access to accurate information about the quality of their schools. It is time for CDE and the districts to make this happen.