Not Knocking It Out of the PARCC

It’s been about nine months since kids started taking PARCC tests and one month since the state released some of the data on schools, and yet many families still do not have results for their kids. The data we do have is not disaggregated by student groups (a key tool for monitoring that all students are being educated equally), nor do we have enough data from the state to make definitive school-by-school comparisons.

What is up with that? While I know it is not true, it makes you wonder if this PARCC data rollout has been designed to undermine any effort to evaluate and improve public education by making it impossible for families, educators, policy makers, and the community to know how kids, schools, and districts are doing. Wasn’t this supposed to give all of us a better, not worse, understanding of how kids and our schools are doing?

We have been poring over the data to shed light on just that question. We look forward to sharing our analysis in several upcoming reports. The state was only able to review our methodology last week because they are so understaffed for this work. Our hats go off to CDE’s Accountability Team for being able to dig through millions and millions of cells of PARCC data and make sense of them.

As a preview of our PARCC analysis, we can say there are school districts and schools throughout the Front Range making progress: in elementary reading we’re seeing progress in Denver, Sheridan, Adams 12, and Mapleton; in elementary math Denver, St. Vrain, Adams 12, and Jeffco are showing improvement. But again, with somewhat incomplete and, at times, totally unusable PARCC data released by the state, it is difficult to decipher what the results we can access actually mean.

As bad as things are in Colorado, it may be worse in other states, where there has been little or no transparency regarding last year’s PARCC assessments. We recently discovered that New Jersey’s district and school scores were only released at the beginning of this month. Before that release, even school leaders couldn’t access scores for other schools in the state.

One more reason for concern is the recent revelation that students who take the computer-based test do worse than students who take the paper-and-pencil version by significant margins (4-13 percentage points). This is not a problem unique to PARCC, which makes you wonder why it is only coming up now and why it was not discussed when the PARCC test was selected.

We are by no means suggesting that we abandon PARCC, a higher-quality and more rigorous assessment than TCAP, but we do need to be more thoughtful about how it is administered and how results are reported. And if these issues are not addressed ASAP, we may need to reassess. A+ has been a strong supporter of high-quality assessments tied to the Colorado Standards since our start. PARCC can work, but there must be dramatic improvement on these fronts. The CO Department of Education, Pearson Education, and PARCC have had years to prepare for this. Let’s hope the results from this year’s tests (happening in a few months) are released before kids leave for the summer, so that all of us have some sense of how our schools are doing.