Compare Ohio school district testing changes across the state (Search your district here)

Shifting to the Common Core and having three different state tests in as many years makes it hard to compare how your school district is doing from year to year. The Plain Dealer lets you see how your district did, relative to all other districts, for each of those years.

(dhendrix73, Creative Commons via Attribution-No Derivs 2.0 Generic)

CLEVELAND, Ohio - Did your district really deserve lower grades on state report cards?

Did it really do any better? Or worse, through all these testing changes?

Here's one way to cut through the confusion of Ohio's state report cards and state tests with a comparison of your district's scores to all the scores across the state.

Click on the circle for your county at the bottom of the chart below (and unclick Cuyahoga County, if you live elsewhere) to see a Plain Dealer calculation of how your school has performed on state tests in each of the last five years, compared with the average scores of other schools across Ohio.

That gives you three years of scores under Ohio's old state tests, then a look at how they changed with PARCC in 2014-15 and AIR in 2015-16.

Bars to the right of the center line show how far above the state average your school scored. Bars to the left show how far below.

Don't look for grades or for how many points more or less your school scored. Scoring has changed so much from year to year that a direct comparison is hard. These charts instead give a statistical look at how much scores deviated from the average test results statewide in each of those years.

The Ohio Department of Education was given an early look at the methodology before the latest report cards were released, and at the final results this week. It raised no objections, other than noting that it deliberately chose not to compare years because the tests differ so much.

Click here for why changes in state tests and grades prompted this comparison.

See below the charts for a full discussion of the methodology behind this comparison.

Methodology

Ohio's report cards on school performance are supposed to help parents and taxpayers understand how well schools educate children. But the tests, and the test grading, have changed several times in the last few years. That complicates tracking progress over time.

This year and last, scores fell for many schools in Ohio. They fell so sharply that the cause had to be a change in the tests or the scoring, rather than a steep statewide decline in the quality of education. And, in fact, state officials say they're trying to raise the performance bar.

To permit year-over-year comparisons, The Plain Dealer needed a consistent yardstick. So we used a system familiar to many former students: We graded on the curve.

We sorted schools into peer groups: Elementary schools, middle schools, and high schools, for instance. We calculated the average (mean) score for all schools in each group for each year. Then we compared each school's scores to that average, and to how far from the average other scores typically fell.

For school districts, we did something similar, but using the state's reported Performance Index Score for each district.

Those calculations produced a relative performance measure, known as a z-score, that showed which schools had improved versus their peers and which had slipped in test performance.
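The calculation described above can be sketched in a few lines of Python. The scores below are hypothetical index values invented for illustration, not real district data:

```python
# A minimal sketch of the z-score calculation described above.
# The peer-group scores here are hypothetical, not real district data.
from statistics import mean, stdev

def z_scores(scores):
    """Return each score's distance from the group average,
    measured in standard deviations (a z-score)."""
    avg = mean(scores)
    spread = stdev(scores)  # how far scores typically fall from the average
    return [(s - avg) / spread for s in scores]

# Hypothetical peer group of index scores for one year
peer_group = [88.0, 92.5, 95.0, 99.5, 105.0]
print([round(z, 2) for z in z_scores(peer_group)])
```

Because each year's scores are measured against that year's own average and spread, the yardstick stays consistent even when the test, and the overall level of scores, changes from year to year.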

It's a little bit like a footrace on an ocean liner. The whole track is moving, but what matters is how the runners compare to each other.

If scores follow a roughly bell-shaped (normal) curve, z-scores can be translated into percentiles:

  • -3 equals roughly the lowest 0.1 percent of performers.
  • -2 equals roughly the lowest 2 percent.
  • -1 equals roughly the lowest sixth (about 16 percent).
  • 0 is right at the middle of the group.
  • +1 means better than about 84 percent of the group.
  • +2 means better than about 98 percent.
  • +3 equals roughly the top 0.1 percent.
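That translation rests on an assumption that scores are roughly normally distributed, which the article's data does not guarantee. Under that assumption, the standard normal distribution function maps a z-score to the share of the group scoring below it; a short sketch using only Python's standard library:

```python
# Translating a z-score into "better than X percent of the group,"
# assuming scores are roughly normally distributed.
from math import erf, sqrt

def percentile(z):
    """Standard normal CDF: the share of the group scoring below z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

for z in (-2, -1, 0, 1, 2):
    # prints roughly: -2 -> 2.3, -1 -> 15.9, 0 -> 50.0, 1 -> 84.1, 2 -> 97.7
    print(z, round(percentile(z) * 100, 1))
```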

The comparisons are not perfect. The state compiles its index scores from the performance of each individual student in a school or district. For the sake of time and simplicity, The Plain Dealer used summary data supplied by the state, which goes down only to the level of individual school buildings. Some of our results may differ from the state's figures, because building-level averages don't account for differences in enrollment - some schools are bigger than others. Still, our results should be in the ballpark and should be consistent over time, allowing for reasonable year-over-year comparisons.

Also, though the state treated charter schools as a separate category in its recent reports, we did not make that distinction. The Plain Dealer decided to count them in calculations just like any other public school.

In many cases both district and charter schools did not fit the elementary, middle and high school structure, so there was no perfect comparison category. Cleveland's district, for example, generally is organized into K-8 elementary schools and high schools. Many charters have a K-12 program. We tried to count each school within the group of schools it most resembled. In some cases, we compared charters to both elementary schools and high schools.

Finally, there are questions about how fair it is to apply measures like this to evaluating schools. If a district's students are growing significantly poorer and more disadvantaged, its scores may fall despite the best efforts of teachers and administrators. Those questions are beyond the reach of this story. But they are one reason we are offering readers many different ways to view the data.
