John Tierney has an interesting piece up at the Atlantic about how national high school rankings are not only meaningless, but actually harmful.
He doesn’t quibble much with local rankings, and agrees that if done correctly they can provide good information for residents. As for national level rankings though, he says this:
let’s call national rankings of high schools what they are: nonsense. There is no way to say, with any degree of accuracy at all, where any given high school ranks in relation to others in terms of how good it is or how challenging it is.
Now this seems pretty sensible to me. Ranking all the schools in a given state against each other can be meaningful, though more so within ranges than with strict numbers (is there really a meaningful difference between #34 in the state and #35?). But to pluck a few from around the country? That’s not even useful. When my husband and I went to buy a house, we knew the general area we were looking at, and school rankings were one of many factors we weighed when picking a town to buy in. I’m pretty sure most people do something similar. This works, of course, because we already had the region picked out and knew the trade-offs that came with the individual towns. You’re not going to use national rankings like this.
Additionally, he notes that at least one of the national lists (the “Challenge Index” from the Washington Post) literally uses only one metric to determine how challenging a high school is: the number of AP (or similar) tests taken by the seniors at the high school, divided by the number of seniors:
Note that the numerator is not even the number of such exams passed, but merely the number taken. So, a given school can rise on the list by increasing the number of its students who take “advanced” classes. Conversely, schools that are more discerning and thoughtful about which students ought to be taking AP classes end up suffering in the rankings. So, the list produces nonsensical anomalies such as high schools with very low graduation rates ranking much higher on the “Challenge Index” than excellent schools that don’t game the ranking system…
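Since the whole index boils down to a single division, it’s easy to sketch out why it’s so gameable. Here’s a minimal illustration (all school names and numbers are hypothetical, just picked to make the point):

```python
# Sketch of the "Challenge Index" as Tierney describes it:
# AP (or similar) tests TAKEN, divided by the number of seniors.
# Note the numerator is tests taken, not tests passed.

def challenge_index(tests_taken: int, seniors: int) -> float:
    """Tests taken per graduating senior (pass/fail is irrelevant)."""
    return tests_taken / seniors

# A school that pushes every senior into two AP exams, regardless of
# whether anyone passes, scores:
pushy_school = challenge_index(tests_taken=400, seniors=200)      # 2.0

# A school that enrolls only the students who are ready, most of whom
# actually pass, scores lower:
selective_school = challenge_index(tests_taken=150, seniors=200)  # 0.75

print(pushy_school > selective_school)  # the pushy school "wins"
```

Nothing in the formula rewards passing, graduating, or learning anything; the only lever is raw test volume per senior, which is exactly the anomaly Tierney points out.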
This idea of ranking interested me. Ultimately, we actually picked the school district we did in part because of the options it holds for the not-so-academically inclined. Don’t get me wrong, it’s in the top 20% of high schools in the state, but not by much. More importantly, the regional technical high school is here, and there are opportunities to learn how to make a good living even if college isn’t your thing. I live in a state with a great educational system, and my town is no exception. I’m less worried about AP tests, and more worried about school districts that might push kids into inappropriate classes to keep their numbers up, to the detriment of the child. While a certain baseline level of knowledge should be mandatory, I want my son to be challenged, but not tortured. I’m suspicious of schools that try too hard on these lists, because school rankings and the best interests of the child don’t always coincide.
Looking further down the line, it’s interesting to note that even more advanced methodologies almost always use the percentage of kids headed to college as a judge of the high school’s rigor. As college costs continue to spiral and college becomes a worse and worse investment, I’m curious whether we’re going to see a bigger and bigger divide between rich neighborhoods and poor neighborhoods in terms of rankings. This could drive people out of the poorer neighborhoods, not because the schools were actually worse, but because the metric used to assess them is so contingent on parents having the cash to send their kids to college. Things to ponder.