The Weight of Evidence

I’ve been thinking a lot about the law and evidence this week, for three reasons:

First, this article my lawyer father sent me about the Supreme Court’s aversion to math. It reviews a gerrymandering case I’ve mentioned before, and the attempts of statisticians and computer scientists to convince the court that their mathematical models are worth using. While the case hasn’t been decided yet, some researchers were fairly annoyed at how reflexively some of the justices dismissed the models presented, invoking the “gobbledygook” doctrine.

Second was this article I stumbled on about an effort to fact-check Supreme Court decisions, which found that an alarming number of them contain factual errors. This one was concerning for two reasons: some of the errors actually appeared to be related to the ultimate decision, and some appear to have come from the Justices doing their own research.

Finally, this article about yet another evidence scandal in my state. Apparently our state lab has been systematically withholding evidence of failed breathalyzer calibrations, calling into question hundreds of DUI convictions. This is not an aberration…for those of you not from around here, Massachusetts has been on a bad run with our state crime/forensics lab. This is our third major scandal in the past few years, and we now have the dubious distinction of being cited in every report about the problems with forensics.

This got me thinking about a few things:

  1. The line between gobbledygook and “good idea, needs work” is often familiarity. Reading the Supreme Court’s skepticism of mathematical models and contrasting it with the general acceptance of forensics despite serious concerns, it’s hard not to think familiarity plays a role. Forensics is a science that was quite literally built to support the criminal justice system, whereas computer modeling was built to support…well, all sorts of things. I suspect that’s why one gets more scrutiny than the other.
  2. Mathematical models have to simplify, and those who build them have to prioritize explaining them to people who are not on their side. The new wave of mathematical models is intriguing, exciting, and a little bit frightening all at once. Complexity is necessary at times, but it can also be used to hide assumptions and get your way. The justices on the Supreme Court know this, and their first suspicion is going to be that all that math is just there to hide something. Anyone hoping to build a model that affects policy should keep in mind that for everyone they impress, they will make someone else suspicious. As with any argument, trying it out on someone not inclined to agree with you will teach you a lot about where the holes might be.
  3. Lawyers need to learn more about statistics. This one has been the subject of many long talks with my Dad. Unless it was required for their undergrad degree, many lawyers can get through their entire higher-ed career without touching a stats class. That seems like a real gap, especially now that so much of the evidence they’re seeing requires some knowledge of probability. I’ve mentioned before that doctors struggle with false positives, false negatives, and base rates, and it seems clear many people in law enforcement do as well.
  4. The Supreme Court needs a fact-checker. Seriously. Are you really telling me there’s not one clerk out there who would be willing to read through the decisions and find a citation for each stat? Or better yet, someone who’d read through each briefing filed with the court and error-check it before it got to the Justices? In the case the article cited, the stat in question wasn’t even a controversial one (the percentage of workplaces that drug test employees), but the answer provided (88%) apparently had no source at all. Of all groups, the Supreme Court should have figured out how to get this stuff screened out before it biases them.
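The base-rate confusion mentioned in point 3 is easy to make concrete with a quick Bayes calculation. Here’s a minimal Python sketch; all the numbers are hypothetical (the accuracy figures are illustrative, not real breathalyzer specs):

```python
# Base-rate illustration (all numbers hypothetical): even a fairly
# accurate test produces mostly false positives when the condition is rare.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(actually positive | test says positive), via Bayes' rule."""
    true_pos = prevalence * sensitivity            # truly over the limit, flagged
    false_pos = (1 - prevalence) * (1 - specificity)  # under the limit, flagged anyway
    return true_pos / (true_pos + false_pos)

# Suppose (hypothetically) 1% of drivers tested are over the limit,
# and the device is 95% sensitive and 95% specific.
ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.95, specificity=0.95)
print(f"{ppv:.1%}")  # -> 16.1%: most flagged drivers would be false positives
```

The counterintuitive part is that the same 95%-accurate device would be right most of the time if the base rate were high, which is exactly the distinction that trips people up.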

I am thinking there’s a presentation in here somewhere. If you have any more good articles, send them my way!

The SCOTUS and perception of statistics

Finally got internet in the new house.  Can’t complain too much….the guy finished running the wire to our house even though a thunderstorm started.  Clearly that man was getting paid by the job, not the hour.

Anyway, I had an interesting chat with my father (a lawyer) after our closing on Thursday about the Supreme Court ruling on health care. He mentioned that a coworker was griping that the Supreme Court meant nothing anymore because they only voted on party lines. My father, being the good data-accuracy man that he is, quickly dissented.

He looked it up and asserted that nearly half of the decisions last year were unanimous. For this year, 7-2 votes were the least common (8%), then 8-1 (11%), 6-3 (17%), and then 5-4 (20%). So overall, they agree nearly as much as they disagree, and they are completely divided on only about 1 in 5 cases. Kennedy and Roberts voted with the majority over 90% of the time; Ginsburg was the least likely to vote with the majority. There are lots of interesting stats to be run on this, and another good breakdown of some of the data is here.

It seems the perception that every vote is political is heavily skewed by the very few court cases most of us hear about each year. I would wager even highly political citizens probably couldn’t rattle off more than a handful. When you break down the 5-4 decisions exclusively, about two-thirds of them fall along ideological lines…which totals to about 10 cases for 2011.
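The back-of-the-envelope math behind that “about 10 cases” figure is worth spelling out. A quick sketch, where the docket size of 75 decided cases is my assumption, chosen only so the arithmetic matches the figures above:

```python
# Rough back-of-envelope from the figures above: if 20% of decisions
# were 5-4, and about two-thirds of those split on ideological lines,
# only a small fraction of the docket is a party-line vote.

total_cases = 75           # assumed docket size, for illustration only
share_5_4 = 0.20           # share of 5-4 decisions, from the breakdown above
ideological_share = 2 / 3  # of the 5-4 decisions

party_line = total_cases * share_5_4 * ideological_share
print(round(party_line))   # -> 10 cases, i.e. roughly 13% of the docket
```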

This kind of skewing of perception is common when a few high profile events dramatically overshadow regular operations.  Thanks Dad, for pointing that out.