One of the most interesting things I’ve gotten to do since I started blogging about data/stats/science is to go to high school classrooms and share some of what I’ve learned. I started with my brother’s Environmental Science class a few years ago, and that has since expanded to other classes at his school and a few elsewhere. I often get more out of these talks than the kids do…something about the questions and immediate feedback really pushes me to think about how I present things.
Given that, I was intrigued by a call I got from my brother yesterday. We were talking a bit about science and skepticism, and he mentioned that as the year wound down he was having to walk back some of what I had presented to his class at the beginning of the year. The problem, he said, was not that the kids had failed to grasp the message of skepticism…but rather that they had grasped it too well. He had spent the year trying to get his kids to think critically, and was now hearing them essentially claim it was impossible to know anything because everything could be manipulated.
Oops.
I was thinking about this after we hung up, and about how important it is not to leave the impression that there’s only one way to be wrong. In most situations that require a judgment call, there are actually two ways to be wrong. Stats and medicine have a really interesting tool for showing this phenomenon: a 2×2 contingency matrix. Basically, you take two different conditions and sort how often they agree or disagree, and under what circumstances (there’s a short code sketch after the list below).
For example, for my brother’s class, this is the contingency matrix:

|                      | Idea is true                   | Idea is false                  |
|----------------------|--------------------------------|--------------------------------|
| **Believe it**       | True positive                  | False positive (Type I error)  |
| **Don’t believe it** | False negative (Type II error) | True negative                  |
In terms of outcomes, we have four options:
1. True Positive: Believing a true idea (brilliant early adopter).
2. False Negative (Type II error): Not believing a true idea (in denial/impeding progress).
3. False Positive (Type I error): Believing a false idea (gullible rube).
4. True Negative: Not believing a false idea (appropriately skeptical).
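To make the sorting concrete, here’s a minimal Python sketch of the tallying step. The function and variable names are mine and the batch of ideas is made up for illustration; it’s just one way to bucket belief/reality pairs into those four cells:

```python
from collections import Counter

# Label each (did we believe it?, was it actually true?) pair
# with the cell of the 2x2 matrix it falls into.
CELLS = {
    (True, True):   "true positive",
    (False, True):  "false negative (Type II)",
    (True, False):  "false positive (Type I)",
    (False, False): "true negative",
}

def contingency(beliefs, truths):
    """Tally belief/reality pairs into the four cells of the matrix."""
    return Counter(CELLS[(b, t)] for b, t in zip(beliefs, truths))

# A made-up batch of six ideas: what we believed vs. what was true.
beliefs = [True, True, False, False, True, False]
truths  = [True, False, True, False, False, False]

for cell, count in contingency(beliefs, truths).items():
    print(f"{cell}: {count}")
```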
Of those four options, #2 and #3 are the two we want to avoid. In those cases reality (true or not) clashes with the test (in this case, our assessment of the truth). In my talk and my brother’s later lessons, we focused on eliminating #3. One way of doing that is to be more discerning about what we do and don’t believe, but many people can come away with the impression that disbelieving everything is the way to go. While that will absolutely reduce the number of false positive beliefs, it will also increase the number of false negatives. Depending on the field that may not be a bad thing, but overall it’s just substituting one lack of thought for another (the toy simulation below makes the tradeoff concrete). What’s trickier is to stay open to evidence while also being skeptical.
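Here’s that toy simulation, with all the numbers invented for illustration. If you model skepticism as a threshold on how much evidence a claim needs before you believe it, cranking the threshold up trades one kind of error for the other rather than eliminating error:

```python
import random

random.seed(42)

# Toy world: each claim is true with probability 0.3, and carries a
# noisy "evidence score" that is higher, on average, for true claims.
claims = []
for _ in range(1000):
    true = random.random() < 0.3
    evidence = random.gauss(1.0 if true else 0.0, 1.0)
    claims.append((true, evidence))

def errors(threshold):
    """Believe a claim iff its evidence clears the threshold; count both errors."""
    fp = sum(1 for true, ev in claims if ev >= threshold and not true)  # gullible
    fn = sum(1 for true, ev in claims if ev < threshold and true)       # in denial
    return fp, fn

# A low threshold believes nearly everything (false positives pile up);
# a high threshold believes nearly nothing (false negatives pile up).
for threshold in (-2.0, 0.5, 3.0):
    fp, fn = errors(threshold)
    print(f"threshold={threshold:+.1f}: false positives={fp:3d}, false negatives={fn:3d}")
```

The middle threshold is the discerning position: it accepts some of both errors instead of zeroing out one by maximizing the other.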
It’s probably worth mentioning that not everyone gets into these categories honestly…some people believe a true thing pretty much by accident, or fail to believe a false thing for bad reasons. Every field has an example of someone who accidentally ended up on the right side of history. There also aren’t always just two possibilities; many scientific theories come in shades of gray.
Caveats aside, it’s important to at least raise the possibility that not all errors are the same. Most of us have a bias towards one error or another, and will exhort others to avoid one at the expense of the other. However, for both our own sense of humility and the full education of others, it’s probably worth keeping an eye on the other way of being wrong.
[Image: the dwarves in C.S. Lewis’s *The Last Battle*]