Presentation: How they Reel You In (Part 1)

Note: This is part 1 in a series for high school students about reading and interpreting science on the internet. Read the intro and get the index here.

When I first sat down to write a talk for high school science students about how to read science on the internet, one of my priorities was to immediately establish for the kids why this was different from anything else they were learning. The school I was going to had won awards for its science teaching, and these students were no slackers. I was concerned that, given their (probably justifiable) confidence in their skills, many of these kids would assume they would be great at interpreting random articles they stumbled across online. This made me think of a study on a completely different topic: sexual harassment. In that study, college-aged women were asked what they would do if an interviewer sexually harassed them. All said they would confront him, walk out, or report it. However, when the experimenters actually put women in fake job interview scenarios and had an interviewer harass them, none of them did any of those things. The general conclusion is that we have the right answers when we know what we're being tested on, but in real life we often don't know what is even being asked of us.

I didn't end up using the harassment study, but I did feel comfortable putting that framing on my introduction. When you read science in a classroom, the teacher is going to be clear about what you're supposed to get out of the lesson. An astute student can be relatively confident about what they'll be tested on. I've had plenty of tests where I changed or re-evaluated my answer because I suspected the teacher was doing something a bit different than the initial reading would suggest. On the internet, however, no one gives you even the briefest heads-up as to what material is going to be covered. When you encounter an interesting science story or number, you are almost always going to be thinking about something entirely unrelated. Because of this, the first impression of a story is important….it may be all you get. And since information can be read and absorbed in a few seconds, an instantaneous skepticism is key. The first part of this is simple, but critical: make sure whatever you're seeing is actually true. Like, at all. That's why Part 1 here is called

False Information, Deceptive Memes and Other Fake Stuff

Okay, so what’s the problem here?

The problem is, some stuff on the internet is fake. I know, total shocker. But seriously, it's actually pretty stunning how often people glance at entirely made-up news stories and end up believing they're real. There's a whole website called Literally Unbelievable that catalogs people's reactions to fake news stories from well-known satire sites. Like this one:

But the issues don't end with satire; sometimes people make things up just for the heck of it, like the guy who spent a couple of weeks posting fake facts with pictures behind them to see who would call him out on it. This was my favorite:

Sometimes people make things up to push a political agenda, as Abraham Lincoln warned us:

Okay, so what kind of things should we be looking out for?

Well, anything with a picture on it designed to be catchy should be immediately suspect, and that goes double if it's political or has an agenda. Anything that falls into an area you feel pretty confident about should also be scrutinized. It turns out that people who feel confident in their expertise on a topic can be more likely to believe they know the definitions of made-up terms.

Why do we fall for this stuff?

Well, a couple of reasons. Like I said in the intro, sometimes we just flat out don't have our skepticism engaged. If you're scrolling through Facebook thinking about your ex, or your friends, or the awkward political commentary your cousin is making, you might be less likely to even consciously register a meme about lightning and cell phones. You may find yourself believing it later because you never really thought it through to begin with.

Conversely, confirmation bias is a powerful force, and it frequently leads us to apply less scrutiny to things we’d like to believe.  That’s why political falsehoods are so easy to pass along….people believe that they have some “truthiness” to them (as Stephen Colbert would say) or that they were “fake but accurate” (as the New York Times would say).

Compounding both of these problems is our own perception of how smart we are. Earlier I linked to a study showing that people who think they know a lot about a topic can be even more susceptible to accepting fake terms. And lest we think this only applies to people who merely think they're smart, I would point you to the Neil deGrasse Tyson/George W. Bush quote controversy. Neil deGrasse Tyson is possibly the most famous scientist in the US today, and he was caught quoting George W. Bush inaccurately. It took some rather dogged determination by an opposing journalist to get him to admit that he got the quote and its context wrong. Now if Neil deGrasse Tyson can get tripped up by wrong information, who are we to claim we'd do better?

So what can we do about it?


It won't help you every time, but a good first step is simply to Google the information. If you can't verify it, don't post it. Some items are disputed (we'll get to that later) and their interpretation may be questioned, but completely fake stories usually leave a pretty clear trail on Google to let you know that. For satirical websites, even taking a look at the other stories they post can tip you off. Several fact-checking sites maintain lists of known satire and hoax outlets to get you started. Be aware that some hoax sites are really trying to trick you….for example, by mimicking the name and layout of a real news site. For general viral stories, the fact-checking sites can point you in the right direction. Again, this won't help much if the story is disputed, but it should catch the completely made-up stuff. In future posts, we'll get into the nuances, but for now, remember that sometimes there is no nuance. Sometimes things are just fake.


Want more? Click here for Part 2.

Anti-conservative bias and social psychology

My most popular blog post of all time was the one I did on conservative trust in the scientific community vs retraction rates.   I called it “Paranoia is just good sense if people really are out to get you” because I had a suspicion (confirmed when I ran the data) that conservatives might actually be behaving rationally when they said they trusted science less, given the ever increasing retraction rates in prominent journals.

Now, a new study shows that this distrust of the scientific community is even more well founded than I originally thought.

In a survey conducted by two self-proclaimed liberals, strong evidence emerged that conservatives are being systematically discriminated against in the field of social psychology. What unnerved the authors even more is that this was not a case of people hiding their bias:

To some on the right, such findings are hardly surprising. But to the authors, who expected to find lopsided political leanings, but not bias, the results were not what they expected.
“The questions were pretty blatant. We didn’t expect people would give those answers,” said Yoel Inbar, a co-author, who is a visiting assistant professor at the Wharton School of the University of Pennsylvania, and an assistant professor of social psychology at Tilburg University, in the Netherlands.
He said that the findings should concern academics. Of the bias he and a co-author found, he said, “I don’t think it’s O.K.”

The study isn’t available yet, so I can’t say I’ve read the nuances.  Still, it’s hard for me to believe two liberal authors would have attempted to skew the results in this direction.  Conservatives have claimed this bias exists for years (look no further than the ethics complaint lodged against Mark Regnerus for proof), and will no doubt find nothing shocking about the results.  For liberals to have to face what this means however, that’s something new.  Even in the comments on this article, the vitriol is surprising, with many saying that conservatives are so out of touch that it is an ethical responsibility to keep them out of fields like social psychology.


It is much to my chagrin that social science gets lumped in with the harder sciences, but since findings in this field are so often reported in the media, it makes sense to take them into account. We now have a vicious cycle in which some fields are dominated by one party, whose members do studies that slam the other party, then accuse that party of being anti-science when it doesn't agree with the results. This is crazy. The worst thing that can happen to any area of scientific research is too much consensus….especially when it involves moving targets like social psychology. With 40% of the population identifying as conservative, how can we leave those perspectives out? Everyone, liberal and conservative, should be troubled by these findings. Those who aren't should take a good look at themselves and truly ask the question "what am I so afraid of?"

Moral obligations and Lazy Truth

I was going to include this in a Friday link post, but I really felt it deserved its own spotlight.

There's a new Gmail gadget called "Lazy Truth" that promises to send you a fact-check email every time you receive a (forwarded) email it deems to be of dubious content.

I haven’t tried it, so I’m not sure what it’s set up to flag, or how accurate the “fact check” email is, but I was immediately intrigued.  I’ve actually been working on a much longer post that covers just this topic, so it’s something I’ve been giving a lot of thought.

I've been mulling over the rise of Facebook/email/Twitter lately, and wondering….for those of us who value our integrity and our truthfulness, and do not believe the ends justify the means, what exactly are the moral implications of hitting forward or share on information that we could easily have proven false if we'd checked?

I was wondering if I was the only one worried about this when I came across a blog post from Dr. Michael Eades. He's a pro-low-carb physician who spends much of his time critiquing nutritional research. In a post about the book "The China Study", he describes finding what he considered a great critique of it on another person's blog. Then this:

…. I had fallen victim to the confirmation bias.  My bias was that Dr. Campbell was wrong, so I was more than happy to uncritically accept evidence confirming his error without lifting a finger to double check said evidence myself.  I knew that if a blogger somewhere had come out with a long post describing an analysis of the China study demonstrating the validity of all of Dr. Campbell’s notions of the superiority of the plant-based diet, I would’ve been all over it looking for analytical errors.  But since Ms. Minger’s work accorded with my own beliefs, my confirmation bias ensured that I accepted it at face value. 

Once the fact that I had succumbed to my confirmation bias settled in around me, I became suffused with angst.  I had tweeted and retweeted Ms. Minger’s analysis a number of times, giving the impression that I had at least minimally checked it out and had approved it.  I had emailed it to a number of people, many of whom, I’m sure, had forwarded it on.  I’m sure I played a fairly large role in the rapid dissemination of the anti Campbell/China study info.

In the end, he went back and realized that the post was good, but his panic attack was intriguing to me.  How many of us have had this same panic?  How many of us should have?  How many lousy graphs rip through Facebook like wildfire because no one bothers to double check if they’re even valid?  Is the liar the person who created the graph, or do those who share it share some blame?

I don't pretend to have an answer for this. I suspect most of the people interested enough to read this blog do not fall into the category of those who would easily share skewed information without thinking about it, but I am hoping for some thoughts/feedback from you all.

Are we so used to hearing politicians of all stripes seamlessly repeat bad data that we’ve come to view it as acceptable?  Is this just a fact of life?  Is it possible that we will be saved by widgets like the one above?   Does religion matter, or is this an overall moral issue? Does confrontation work with this sort of thing?  Or is this something I just have to learn to live with?