There’s More to that Story: 4 Psych 101 Case Studies

Well, it’s back-to-school time, folks, and for many high schoolers and college students that means “Intro to Psych” is on the docket. While every teacher teaches it a little differently, there are a few famous studies that pop up in almost every textbook. For years these studies were taken at face value, but with the onset of the replication crisis many have gotten a second look and turned out to be a bit more complicated than originally thought. I haven’t been in a psych classroom for quite a few years, so I’m hopeful the teaching of these has changed, but just in case it hasn’t, here’s a post with the extra details my textbooks left out.

Kitty Genovese and the bystander effect: Back in my undergrad days, I learned all about Kitty Genovese, murdered in NYC while 37 people watched and did nothing. Her murder helped coin the term “bystander effect,” the idea that people in large groups do nothing because each assumes someone else will act. It also helped prompt the creation of “911,” the emergency number we can all call to report anything suspicious.

So what’s the problem? Well, the number 37 was essentially made up by the reporter, and likely not even close to true. The New York Times, which published the original article on the crime, called its own reporting “flawed” in 2016. In 2015, Kitty’s brother made a documentary investigating what happened, and while there are no clear answers, what is clear is that a murder that occurred at 3:20am probably didn’t have 37 witnesses who saw anything, or who even understood what they were hearing.

Zimbardo/Stanford Prison Experiment: The Zimbardo (or Stanford) Prison Experiment is a famous study in which participants were asked to act as prisoners or guards in a multi-day recreation of a prison environment. Things quickly got out of control: the guards became so cruel and the prisoners so rowdy that the whole thing had to be shut down early. This supposedly showed the tendency of good people to conform to expectations almost immediately when put in bad circumstances.

So what’s the problem? Well, basically the researcher coached a lot of the bad behavior. Seriously, there’s audio of him doing it. This directly contradicts his own later statements that no instructions were given. Reporter Ben Blum went back and interviewed some of the participants, who said they were acting the way they thought the researchers wanted them to act. One guy said he staged his freakout because he wanted to get back to studying for his GREs and figured a meltdown would get him released early. Can bad circumstances and power imbalances lead people to act in disturbing ways? Absolutely, but this experiment does not provide the straightforward proof it’s often credited with.

The Robbers Cave Study: A group of boys camping in the wilderness is divided into two teams. They end up fighting each other based on nothing other than their assigned team, but then come back together when given problems they have to solve jointly. This shows how tribalism works, and how we can overcome it through common goals.

So what’s the problem? The famous, most-reported-on version was actually take two of the experiment. In the first attempt, the researchers couldn’t get the boys to turn on each other, so they ran it again, stripping out everything they thought had built group cohesion the first time, and finally got the boys to behave as they wanted. There’s a whole book about it (Gina Perry’s The Lost Boys), and it showcases some rather disturbing behavior on the part of the head researcher, Muzafer Sherif. He was never clear with the parents about what type of experiment the boys were being subjected to, and he both destroyed personal belongings himself (to blame it on the other team) and egged the boys on in their destruction. When Perry tracked down the boys who participated (now in their 70s), she found many were still unsettled by the experiment. Not great.

Milgram’s electric shock experiment: A study participant is brought into a room and asked to administer electric shocks to a person they can’t see, who they’re told is a fellow participant. When the hidden person gets a question “wrong,” the participant is supposed to zap them to help them learn, and when they do, a recording of someone screaming in pain plays. It is found that 65% of people will administer what they’re led to believe could be a fatal shock, as long as the researcher keeps encouraging them to do so. This shows that our obedience to authority can override our own ethics.

So what’s the problem? Well, this one’s a little complicated. The original study was actually one of 19 studies Milgram conducted, all with varying rates of compliance, and the most often reported findings come from the version that produced the highest compliance. A more recent study also reanalyzed participants’ behavior in light of whether they (by their own report) believed the learner was actually in pain. One of the things the researchers said to get people to continue was that the shocks were not dangerous, and it appears many participants didn’t believe the setup was real at all (which, of course, it wasn’t). Those who accepted the researchers’ assurances or were skeptical of the whole experiment were far more likely to administer higher voltages than those who believed the learner was really being hurt. To note, though, there have been replication attempts that did find compliance rates comparable to Milgram’s, although the maximum shock voltage has always been kept lower due to ethics concerns.

So overall, what can we learn from this? Well, first and foremost, that once study results hit psych textbooks, it can be really hard to correct the record. Even if kids today aren’t learning these versions, many of us who took psych classes before the more recent scrutiny of these studies may keep repeating them.

Second, I think we actually can conclude something rather dark about human nature, even if it’s not what we first thought. The initial conclusion of these studies is always something along the lines of “good people have evil lurking just under the surface,” when in reality the researchers had to try a few times to get the result they wanted. And yet that also shows us something: a person dedicated to producing a particular outcome can eventually get it, given enough tries. One suspects that many evil acts were carried out after the instigators had spent months or years trying to inflame tensions, slowly learning what worked and what didn’t. In other words, random bad circumstances don’t produce human evil, but dedicated people probably can produce it if they try long enough. Depressing.

Alright, any studies you remember from Psych 101 that I missed?
