Talking to Strangers
Author: Malcolm Gladwell

Interviewer: In the current game did you find the questions difficult?

Philip: Yes, some were. I was like, “Well, what is that?”

Interviewer: If you would scale them one to ten, if one was easy and ten was difficult, where do you think you would put them?

Philip: I would put them [at] an eight.

Interviewer: An eight. Yeah, they’re pretty tricky.

 

Philip is then told that he and his partner did very well on the test. The interviewer asks him why.

Philip: Teamwork.

Interviewer: Teamwork?

Philip: Yeah.

Interviewer: OK, all right. Now, I called Rachel out of the room briefly. When she was gone, did you cheat?

Philip: I guess. No.

 

Philip slightly mumbles his answer, then looks away.

Interviewer: Are you telling the truth?

Philip: Yes.

Interviewer: Okay. When I interview your partner and I ask her, what is she going to say?

 

At this point in the tape, there’s an uncomfortable silence, as if the student is trying to get his story straight. “He’s obviously thinking very hard,” Levine said.

Philip: No.

Interviewer: No?

Philip: Yeah.

Interviewer: OK, all right. Well, that’s all I need from you.

 

Is Philip telling the truth? Levine has shown the Philip videotape to hundreds of people and nearly every viewer correctly pegs Philip as a cheater. As the “partner” confirmed to Levine, Philip looked inside the answer-filled envelope the minute Rachel left the room. In his exit interview, he lied. And it’s obvious. “He has no conviction,” Levine said.

I felt the same thing. In fact, when Philip is asked, “Did you cheat?” and answers, “I guess. No,” I couldn’t contain myself, and I cried out, “Oh, he’s terrible.” Philip was looking away. He was nervous. He couldn’t keep a straight face. When the interviewer followed up with, “Are you telling the truth?” Philip actually paused, as if he had to think about it first.

He was easy. But the more tapes we looked at, the harder it got. Here is a second case. Let’s call him Lucas. He was handsome, articulate, confident.

Interviewer: I have to ask, when Rachel left the room, did any cheating occur?

Lucas: No.

Interviewer: No? You telling me the truth?

Lucas: Yes, I am.

Interviewer: When I interview your partner and I ask her the same question, what do you think she’s going to say?

Lucas: Same thing.

 

“Everybody believes him,” Levine said. I believed him. Lucas was lying.

Levine and I spent the better part of a morning watching his trivia-quiz videotapes. By the end, I was ready to throw up my hands. I had no idea what to make of anyone.

The point of Levine’s research was to try to answer one of the biggest puzzles in human psychology: why are we so bad at detecting lies? You’d think we’d be good at it. Logic says that it would be very useful for human beings to know when they are being deceived. Evolution, over many millions of years, should have favored people with the ability to pick up the subtle signs of deception. But it hasn’t.

In one iteration of his experiment, Levine divided his tapes in half: twenty-two liars and twenty-two truth-tellers. On average, the people he had watch all forty-four videos correctly identified the liars 56 percent of the time. Other psychologists have tried similar versions of the same experiment. The average for all of them? 54 percent. Just about everyone is terrible: police officers, judges, therapists—even CIA officers running big spy networks overseas. Everyone. Why?

Tim Levine’s answer is called the “Truth-Default Theory,” or TDT.

Levine’s argument started with an insight that came from one of his graduate students, Hee Sun Park. It was right at the beginning of Levine’s research, when he was as baffled as the rest of his profession about why we are all so bad at something that, by rights, we should be good at.

“Her big insight, the first one, was that the 54-percent deception-accuracy figure was averaging across truths and lies,” Levine said. “You come to a very different understanding if you break out…how much people are right on truths, and how much people are right on lies.”

What he meant was this. If I tell you that your accuracy rate on Levine’s videos is right around 50 percent, the natural assumption is to think that you are just randomly guessing—that you have no idea what you are doing. But Park’s observation was that that’s not true. We’re much better than chance at correctly identifying the students who are telling the truth. But we’re much worse than chance at correctly identifying the students who are lying. We go through all those videos, and we guess—“true, true, true”—which means we get most of the truthful interviews right, and most of the liars wrong. We have a default to truth: our operating assumption is that the people we are dealing with are honest.

Levine says his own experiment is an almost perfect illustration of this phenomenon. He invites people to play a trivia game for money. Suddenly the instructor is called out of the room. And she just happens to leave the answers to the test in plain view on her desk? Levine says that, logically, the subjects should roll their eyes at this point. These are college students. They’re not stupid. They’ve signed up for a psychological experiment. They’re given a “partner,” whom they’ve never met, who is egging them on to cheat. You would think that they might be even a little suspicious that things are not as they seem. But no!

“Sometimes, they catch that the instructor leaving the room might be a setup,” Levine says. “The thing they almost never catch is that their partners are fake.…So they think that there might be hidden agendas. They think it might be a setup because experiments are setups, right? But this nice person they are talking and chatting to? Oh no.” They never question it.

To snap out of truth-default mode requires what Levine calls a “trigger.” A trigger is not the same as a suspicion, or the first sliver of doubt. We fall out of truth-default mode only when the case against our initial assumption becomes definitive. We do not behave, in other words, like sober-minded scientists, slowly gathering evidence of the truth or falsity of something before reaching a conclusion. We do the opposite. We start by believing. And we stop believing only when our doubts and misgivings rise to the point where we can no longer explain them away.

This proposition sounds at first like the kind of hairsplitting that social scientists love to engage in. It is not. It’s a profound point that explains a lot of otherwise puzzling behavior.

Consider, for example, one of the most famous findings in all of psychology: Stanley Milgram’s obedience experiment. In 1961, Milgram recruited volunteers from New Haven to take part in what he said was a memory experiment. Each was met by a somber, imposing young man named John Williams, who explained that they were going to play the role of “teacher” in the experiment. Williams introduced them to another volunteer, a pleasant, middle-aged man named Mr. Wallace. Mr. Wallace, they were told, was to be the “learner.” He would sit in an adjoining room, wired to a complicated apparatus capable of delivering electrical shocks up to 450 volts. (If you’re curious about what 450 volts feels like, it’s just shy of the amount of electrical shock that leaves tissue damage.)
