
Why We Fall for So Much B.S.

In today’s interconnected world, detecting bullshit should be easy, right? Yet we’re just as bad at it as we were 100 years ago. Here’s why.

By Adrian Furnham, Ph.D.

To paraphrase Shakespeare, bullshit by any other name would smell as foul. Nevertheless, words like rubbish, nonsense, casuistry, and obscurantism aren’t quite as catchy as the classic. Philosopher Harry Frankfurt didn’t call his 2005 bestseller On Casuistry, after all.

For Frankfurt, bullshitting is different from lying. A bullshitter is a publicist whose principal aim is to impress others with profound, counterintuitive, or surprising ideas or facts. The motive is often to make money, gain followers, or both.

These days, social media presents a golden opportunity for bullshitters to spread their manure. It should also make it easier to spot bullshit in action, but that’s not the case. Much like we’re all victims of the Lake Wobegon effect—the tendency to overestimate our achievements and capabilities in relation to others—we also misguidedly believe we can easily tell the difference between fact and fiction. Truth is, we’re terrible at it.


In 2015, a group of researchers published a paper in Judgment and Decision Making called “On the reception and detection of pseudo-profound bullshit.” Their goal: find out which kinds of people believe in bullshit, and which varieties are most often detected versus accepted. They gave people a range of vacuous assertions presented as true and meaningful. Think pseudo-profound bullshit like, “Imagination is inside exponential space time events.” They then measured participants’ intelligence, reasoning style, ontological confusion, and religious beliefs.

In their first study, they found that brighter, more analytic thinkers were more likely to detect the bullshit, whereas more ontologically confused and more religious subjects were often fooled. In a second study, they discovered that the more faith people had in intuition—and the more they accepted paranormal beliefs—the worse they were at identifying hogwash.

In a third study, the researchers showed those who thought mundane statements profound (“Most people enjoy some sort of music”) were also poor bullshit detectors. And in a fourth experiment, they discovered that subjects who practiced alternative, complementary medicine and bought into conspiracy theories were also more likely to believe the bunk.

The researchers propose two mechanisms to explain the results. One is the uncritical open-mind hypothesis, which suggests some people are simply less critical, skeptical, or cynical. The second concerns the difference between reflective and reflexive open-mindedness: the reflective thinker examines ideas before accepting them, while the reflexive thinker just runs with them.

The Barnum Effect at Work

Psychologists have long been interested in the Barnum Effect, named after P.T. Barnum, a notorious 19th-century master of spin. The effect occurs when people accept generalizations that are true of nearly everybody as uniquely true of themselves.

Over 60 years ago, psychologist Ross Stagner gave a group of managers a personality test, then offered bogus feedback in the form of statements derived from horoscopes, graphological analyses, and the like. More than half the managers felt their profiles were an accurate description of them, and almost none believed the profiles to be wrong.

In a 1948 experiment, a professor named Bertram Forer gave personality tests to his students, ignored their answers, and delivered each student an identical evaluation pulled from a newspaper astrology column. The first three items: 

  • “You have a great need for other people to like and admire you.”
  • “You have a tendency to be critical of yourself.”
  • “You have a great deal of unused capacity you have not turned to your advantage.”

Forer’s explanation for the Barnum Effect: People tend to accept claims about themselves in proportion to their desire that they be true, rather than in proportion to their accuracy as measured by a non-subjective standard. As you can see, we have a long history of being duped. If you think you’re impervious to believing someone else’s bullshit, well, you’re really just falling for your own. 

Adrian Furnham, Ph.D., is an organizational and applied psychologist, management expert, and professor of psychology at University College London. In addition, he has been a consultant to more than 30 major international companies.