The Irrationality of Irrationality: The Paradox of Popular Psychology

Here’s my latest on ScientificAmerican.com 

In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team presented the students with legal scenarios. In one, the plaintiff, a union organizer named Mr. Thompson, visits a drugstore for a routine union visit. The store manager informs him that, under the union’s contract with the drugstore, union representatives cannot speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges are later dropped, but Mr. Thompson sues the store for false arrest.

All participants got this background information. Then some heard from only one of the two sides’ lawyers: the lawyer for the union organizer framed the arrest as an attempt at intimidation, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing only one side or the entire story. But this didn’t stop the subjects who heard one-sided evidence from being more confident in, and more biased by, their judgments than those who heard both sides. That is, even when people had all the underlying facts, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side’s story reduced their bias – in a later study, instructions to consider the missing information served as an explicit manipulation – but it certainly did not eliminate it. Their study shows that people are not only willing to jump to conclusions after hearing only one side’s story, but that they remain surprisingly likely to do so even when additional information at their disposal suggests a different conclusion. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or as any one of the seven story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that matters.

But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can form a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions, generate new narratives, and take on new worldviews that seem objective but are almost entirely subjective and inaccurate.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of the recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make the case that one particular thing (in Haidt’s case, morality) is the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his point of view (that morality is the most important thing to consider) with other perspectives. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality; exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion-jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. The same approach could also help correct the problem of being too swayed by narratives – there are always multiple sides to a story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

7 Comments
  1. Reblogged this on Kritikos & Bodhi and commented:
    Relevance: This is about a fundamental bias in human decision-making. We are barely capable of carefully weighing all the facts; we rely instead on a punchy story.

    Content: Subjects are familiarized with all the details of a contested matter in which two opinions stand opposed. They then additionally hear the position of a representative of one side. It turns out that the subjects tend to follow this most recently heard position, even though they were presented with all the details beforehand. It seems well established that people orient themselves more by well-told stories than by comprehensive collections of facts. Even demonstrations that we behave this way cannot readily compensate for this heuristic behavior. It is natural for us to break complex matters into more easily digestible pieces. Part of the solution is to familiarize oneself with the other side’s position. One should engage with the other side’s arguments, even when that is uncomfortable: “We need to remember what philosophers get right. Listen closely; analyze arguments logically; avoid jumping to conclusions; don’t rely on stories too much. Euripides put it this way: Question everything, learn something, answer nothing.”

    April 30, 2012
    • sammcnerney

      I wish I could read German… (:

      April 30, 2012
      • Hi sammcnerney, it is only a note about your post with a bit of a summary. I forgot that the reblog function also creates a post on the original blog. Matthias

        April 30, 2012
  2. To go one step further, consider Socrates’ and Einstein’s statements that the highest knowledge is the recognition that there are things we can never know (I’m paraphrasing). Both had the wisdom and humility to admit that mankind, or the “intellectual mind” that other scientists worship, is a limited thing, if not feeble. It is this arrogance that marks the inferior scientific mind. In my own explorations, it becomes clearer that our “science” is hardly something to worship (although it does have much in common with other religions, e.g. assumptions, beliefs, self-evident truths). Our “science” is more accurately called “technology” and proceeds mostly accidentally, in fits and starts. But “proceeds” does not mean it gets closer to the truth. The history of science has been the history of “absolute certainty” (e.g. Newtonian physics) being continually overturned by the next “absolute certainty” (e.g. relativity and quantum physics). This is true of all sciences, and must be particularly embarrassing to medical science, which has adhered to mistake after mistake after mistake, and is now up to its neck in conflicts of interest and corruption. Of course, the “elephant in the room” of science is the total focus on phenomena along with the total overlooking of “consciousness,” or whatever you call the principle that phenomena arise in. I personally think science came to a screeching halt about 80 years ago when relativity and quantum physics could not be reconciled. To this day they have not been reconciled (although we have perfectly “reasonable” scientists saying that reality is made of 10-dimensional “strings,” or that every decision results in the creation of an entirely new, parallel universe). Come on folks, get real. Can’t you see you are looking into the unknowable that Socrates and Einstein had so much respect for? Anyway, that’s my two cents. Science is a joke, and yet it has given us an improved quality of life (by some measures) as we careen toward the destruction of the human habitat… I enjoy this topic and am happy to explore it with anyone who is seriously curious, but I will not respond to diatribes… Dave Trindle

    April 30, 2012
    • sammcnerney

      You had decent points but I lost you when you said “science is a joke.”

      April 30, 2012
      • Have a look at it for yourself. It is not what it claims to be. The “joke” is that we worship it. Consider an example, the “big bang” theory. If you look under the hood, you will see that, while it is a fun model to play around with, it is at its very root based on some widely glossed-over unproved/unprovable assumptions about what we see in the sky. Now, I’m OK with making assumptions and running models and having fun with it. But it can’t claim to be the truth, or even near the truth, until those assumptions are proven. And there are some who say they are unprovable – which is probably why they are glossed over… I’m just saying this is the way it seems to me; I would love to be proven wrong.

        April 30, 2012
  3. Since I am a psychotherapist now, I should have commented on the lead-in to the article, “the paradox of popular psychology.” I am also a mathematician and statistician by training and experience in an earlier career. For whatever confidence that may give you in my credibility, you might be interested to know that “psychology” is not a science, and is the laughingstock of the scientific world. “Popular psychology” is to psychology as the National Enquirer is to the New York Times. During graduate school, I was shocked at the conclusions that were drawn from uncontrolled and statistically non-credible “data.” There is no worse science than psychology in its current state. If you peek under the hood of any psychological study, you will find a can of worms. Even where data is somewhat controlled (which is almost never), the psychologists draw conclusions totally unsupported by their statistical analysis of the data. They don’t understand statistics. They are breathing heavily, hell-bent on getting away with announcing something provocative, period. Which, of course, sells. I challenge anyone to send me the full documentation of ANY psychological experiment, and I am 95% confident I can deconstruct and debunk it on more than one level… Dave Trindle

    p.s. all of this counts triply for the so-called “positive psychology” movement that has come out of one of our most respected educational institutions, and has made several people there quite rich.

    April 30, 2012
