
Posts tagged ‘intuition’

Why The Future of Neuroscience Will Be Emotionless

In Phaedrus, Plato likens the mind to a charioteer who commands two horses, one that is irrational and crazed and another that is noble and of good stock. The job of the charioteer is to control the horses to proceed towards Enlightenment and the truth.

Plato’s allegory sparked an idea that persisted through the next several millennia of western thought: emotion gets in the way of reason. This makes sense to us. When people act out of line, we call them irrational; no one was ever accused of being too reasonable. Around the 17th and 18th centuries, however, thinkers began to challenge this idea. David Hume turned the tables on Plato: reason, Hume said, was the slave of the passions. Psychological research of the last few decades not only confirms this view; some of it suggests that emotion is the better decision-maker.

We know a lot more about how the brain works compared to the ancient Greeks, but a decade into the 21st century researchers are still debating which of Plato’s horses is in control, and which one we should listen to.

A couple of recent studies are shedding new light on this age-old discourse. The first comes from Michael Pham and his team at Columbia Business School. The researchers asked participants to make predictions about eight different outcomes, ranging from American Idol finalists, to the winners of the 2008 Democratic primary, to the winner of the BCS championship game. They also forecast the Dow Jones average.

Pham created two groups. He told the first group to go with their guts and the second to think it through. The results were telling. In the American Idol results, for example, the first group correctly predicted the winner 41 percent of the time whereas the second group was only correct 24 percent of the time. The high-trust-in-feeling subjects even predicted the stock market better.

Pham and his team conclude the following:

Results from eight studies show that individuals who had higher trust in their feelings were better able to predict the outcome of a wide variety of future events than individuals who had lower trust in their feelings…. The fact that this phenomenon was observed in eight different studies and with a variety of prediction contexts suggests that this emotional oracle effect is a reliable and generalizable phenomenon. In addition, the fact that the phenomenon was observed both when people were experimentally induced to trust or not trust their feelings and when their chronic tendency to trust or not trust their feelings was simply measured suggests that the findings are not due to any peculiarity of the main manipulation.

Does this mean we should always trust our intuition? It depends. A recent study by Maarten Bos and his team identified an important nuance when it comes to trusting our feelings. They asked one hundred and fifty-six students to abstain from eating or drinking (sans water) for three hours before the study. When they arrived Bos divided his participants into two groups: one that consumed a sugary can of 7-Up and another that drank a sugar-free drink.

After waiting a few minutes to let the sugar reach the brain, the students assessed four cars and four jobs, each with 12 key aspects that made them more or less appealing (Bos designed the study so that an optimal choice was clear, giving him a measure of how well participants decided). Next, half of the subjects in each group spent four minutes thinking about the jobs and cars (the conscious thought condition), while the other half watched a wildlife film (to prevent them from consciously thinking about the jobs and cars).

Here’s the BPS Research Digest on the results:

For the participants with low sugar, their ratings were more astute if they were in the unconscious thought condition, distracted by the second nature film. By contrast, the participants who’d had the benefit of the sugar hit showed more astute ratings if they were in the conscious thought condition and had had the chance to think deliberately for four minutes. ‘We found that when we have enough energy, conscious deliberation enables us to make good decisions,’ the researchers said. ‘The unconscious on the other hand seems to operate fine with low energy.’

So go with your gut if your energy is low. Otherwise, listen to your rational horse.

Here’s where things get difficult. By now the debate over the roles reason and emotion play in decision-making is well documented. Psychologists have written thousands of papers on the subject. It shows in the popular literature as well. From Antonio Damasio’s Descartes’ Error to Daniel Kahneman’s Thinking, Fast and Slow, the lay audience knows about both the power of thinking without thinking and our predictable irrationalities.

But what exactly is being debated? What do psychologists mean when they talk about emotion and reason? Joseph LeDoux, author of popular neuroscience books including The Emotional Brain and The Synaptic Self, recently published a paper in the journal Neuron that flips the whole debate on its head. “There is little consensus about what emotion is and how it differs from other aspects of mind and behavior, in spite of discussion and debate that dates back to the earliest days in modern biology and psychology.” Yes, what we call emotion roughly correlates with certain parts of the brain; it is usually associated with activity in the amygdala and other systems. But we might be playing a language game, and neuroscientists are reaching a point where understanding the brain requires more sophisticated language.

As LeDoux sees it, “If we don’t have an agreed-upon definition of emotion that allows us to say what emotion is… how can we study emotion in animals or humans, and how can we make comparisons between species?” The short answer, according to the NYU professor, is “we fake it.”

With this in mind LeDoux introduces a new term to replace emotion: survival circuits. Here’s how he explains it:

The survival circuit concept provides a conceptualization of an important set of phenomena that are often studied under the rubric of emotion—those phenomena that reflect circuits and functions that are conserved across mammals. Included are circuits responsible for defense, energy/nutrition management, fluid balance, thermoregulation, and procreation, among others. With this approach, key phenomena relevant to the topic of emotion can be accounted for without assuming that the phenomena in question are fundamentally the same or even similar to the phenomena people refer to when they use emotion words to characterize subjective emotional feelings (like feeling afraid, angry, or sad). This approach shifts the focus away from questions about whether emotions that humans consciously experience (feel) are also present in other mammals, and toward questions about the extent to which circuits and corresponding functions that are relevant to the field of emotion and that are present in other mammals are also present in humans. And by reassembling ideas about emotion, motivation, reinforcement, and arousal in the context of survival circuits, hypotheses emerge about how organisms negotiate behavioral interactions with the environment in process of dealing with challenges and opportunities in daily life.

Needless to say, LeDoux’s paper changes things. Because emotion is an unworkable term for science, neuroscientists and psychologists will have to understand the brain on new terms. And when it comes to the reason-emotion debate – which of Plato’s horses we should trust – they will have to rethink certain assumptions and claims. The difficult part is that we humans, by our very nature, cannot help but resort to folk psychology to explain the brain. We deploy terms like soul, intellect, reason, intuition and emotion but these words describe very little. Can we understand the brain even though our words may never suffice? The future of cognitive science might depend on it.


The Irrationality Of Irrationality

Reason has fallen on hard times. After decades of research psychologists have spoken: we humans are led by our emotions, we rarely (if ever) decide optimally and we would be better off if we just went with our guts. Our moral deliberations and intuitions are mere post-hoc rationalizations; classical economic models are a joke; Hume was right, we are the slaves of our passions. We should give up and just let the emotional horse do all the work.

Maybe. But sometimes it seems like the other way around. For every book that explores the power of the unconscious, another book explains how predictably irrational we are when we think without thinking; our intuitions deceive us and we are fooled by randomness, but sometimes it is better to trust our instincts. Indeed, if a Martian briefly compared the subtitles of the most popular psychology books of the last decade, he would quickly become confused. Reading the introductions wouldn’t help him either; keeping track of the straw men would be difficult for our celestial friend. So, he might ask, over the course of history have humans always thought that intelligence was deliberate, or automatic?

When it comes to thinking things through or going with your gut there is a straightforward answer: It depends on the situation and the person. I would also add a few caveats. Expert intuition cannot be trusted in the absence of stable regularities in the environment, as Kahneman argues in his latest book, and it seems like everyone is equally irrational when it comes to economic decisions. Metacognition, in addition, is a good idea but seems impossible to consistently execute.

However, unlike our Martian friend, who tries hard to understand what our books say about our brains, we Earthlings find the reason-intuition debate largely irrelevant. Yes, many have a sincere interest in understanding the brain better. But while the lay reader might improve his decision-making a tad and be able to explain the difference between the prefrontal cortex and the amygdala, the real reason millions have read these books is that they are very good.

The Gladwells, Haidts and Kahnemans of the world know how to captivate and entertain the reader because, like any great author, they prey on our propensity to be seduced by narratives. By using agents or systems to explain certain cognitive capacities, the brain becomes much easier to understand. However, positioning the latest psychology or neuroscience findings in terms of a story with characters tends to encourage a naïve understanding of the so-called most complex entity in the known universe. The authors know this, of course. Kahneman repeatedly makes it clear that “system 1” and “system 2” are literary devices, not real parts of the brain. But I can’t help but wonder, as Tyler Cowen did, whether deploying these devices makes the books themselves part of our cognitive biases.

The brain is also easily persuaded by small amounts of information. If one could sum up judgment and decision-making research, it would go something like this: we require only a tiny piece of information to confidently form a conclusion and take on a new worldview. Kahneman’s acronym WYSIATI – what you see is all there is – captures this well. This is precisely what happens the moment readers finish the latest book on intuition or irrationality; they remember the sound bite and understand brains only through it. Whereas the hypothetical Martian remains confused, the rest of us humans happily walk out of our local Barnes and Noble, or even worse, finish watching the latest TED talk, with the deluded feeling that now we “get it.”

Many times, to be sure, this process is a great thing. Reading and watching highbrow lectures is hugely beneficial intellectually speaking. But let’s not forget that exposure to X is not knowledge of X. The brain is messy; let’s embrace that view, not a subtitle.

What is Reason Good For? The Rationality-Intuition Debate

Reason is under attack. Lobbing bombshells is its twin brother, who thinks unconsciously, quickly, and with less effort; I speak of intuition, of course. It’s unclear when the rationality-intuition debate began, but its empirical roots were no doubt seeded when the cognitive revolution began, and they grew when Kahneman and Tversky started demonstrating the flaws of rational actor theory. Their cognitive biases and heuristics program, as it came to be known, wasn’t about bashing economic theory, though; it was meant to illustrate not only innocuous irrationalities but systematic errors in judgment. What emerged, now beautifully portrayed in Daniel Kahneman’s new book, is a dualistic picture of human cognition in which our mental processes are dictated by two types of thinking: system 1 thinking, which is automatic, quick and intuitive, and system 2 thinking, which is deliberate, slow and rational. We think, as the title reads, fast and slow.

It was only in the last decade that the literature on system 1 and system 2 thinking made its way into the eye of the lay audience. Gladwell’s Blink, which nicely illustrated the power of thinking without thinking – system 1 – made a splash. Ariely’s Predictably Irrational, meanwhile, spurred public debate about the flaws of going with your gut. In the wake of this literature, reason suffers from a credibility crisis. Am I rational or irrational? Should I go with my gut or think things through? Questions like these abound, and people too often forget that context and circumstance are what really matter. (If you’re making a multimillion-dollar business deal, think it through. If you’re driving down the highway, stick with your intuition!) Lately, though, I’ve seen too much reason-bashing, and I want to defend this precious cognitive capacity after reading the following comments, which were left in response to my last post by someone kind enough to engage my blog. His three points:

  • Consciousness-language/self-talk is trivial and epiphenomenal. It means very little and predicts less.
  • It is post-hoc pretty much anything interesting in brains processes > behavior
  • All other animals and living things get along just fine w/out it.

With the exception of his third point, which is worth a debate elsewhere, he (or she, but for the sake of writing I am just sticking with one pronoun) captures what many psychologists believe – that our vocalized beliefs are nothing more than post-hoc justifications of gut reactions. Jonathan Haidt, for example, uses the metaphor of a rider atop an elephant, where the rider ignorantly holds himself to be in control of his uncontrollable beast. There is more than a grain of truth to Haidt’s model, and plenty of empirical data backs it up. My favorite is one study in which women were asked to choose their favorite pair of nylon stockings from a group of four. After they made their selections, researchers asked them to explain their choices. Among the explanations, texture, feel, and color were the most popular. However, all of the stockings were in fact identical. The women were being sincere – they truly believed that what they were saying made sense – but they simply made up reasons for their choices, believing that they consciously knew their preferences.

There is a problem with this wholesale rejection of reason. It is difficult to explain why humanity has made so much moral progress if we believe that our deliberations are entirely uncontrollable. For example, how is it, a critic of Haidt’s model may ask, that institutions like slavery, which were for most of human history intuitively acceptable, are now intuitively unacceptable? In other words, if we really are solely controlled by the elephant, why aren’t we stuck in a Hobbesian state of nature where life is nasty, brutish and short?

One answer is that through reason we were able to objectively look at the world and realize that slavery – and many other injustices and immoralities – made society worse. As Paul Bloom explains in a recent Nature piece: “Emotional responses alone cannot explain one of the most interesting aspects of human nature: that morals evolve. The extent of the average person’s sympathies has grown substantially and continues to do so. Contemporary readers of Nature, for example, have different beliefs about the rights of women, racial minorities and homosexuals compared with readers in the late 1800s, and different intuitions about the morality of practices such as slavery, child labour and the abuse of animals for public entertainment. Rational deliberation and debate have played a large part in this development.” Bloom’s point is thoroughly expanded in Pinker’s latest book, The Better Angels of Our Nature, where Pinker argues that reason led people to commit fewer acts of violence. In his words: “At various times in history superstitious killings, such as human sacrifice, witch hunts, blood libels, inquisitions, and ethnic scapegoating, fell away as the factual assumptions on which they rested crumbled under the scrutiny of a more intellectually sophisticated populace. Carefully reasoned briefs against slavery, despotism, torture, religious persecution, cruelty to animals, harshness to children, violence to women, frivolous wars, and the persecution of homosexuals were not just hot air but entered into the decisions of the people and institutions who attended to the arguments and implemented reforms.”

In regard to my commenter’s first point – that conscious talk is trivial and epiphenomenal – there should be little question that reason played, and still plays, an important role in shaping society for the better, and that it is certainly not trivial or epiphenomenal as a result.

His second point – that reason is all post-hoc justification – is also problematic. Although conscious deliberate thought depends on unconscious cognition, it does not follow that all reasons are post-hoc justifications. For example, solving math problems requires unconscious neurological cognition, but nobody would say that 1+1=2 is a post-hoc justification. The same is true of scientific truths; are Newton’s laws likewise post-hoc justifications? No. There are truths to be known about the world, and they can be discovered with reason. As Sam Harris explains, “the fact that we are unaware of most of what goes on in our brains does not render the distinction between having good reasons for what one believes and having bad ones any less clear or consequential.” Reason, in other words, separates correct beliefs from incorrect ones, distinguishing truths from falsehoods. It depends on unconscious thought, as neuroscience has shown, but it does not follow that everything our rationality discovers is a post-hoc justification.

So, let’s not forget that reason – one of our species’ most important assets – is a vital cognitive capacity that shouldn’t be left by the wayside. Psychologists have done insightful work demonstrating the role of the cognitive unconscious, but that work does not negate the power of human rationality.

A Brief History of Popular Psychology: An Essay

It is unclear when the popular psychology movement started, perhaps with Malcolm Gladwell’s The Tipping Point or Steven Levitt and Stephen Dubner’s Freakonomics, or how it is defined, but it could be generally described by the public’s growing interest in understanding people and events from a sociological, economical, psychological, or neurological point of view.

Over the last decade the New York Times bestseller list has seen a number of these books: Ariely’s Predictably Irrational (2008) and The Upside of Irrationality (2010), Gilbert’s Stumbling on Happiness (2006), Haidt’s The Happiness Hypothesis (2006), Lehrer’s How We Decide (2009), and Thaler & Sunstein’s Nudge (2008). What unites them is their attempt to “explore the hidden side of everything” by synthesizing numerous academic studies in a relatable way, drawing upon interesting real-world examples, and providing appealing suggestions for how one can better understand the world, and one’s own decisions and behaviors within it.

The popular psychology movement is the result of a massive paradigm shift, what many call the cognitive revolution, that took place in the second half of the 20th century. Although its starting point is unclear, George A. Miller’s 1956 “The Magical Number Seven, Plus or Minus Two” and Noam Chomsky’s 1959 review of B. F. Skinner’s Verbal Behavior were, among others, important publications that forced psychology to become increasingly cognitive. Whereas the behaviorists – who represented the previous paradigm – considered only the external, those involved in the cognitive revolution sought to explain behavior by studying the internal; the cause of behavior was therefore thought to be dictated by the brain, not the environment.

The cognitive revolution naturally gave rise to the cognitive sciences – neuroscience, linguistics, artificial intelligence, and anthropology – all of which began to study how human brains processed information. A big part of the revolution revolved around the work done by psychologists Daniel Kahneman and Amos Tversky. Kahneman and Tversky developed a cognitive bias and heuristic program in the early 1970s that changed the way human judgment was understood. The heuristics and biases program had two goals. First, it demonstrated that the mind has a series of mental shortcuts, or heuristics, that “provide subjectively compelling and often quite serviceable solutions to… judgmental problems.” And second, it suggested that underlying these heuristics were biases that “[departed from] normative rational theory.”

Kahneman and Tversky’s work was vital because it questioned the notion that judgment was an extensive exercise based on algorithmic processes. Instead, it suggested that people’s decisions and behaviors are actually influenced by “simple and efficient… [and] highly sophisticated… computations that the mind had evolved to make.”

Their work was complemented by Richard Nisbett and Lee Ross’s 1980 book Human Inference: Strategies and Shortcomings of Social Judgment, which outlined how people’s “attempts to understand, predict, and control events in their social sphere are seriously compromised by specific inferential shortcomings.” From this, a list of cognitive biases began to accumulate, including attentional bias, confirmation bias, the endowment effect, status quo bias, the gambler’s fallacy, the primacy effect, and more.

The cognitive biases and heuristic program was, however, just one part of the cognitive revolution. The other equally important aspects came a bit later, when psychologists began to empirically study how unconscious processing influences behavior and conscious thought. These studies stemmed from the 1977 paper “Telling More Than We Can Know: Verbal Reports on Mental Processes,” by Richard Nisbett and Timothy Wilson. Nisbett and Wilson argued that “there may be little or no direct introspective access to higher order cognitive processes,” thereby introducing the idea that most cognition takes place automatically at the unconscious level.

Wilson continued his research in the 80s and 90s, eventually developing the concept of the “adaptive unconscious,” a term he uses to describe our ability to “size up our environments, disambiguate them, interpret them, and initiate behavior quickly and non-consciously.” He argued that the adaptive unconscious is an evolutionary adaptation for navigating the world with limited attention. This is why we are able to drive a car, type on a computer, or walk without having to think about it.

Complementing Wilson was Yale psychologist John Bargh, who contributed significantly to the study of how certain stimuli influence people’s implicit memory and behavior. In numerous experiments, Bargh demonstrated that people’s decisions and behaviors are greatly influenced by how they are “primed.” In one case, Bargh showed that people primed with rude words, such as “aggressively,” “bold,” and “intrude,” were on average about four minutes quicker to interrupt an experimenter than participants who were primed with polite words such as “polite,” “yield,” and “sensitively.”

Also in the 80s and 90s, neuroscientists began to understand the role of emotion in our decisions. In his 1994 book Descartes’ Error, Antonio Damasio explicates the “somatic marker hypothesis” to suggest that, contrary to traditional western thought, a “reduction in emotion may constitute an equally important source of irrational behavior.” NYU professor Joseph LeDoux was also instrumental in studying emotions. Like Wilson, Nisbett, and Bargh, LeDoux argued that an understanding of conscious emotional states requires an understanding of “underlying emotional mechanisms.”

Along with emotion and the unconscious, intuition was another topic that was heavily researched in the past few decades. It was identified and studied as a way of thinking and as a talent. As a way of thinking, intuition more or less corresponds to Wilson’s adaptive unconscious; it is an evolutionary ability that helps people effortlessly and unconsciously disambiguate the world; i.e., the ability for people to easily distinguish males from females, their language from another, or danger from safety.

Intuition as a talent was found to be responsible for a number of remarkable human capabilities, most notably those of experts. As Malcolm Gladwell says in his 2005 best seller Blink, intuitive judgments “don’t logically and systemically compare all available options.” Instead, they act on gut feelings and first impressions that cannot be explained rationally. And most of the time, he continues, acting on these initial feelings is just as valuable as acting on more “thought out” feelings.

By the 1990s, when the “revolution in the theory of rationality… [was] in full development,” the line between rational and irrational behavior became blurred as more and more studies made it difficult to determine what constituted rational behavior. On one hand, some (mainly economists) maintained rationality as the norm even though they knew that people deviated from it. On the other hand, individuals like Herbert Simon and Gerd Gigerenzer argued that the standards for rational behavior should be grounded in ecological and evolutionary considerations. In either case, though, rational choice theory remained the benchmark. Because of this, the 1990s saw books such as Stuart Sutherland’s Irrationality (1994), Massimo Piattelli-Palmarini’s Inevitable Illusions: How Mistakes of Reason Rule Our Minds (1996), and Thomas Gilovich’s How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (1991). Each perpetuated the idea, as the titles imply, that behavior and decision-making are to be judged against a certain standard or norm (in this case, rational choice theory).

However, when all of the facets of the cognitive revolution – cognitive biases and heuristics, the unconscious, emotion, and intuition – are considered, the idea that we act rationally begins to look extremely weak; this observation has heavily influenced the popular psychology movement. Pick up any popular psychology book and you will find Kahneman, Tversky, Nisbett, Wilson, Bargh, Damasio, LeDoux, and others heavily cited in arguments that run contrary to rational actor theory.

What’s interesting, and my last post touched on this, is that each popular psychology author has something different to say: Dan Ariely pushes behavioral economics to argue that we are all predictably irrational; Damasio argues that reason requires emotion; Gladwell, David Myers, and Wilson suggest that most thought is unconscious and our intuitive abilities are just as valuable as our rational ones; Daniel Gilbert and Jonathan Haidt illustrate how our cognitive limitations affect our well-being; Barry Schwartz shows how too much choice can actually hurt us; and Jonah Lehrer draws upon neuroscience to show the relationship between emotion and reason in our decision-making.

As a result of all these assertions, the human condition has become seriously complicated!

If there is something to conclude from what I have outlined, it is this. Implicit in any evaluation of behavior is the assumption that human beings have a nature or norm, and that their behavior deviates from it. However, the popular psychology movement shows that our brains are not big enough to understand human behavior, and our tendency to summarize it so simplistically is a reflection of this. We aren’t rational, irrational, or intuitive; we are, in the words of Ke$ha, who we are.
