Posts from the ‘Uncategorized’ Category

Relaxation & Creativity: The Science of Sleeping on It

A new post up at Big Think!

Sigmund Freud postulated that dreaming is a reflection of the unleashed id; it represents one’s deep sexual fantasies and frustrations implanted during childhood. But what happens when we fall asleep is usually much less dramatic; we dream about the problems of everyday life. Now scientists understand dreaming as an integral part of the creative process – it’s not just about the problems of everyday life, it’s about solving them.

In 2004, the neuroscientists Ullrich Wagner and Jan Born published a paper in Nature that examined the relationship between sleep and problem solving. In one experiment, they tasked participants with transforming a long list of number strings. The task required participants to apply a set of algorithms that would scare off all but a handful of math geeks. However, the researchers built in an elegant shortcut that made the task much easier. How many people, Wagner and Born asked, would catch it?
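For the curious, the task in that paper was a version of the Number Reduction Task. Below is a minimal Python sketch of how I understand it (the rule details and example string are my reconstruction, not code from the study): strings use the digits 1, 4 and 9; a “same” rule and a “different” rule are applied pairwise; and each trial string was constructed so that the second response always equals the final answer – the hidden shortcut participants could discover.

```python
# A sketch of the Number Reduction Task (my reconstruction, not the
# paper's materials). Strings use the digits 1, 4 and 9 only.
DIGITS = {1, 4, 9}

def respond(a, b):
    """'Same' rule: two equal digits yield that digit.
    'Different' rule: two unequal digits yield the remaining third digit."""
    return a if a == b else (DIGITS - {a, b}).pop()

def solve_stepwise(string):
    """The slow, rule-by-rule route participants were taught."""
    responses, current = [], string[0]
    for nxt in string[1:]:
        current = respond(current, nxt)
        responses.append(current)
    return responses  # the last response is the official answer

def solve_with_insight(string):
    """The shortcut: trial strings were built so the last three responses
    mirror the previous three, so the 2nd response equals the final one."""
    return solve_stepwise(string)[1]

trial = [1, 1, 4, 4, 9, 4, 9, 4]
print(solve_stepwise(trial))      # [1, 9, 1, 4, 4, 1, 9]
print(solve_with_insight(trial))  # 9 – matches the last entry above
```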

Continue reading here

What Improv Teaches Us About Creativity

My latest at my Big Think blog. I explore improv and the idea of agreement.

The most important rule in improv comedy is agreement: the notion that a scene flourishes when all the players accept anything that happens to them. Improv isn’t about wisecracks and one-liners. It’s about creating a structure in which characters and narratives are quickly created, developed, sometimes forgotten and other times resolved. With just a tidbit – usually a one-word suggestion at the beginning of the show – good improvisers generate compelling and captivating stories that engage the audience. Comedy is the natural byproduct.

The question, of course, is how do they do it?

Consider a study conducted several years ago at Johns Hopkins University by the neuroscientist Charles Limb. Limb designed a clever experiment that scanned the brains of jazz pianists in an fMRI machine as they improvised on a MIDI keyboard. His study focused on two parts of the brain: the medial prefrontal cortex (MPC) and the dorsolateral prefrontal cortex (DLPFC). The medial prefrontal cortex is a part of the brain associated with self-expression; it’s a mental narrator that keeps tabs on the story of your life. The DLPFC is closely associated with impulse control. It’s the part of the brain that makes you think twice before you eat a slice of pizza or gamble – a sort of mental shackle that keeps your neurons in check.

The key finding involved the DLPFC. Limb found that the musicians “deactivated” their DLPFC once they began improvising. That is, the musicians turned off part of their conscious brain to let the unconscious mind do the work. As Limb says, “musical creativity vis-à-vis improvisation may be a result of… the suspension of self-monitoring and related processes that typically regulate conscious control of goal-directed, predictable, or planned actions.” In other words, the pianists were inhibiting their inhibitions. (Watch Limb’s TED lecture here)

Continue reading here.

How To Generate A Good Idea

My latest on my Big Think blog. Here’s the gist:

Several years ago, Tom DeMarco and Timothy Lister conducted a study that measured the productivity of computer programmers. Their data set included more than 600 programmers from 92 companies. According to Susan Cain, author of the recently released book Quiet: The Power of Introverts, DeMarco and Lister found that what distinguished the best programmers was not experience or salary, but privacy: personal workspace and freedom from interruption.

In fact, “sixty-two percent of the best performers said their workspace was sufficiently private compared with only 19 percent of the worst performers. Seventy-six percent of the worst programmers but only 38 percent of the best said that they were often interrupted needlessly.”

In Quiet, her manifesto, Cain criticizes the new “Groupthink” model that she says dominates our schools and workplaces. Indeed, students are encouraged to collaborate with their peers often, and many businesses (70% by Cain’s estimate) sport open office plans to encourage their employees to freely exchange ideas. The idea behind Groupthink models is that creativity and achievement require other people. Lone geniuses are out, says Cain, and collaboration is in.

Continue reading here

The Irrationality of Irrationality: The Paradox of Popular Psychology

Here’s my latest on ScientificAmerican.com.

In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team exposed the students to legal scenarios. In one, a plaintiff named Mr. Thompson visits a drug store for a routine union visit. The store manager informs him that, according to the union’s contract with the drug store, union representatives cannot speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges were later dropped, but Mr. Thompson sued the store for false arrest.

All participants got this background information. Then, they heard from one of the two sides’ lawyers; the lawyer for the union organizer framed the arrest as an attempt to intimidate, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing one side of the story or the entire story. But this didn’t stop the subjects who heard one-sided evidence from being more confident and biased in their judgments than those who heard both sides. That is, even when people knew they were missing crucial facts, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side’s story reduced their bias – instructions to consider the missing information were a manipulation in a later study – but it certainly did not eliminate it. Their study shows us that people are not only willing to jump to conclusions after hearing only one side’s story, but that even when they have additional information at their disposal that suggests a different conclusion, they are still surprisingly likely to do so. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or as any one of the seven story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that’s important.

But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can have a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions and spin new narratives – worldviews that feel objective but are almost entirely subjective and inaccurate.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions, they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions, and they fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of his recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make the case that one certain thing (in Haidt’s case, morality) is the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his point of view (that morality is the most important thing to consider) with other perspectives. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s be rational about irrationality; exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. The same approach could also help correct the problem of being too swayed by narratives – there are, after all, multiple sides to every story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

Blogging At Big Think

Starting today I will be a blogger for BigThink.com. For those of you who are not familiar, Big Think is a wonderful website with great content. Here’s what they’re all about:

In our digital age, we’re drowning in information. The web offers us infinite data points—news stories, tweets, wikis, status updates, etc—but very little to connect the dots or illuminate the larger patterns linking them together. Here at Big Think, we believe that success in the future is about knowing the ideas that allow you to manage and master this universe of information. Therefore, we aim to help you move above and beyond random information, toward real knowledge, offering big ideas from fields outside your own that you can apply toward the questions and challenges in your own life.

My blog is called Moments of Genius. Here’s a quick summary of what it will be about.

Everybody has their own pet theory about how to generate ideas and be productive: some chug caffeine, others relax; some work in groups, others work alone; some work at night, others in the morning. This blog draws from recent findings in cognitive science to inform and answer these questions and others like them. It’s for the creative professional, the businessperson or the artist who seeks to create new ideas and work efficiently. It’s about translating findings in psychology and neuroscience so we can be more productive, make better decisions, be more creative, collaborate efficiently and solve problems effectively.

My first post went up today. It’s an expansion of a previous Why We Reason post on childhood and creativity. Here’s the gist:

The Monster Engine is one of the best ideas I’ve come across. It’s a book, demonstration, lecture and gallery exhibition created by Dave Devries. The premise is simple: children draw pictures of monsters and Devries paints them realistically. According to the website, the idea was born in 1998 when Devries took an interest in his niece’s doodles. As a comic addict, Devries wondered if he could use color, texture and shading to bring his niece’s drawings to life.

But Devries had a larger goal: he wanted to always see things as a child does. Why? In many ways, children flourish where adults fail. Children are more creative and are natural inventors. Their worldview is incomplete and demands discovery. They prosper because they embrace their ignorance instead of ignoring it. And they are willing to explore, investigate and put their ideas to the test because they are willing to fail. Unlike adults, they don’t care how other people perceive or evaluate their ideas, and they’re unconcerned with the impossible or what doesn’t work.

So what does this mean for Why We Reason? In short, Why We Reason will remain for the time being. I still have a few WWR posts in the works and they need to see the light of day. However, some changes will be made in the near future. In the meantime, I encourage my readers to bookmark, tweet, share, etc., my posts on Big Think.

Are Superstitions Rational?

By many accounts, Bjorn Borg is one of the greatest tennis players of all time. The former world no. 1 won 11 Grand Slam titles between 1974 and 1981. Most remarkably, he won 82 percent of all the professional matches he played. He had skills.

But that’s not all he had. Like many athletes, he had superstitions. To prepare for Wimbledon, Bjorn grew a beard and wore the same Fila shirt during his matches. It worked, too. He holds a career record of 51-4 at Wimbledon and won five consecutive singles titles in the second half of the 1970s. Bjorn’s “lucky beard,” as the Swedes termed it, has become a staple in other sports. Today, NFL, NBA and NHL players sport “playoff beards” in search of a competitive edge.

Superstitions are, by many accounts, irrational and scientifically backwards. However, empirical evidence suggests that this might not be entirely true. A few years ago the social psychologist Lysann Damisch teamed up with Barbara Stoberock and Thomas Mussweiler to measure what effects, if any, superstitions have in sports.

In one experiment, the social scientists tested the “lucky ball” myth by having two groups of participants attempt ten golf putts from a distance of 100 cm. Like good psychologists, they told one group of participants that the ball they were about to use “turned out to be lucky” (the superstition-activated condition). In contrast, they told the second group they were using a ball that everyone used (the control condition). The researchers found that participants in the superstition-activated condition drained 35 percent more putts than participants in the control condition.

In a related study conducted last year, undergraduate Charles Lee of the University of Virginia joined with Sally Linkenauger to see whether superstitious beliefs about equipment affect performance. They recruited 41 undergraduates with backgrounds in golf for their study. Similar to Damisch’s team, Lee and Linkenauger told half of the students that they were using a really nice putter; they told the other half that the putter had previously been owned by British Open champion and PGA Tour player Ben Curtis, who was known to be an expert putter. (Importantly, all of the undergraduates knew who Curtis was.) Their findings were telling: students who putted with “Curtis’s” putter sank, on average, one and a half more balls.

What accounts for these findings? The basis for superstitious beliefs is sheer fantasy, but their effects can be real and consequential. For example, a 2010 paper by Travis Ng of Hong Kong University found that superstitions surrounding ‘8’ and ‘4’ in Cantonese – 8 is considered lucky because it rhymes with prosper and prosperity whereas 4 is unlucky because it rhymes with die or death – affected the economics of license plates. Here’s the BPS Research Digest:

Controlling for visual factors that affect price (for example, plates with fewer digits are more sought-after) Ng’s team found that an ordinary 4-digit plate with one extra lucky ‘8’ was sold 63.5 per cent higher on average. An extra unlucky ‘4’ by contrast diminished the average 4-digit plate value by 11 per cent. These effects aren’t trivial. Replacing the ‘7’ in a standard 4-digit plate with an ‘8’ would boost its value by roughly $400.
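To make the quoted numbers concrete, here’s a quick back-of-the-envelope sketch in Python. Only the percentage multipliers come from the BPS summary above; the baseline plate price is an invented placeholder, not a figure from Ng’s paper.

```python
# Illustrative arithmetic only: the percentages are from the BPS summary;
# the baseline price below is a made-up placeholder.
baseline = 1000.0                      # hypothetical price of an ordinary 4-digit plate

with_extra_8 = baseline * 1.635        # one extra lucky '8': +63.5% on average
with_extra_4 = baseline * (1 - 0.11)   # one extra unlucky '4': -11% on average

print(f"with an extra '8': {with_extra_8:.0f}")  # 1635
print(f"with an extra '4': {with_extra_4:.0f}")  # 890
```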

So why do we believe in superstitions in the first place? Some cases are clearer than others. In terms of athletic performance, evidence suggests that a superstitious belief in certain objects (Curtis’s putter) and habits (Bjorn’s beard) gives us confidence, which in turn improves performance. In the study involving Ben Curtis’s putter, it’s the club’s history that’s relevant. For the same reason people would like to wear a sweater knitted by Mother Teresa or use Einstein’s pencil, we believe that the equipment a legend used will give us an advantage on the playing field. In the case of Bjorn’s beard, the habit provides structure and security to an otherwise disorganized or nervous pre-Wimbledon routine.

It’s also hypothesized that superstitions arise from our natural tendency to seek evidence of intentionality in the world. We want reasons for things, and we want those reasons to have an author (e.g., God, destiny, karma, the force). We hate randomness. Many religious beliefs come about through teleological reasoning along these lines. And as with superstitions in sports, there are real consequences. This is what research from the anthropologist Richard Sosis suggests. As a recent New York Times article reports:

[Sosis] found that in Israel during the second intifada in the early 2000s, 36 percent of secular women in the town of Tzfat recited psalms in response to the violence. Compared with those who did not recite psalms, he found, those women benefited from reduced anxiety: they felt more comfortable entering crowds, going shopping and riding buses — a result, he concluded, of their increased sense of control.

All of this research encourages the idea that superstitious beliefs might not be entirely irrational. Although there is no empirical data to suggest that superstitions are real in and of themselves, their behavioral consequences illustrate a different trend.

There are downsides, of course, to fantastical thinking – athletes often become overly obsessed with pregame rituals, and many religious beliefs have led to less-than-ideal scenarios. But superstitions are essential. For better or for worse, they are a natural component of our cognition.

Read more

Produce First, Sharpen Second: What Dylan’s Vomit Teaches Us About Creativity

For Dylan, “Like a Rolling Stone” began as a long piece of vomit, at least that’s what he told two reporters back in 1965. As the story goes, Dylan, who was at the tail end of a grueling tour that took his pre-electric act across the United States and into Europe, decided to quit music and move to a small cabin in upstate New York to rethink his creative direction. He was sick of answering the same questions over and over again. He was sick of singing the same song over and over again. He wanted to liberate his mind.

This is why “Like a Rolling Stone” began as a twenty-page ramble. It was, as Dylan described it, a regurgitation of dissatisfactions and curiosities. What came next was Dylan’s true talent. Like a wood sculptor, he whittled away at his rough draft. He cherry-picked the good parts and threw away the bad. He began to dissect his words to understand what his message was. Eventually, Dylan headed to the studio with a clearer vision, and today “Like a Rolling Stone” stands as one of the very best.

What’s interesting is how Dylan approached the writing process. The song started as a splattering of ideas. Dylan wasn’t even trying to write a song; initially, he didn’t care about verses or choruses. He compared the writing process to vomiting because he was trying to bring an idea that infected his thinking from the inside to the outside of his body.

His strategy isn’t unique. In fact, it resembles the approach of many other artists throughout history. For example, in the Fall 1975 issue of The Paris Review, the Pulitzer Prize winner and Nobel laureate John Steinbeck gave this piece of advice about writing: “Write freely and as rapidly as possible and throw the whole thing on paper. Never correct or rewrite until the whole thing is down. Rewrite in process is usually found to be an excuse for not going on. It also interferes with flow and rhythm which can only come from a kind of unconscious association with the material.” As the saying goes, perfection is achieved not when there is nothing left to add, but when there is nothing left to take away.

This principle doesn’t just show itself in art. Economies, too, cycle between success and failure: continuous innovation creates wealth, while unvaried ideas end in bankruptcy. The Austrian economist Joseph Schumpeter popularized the term creative destruction to describe the simultaneous accumulation and annihilation of wealth under capitalism. As Schumpeter saw it, for every successful entrepreneur dozens of failures followed. But this was a good thing; capitalism was to be understood as an evolutionary process in which good ideas prevail over bad ones.

With these thoughts in mind, consider a study released this month, conducted by Simone Ritter of Radboud University in the Netherlands with help from Rick B. van Baaren and Ap Dijksterhuis. For the first experiment, the scientists recruited 112 university students and gave them two minutes to come up with creative ideas to solve relatively harmless problems (e.g., improving the experience of waiting in line at a supermarket). Next, the subjects were divided into two groups: the first went straight to work, while the second performed an unrelated task for two minutes to distract their conscious minds.

The first thing the psychologists found wasn’t too eye-opening: both groups – conscious and distracted – generated the same number of ideas. But the second finding was slightly more intriguing. Here’s Jonah Lehrer describing the results:

After writing down as many ideas as they could think of, both groups were asked to choose which of their ideas were the most creative. Although there was no difference in idea generation, giving the unconscious a few minutes now proved to be a big advantage, as those who had been distracted were much better at identifying their best ideas. (An independent panel of experts scored all of the ideas.) While those in the conscious condition only picked their most innovative concepts about 20 percent of the time — they confused their genius with their mediocrity — those who had been distracted located their best ideas about 55 percent of the time. In other words, they were twice as good at figuring out which concepts deserved more attention.

When it comes to writing an essay for college, pitching a business plan or creating a work of art, we are hard-wired to believe that our output is above average. As a result, we are blind to what needs improvement. It’s not just that we can’t see any holes and errors; we don’t think they exist. What’s interesting about Ritter’s findings is that they give us a strategy to overcome our overconfidence. The lesson from her research is that in order to recognize our imperfections we must step back and be dilettantes. In other words, get distracted and don’t marry the first draft.

And this brings me back to Dylan’s vomit and Steinbeck’s advice. The reason we should “never correct or rewrite until the whole thing is down” is that we initially don’t know which of our ideas are worthwhile. It’s only after we get everything down that we are able to separate what works from what doesn’t. This is the lesson from Ritter’s research: we need to give the unconscious mind time to mull it over so it can convince the conscious mind to make adjustments. Or, as Nietzsche said in Human, All Too Human: “The imagination of the good artist or thinker produces continuously good, mediocre or bad things, but his judgment, trained and sharpened to a fine point, rejects, selects, connects…. All great artists and thinkers are great workers, indefatigable not only in inventing, but also in rejecting, sifting, transforming, ordering.”

Read more

The Pros And Cons Of Likemindedness

Do opposites attract? Pop culture thinks so. Movies like Pretty Woman and The Notebook suggest that couples with virtually nothing in common are destined for each other. Psychological studies paint a different picture. When people have a choice, they seek people who are just like them. Psychologists call this the similarity-attraction effect (SAE) and it shows itself across many cultures.

The SAE is especially pronounced between romantic couples. For example, in the early 1990s the Chicago Sex Survey collected data to find out where and how Americans met their partners. It found that “people search for – or, in any case, find – partners they resemble and partners who are of comparable ‘quality’… the great majority of marriages exhibit homogamy on virtually all measured traits, ranging from age to education to ethnicity.”

The same is true of our friends. This is what a recent paper by Angela Bahns, Kate Pickett and Christian Crandall at Wellesley College and the University of Kansas demonstrates. The researchers were interested in how the social diversity of a college influenced social relationships: Did more socially diverse schools lead to more diverse relationships?

To find out, they compared the relationships of students at a large state university (the University of Kansas) with those at four small colleges in Kansas. They accomplished this by asking students about their demographic information, behaviors and beliefs (opinions on birth control and underage drinking, for instance). They found that “greater human diversity within an environment leads to less personal diversity.” The students at the University of Kansas, in other words, tended to create more homogeneous social groups than their peers at smaller schools. This means, ironically, that the more opportunities there are to pursue diverse relationships, the more we tend to gravitate towards likeminded people.

This can be a problem. Several studies conducted over the last decade illustrate the importance of intellectual diversity. An analysis of Stanford Business School graduates found that “entrepreneurs with more ‘entropic’ and ‘diverse’ social networks scored three times higher on a metric of innovation, suggesting that the ability to access ‘non-redundant information from peers’ is a crucial source of new ideas.” Similarly, Brian Uzzi and Jarrett Spiro found that the most successful Broadway musicals combined new blood with industry veterans; too much familiarity or novelty within the staff was a killer of quality content.

In the context of marriage the SAE is a good thing. Marriages usually succeed when two likeminded people are involved; the similarity of personality traits is a good predictor of marital stability and happiness. In fact, it’s especially unlikely for people with dissimilar personalities to be attracted to each other. It’s not merely that opposites don’t attract: They often repel.

If opposites don’t attract romantically, why do we have such a propensity to believe that they do? For one thing, we humans love romantic stories. From Romeo and Juliet to EVE and WALL-E to Katniss and Peeta, we can’t help but fantasize about pairs of star-crossed lovers. Unfortunately, because stories sacrifice reality for more passionate and heart-wrenching plots, our perception of romantic relationships is heavily distorted. Not everything has a happy ending.

In brief, then, romantic relationships thrive on similarity. The opposite is true for your social and professional circles: when it comes to generating ideas, being creative or entrepreneurial, intellectually diverse social circles are key.

Political Empathy & Moral Matrices

It’s difficult to make objective predictions about our future self. No matter how hard we try, we’re always influenced by the present. In one study, for example, researchers phoned people around the country and asked them how satisfied they were with their lives. They found that “when people who lived in cities that happened to be having nice weather that day imagined their lives, they reported that their lives were relatively happy; but when people who lived in cities that happened to be having bad weather that day imagined their lives, they reported that their lives were relatively unhappy.”

Similarly, a few years ago researchers went to a local gym and asked people who had just finished working out if food or water would be more important if they were lost in the woods. Like good social scientists, they asked the same question to people who were just about to work out. They found that 92 percent of the folks who just finished working out said that water would be more important; only 61 percent of people who were about to work out made the same prediction.

Physical states are difficult to transcend, and they often cause us to project our feelings onto everyone else. If I’m cold, you must be too. If I like the food, you should too. We are excellent self-projectors (or maybe that’s just me). Sometimes there are more consequential downsides to this uniquely human ability. And this brings me to a new study led by Ed O’Brien of the University of Michigan, recently published in Psychological Science. (Via Maia Szalavitz at Time.com.)

The researchers braved the cold for the first experiment. They approached subjects at a bus stop in January (sometimes the temperature was as low as -14 degrees F) and asked them to read a short story about a hiker who was taking a break from campaigning when he got lost in the woods without adequate food, water and clothing. For half of the subjects the lost hiker was a left-leaning, pro-gay-rights Democrat; the other half read about a right-wing Republican. Next, the researchers asked the subjects their political views and which feeling was most unpleasant for the stranded hiker – being thirsty, hungry or cold. (For female participants, the hiker was described as female; for men, the hiker was male.) While these chilly interviews were being conducted, O’Brien and his team ran the same study in a cozy library. Did the two groups give different answers?

The first thing O’Brien found was consistent with the gym study: 94 percent of the people waiting for the bus said the cold was the most unpleasant feeling for the hiker, compared to only 57 percent of the library dwellers. Here’s where things got interesting: “If participants disagreed with the hiker’s politics… their own personal physical state had no bearing on their response: people chose the cold in equal numbers, regardless of where they were interviewed.” In other words, we don’t show as much empathy towards people who don’t share our political beliefs.

Their findings are disheartening given the current political climate in the United States. If we cannot empathize with someone who doesn’t share our political views, how are we supposed to engage in rational discourse with them? In order to work out our differences, it seems like we need to first recognize that we are the same deep down.

The larger problem is that compassion, empathy and moral sentiments towards other people bind and blind. As one author says, “we all get sucked into tribal moral communities, circling around something sacred and then sharing post-hoc arguments about why we are so right and they are so wrong. We think the other side is blind to truth, reason, science, and common sense, but in fact everyone goes blind when talking about their sacred objects.”

How do we break out of our political matrices? Here’s one idea: let’s take the red pill and realize that we can’t all be right, while remembering that we all have something to contribute. This is what the Asian religions nailed on the head. Yin and Yang aren’t enemies; like night and day, they are necessary for the functioning of the world. Vishnu the preserver (who stands for conservative principles) and Shiva the destroyer (who stands for liberal principles), two of the high gods in Hinduism, cooperate to preserve the universe. It’s a cliché worth repeating: let’s work together to get along.

Read more

Religion, Evolution & What The New Atheists Overlook

The lancet fluke (Dicrocoelium dendriticum) is a clever little parasite. To reproduce, it finds its way into the stomach of a sheep or cow by commandeering an ant’s brain. Once infected, ants exhibit strange behavior: they climb up the nearest blade of grass until they fall, then they climb it again, and again. If the flukes are lucky, a grazing farm animal eats the grass along with the ant – a sure win for the flukes, but a sad and unfortunate loss for the six-legged insect.

Does anything like this happen with human beings? Daniel Dennett thinks so. In the beginning of his book Breaking the Spell, Dennett uses the fluke to suggest that religions survive because they influence their hosts (e.g., people) to do bad things for themselves (e.g., suicide bombing) but good things for the parasite (e.g., Islam). Implicit in Dennett’s example is that religions are like viruses, and that people and societies are better off without them.

Dennett’s position is akin to the rest of the New Atheists: religion is a nasty and irrational byproduct of natural selection. This means that religious beliefs were not directly selected for by evolution any more than our noses evolved to help us keep our glasses from sliding off our faces. In the words of Pascal Boyer, “religious concepts and activities hijack our cognitive resources.” The question is: what cognitive resources influenced religion?

Most cognitive scientists agree that the Hypersensitive Agency Detection Device (abbreviated HADD) played an important role. In brief, the HADD explains why we see faces in the clouds, but never clouds in faces. The neuroscientist Dean Buonomano puts it this way: “We are inherently comfortable assigning a mind to other entities. Whether the other entity is your brother, a cat, or a malfunctioning computer, we are not averse to engaging it in conversation.” This ability endows will and intention to other people, animals and inanimate objects. The HADD produces a lot of false positives (e.g., seeing the Virgin Mary in a piece of toast), and God might be one of them.

Another feature of the human mind that religion might have co-opted is a natural propensity towards a dualistic theory of mind. Dualism is our tendency to believe that people are made up of physical matter (e.g., lungs, DNA, and atoms) as well as an underlying and internal essence. Even the strictest materialist cannot escape this sentiment; we all feel that there is a “me” resting somewhere in our cortices. A belief in disembodied spirits could have given rise to beliefs in supernatural entities that exist independent of matter. Yale psychologist Paul Bloom is a proponent of this view and supports his conclusions with experimental evidence highlighted in his book Descartes’ Baby.

Although the byproduct hypothesis, as it is known, is incomplete, its pieces all point to the same logic: “a bit of mental machinery evolved because it conferred a real benefit, but the machinery sometimes misfires, producing accidental cognitive effects that make people prone to believing in gods.”

This is an important piece of the puzzle for the New Atheists. If religion is the offshoot of a diverse set of cognitive modules that evolved for a variety of problems, then religious beliefs are nothing more than a series of neural misfires that are “correctable” with secular Enlightenment thinking.

Not everyone agrees. The evolutionary biologists David Sloan Wilson and Edward O. Wilson propose that religiosity is a biological adaptation that created communities by instilling a “one for all, all for one” mentality in its members. This is important because it allowed group members to function as a superorganism, which in turn gave them an advantage on the African savannah. “An unshakable sense of unity among… warriors,” Buonomano says, “along with certainty that the spirits are on their side, and assured eternity, were as likely then, as they are now, to improve the chances of victory in battle.” The binding power of religion would have also helped communities form objective moral codes – do unto others as you would have others do unto you – and protected them against free riders.

Jonathan Haidt is making a name for himself by advocating this point. In addition to the group selection hypothesis, Haidt points to our species’ ability to experience moments of self-transcendence. The world’s religions, he believes, are successful because they found ways to facilitate such experiences. Here’s how he explained it in a recent TED lecture:

If the human capacity for self-transcendence is an evolutionary adaptation, then the implications are profound. It suggests that religiosity may be a deep part of human nature. I don’t mean that we evolved to join gigantic organized religions — that kind of religion came along too recently. I mean that we evolved to see sacredness all around us and to join with others into teams that circle around sacred objects, people and ideas. This is why politics is so tribal. Politics is partly profane, it’s partly about self-interest. But politics is also about sacredness. It’s about joining with others to pursue moral ideals. It’s about the eternal struggle between good and evil, and we all believe we’re on the side of the good.

What’s interesting about Haidt’s angle is that it sheds a bad light on the Enlightenment and secular ideals that Western civilization was founded on. We exalt liberty, individualism and the right to pursue our self-interest. But are we ignoring our innate desire to be part of something greater? Are we denying our groupish mentalities? The modern world gives us fixes – think big football games or raves – but I think some atheists are deprived.

And this brings me back to the fluke and the New Atheists. If Haidt is right, and our religiosity was an evolutionary adaptation, then religious beliefs are a feature of, not a poison to, our cognition. The fluke, therefore, is not a parasite but an evolutionary blessing that facilitated the creation of communities and societies. This is not to deny all the bloodshed on behalf of religion. But if religion is an adaptation and not a byproduct, then “we cannot expect people to abandon [it] so easily.”
