
Posts tagged ‘Jonah Lehrer’

Why Traveling Abroad Makes Us More Creative

Originally posted on BigThink.com

Like many college students, I took a semester abroad. I spent the first half of my junior year in London taking classes at UCL, exploring the museums, and learning the difference between two pints, two pounds and two pence. After a few lovely months on the “other” side of the pond I returned home feeling cultured. Of course, the difference between London and New York (where I went to school) was small. But the UK nonetheless influenced me to see the world a bit differently.

Such are the benefits of travel. A few weeks or months in a foreign country won’t necessarily transform our lives, but wandering the streets of Helsinki, Harare or Hong Kong leaves a residue on our minds. When we return home, that cultural footprint is hard to ignore yet difficult to identify. Something’s different, but what?

Given the importance of traveling abroad, it’s no surprise that psychologists study how these experiences affect our cognition. Do they make us smarter or more open-minded? Does learning a foreign language boost IQ? Is it a good idea to live outside of your native country for a while? Consider a study conducted by Lile Jia and his colleagues at Indiana University.

In one experiment the team of psychologists asked participants to list as many different modes of transportation as possible. They explained that the task was created by either Indiana University students studying in Greece (distant condition) or by Indiana University students studying in Indiana (near condition). This small ripple turned out to have large effects: participants in the distant condition generated more modes of transportation and were more original with their ideas.

The second experiment demonstrated similar results. The team asked participants to solve three insight problems. Here’s an example of one:

A prisoner was attempting to escape from a tower. He found a rope in his cell that was half long enough to permit him to reach the ground safely. He divided the rope in half, tied the two parts together, and escaped. How could he have done this?

(The classic solution: the prisoner divided the rope lengthwise, unraveling it into two thinner strands that he tied end to end.) As in the first experiment, Jia and his team told participants that the questions came from a research institute either “around 2,000 miles away” or “2 miles away,” in Indiana. (In a control condition they did not reference a location.) Again, the researchers found that participants in the distant condition generated more solutions than participants in the other two conditions.

A ScientificAmerican.com article on Jia’s study summarizes the results this way:

This pair of studies suggests that even minimal cues of psychological distance can make us more creative. Although the geographical origin of the various tasks was completely irrelevant – it shouldn’t have mattered where the questions came from – simply telling subjects that they came from somewhere far away led to more creative thoughts.

In Imagine, Jonah Lehrer connects this research with a 2009 study out of the Kellogg School of Management and INSEAD. The researchers “reported that students who lived abroad for an extended period were significantly more likely to solve a difficult creativity problem than students who had never lived outside of their birth country.” Lehrer concludes that “the experience of another culture endows the traveler with a valuable open-mindedness, making it easier for him or her to realize that a single thing can have multiple meanings.”

It’s unclear whether this finding is causal or merely correlational – students who go abroad might be endowed with an open and creative mindset in the first place – but the point remains: diverse experiences are good for creativity because they push us to look at problems from multiple points of view.

This brings me to a brand-new study out of Tel Aviv University’s School of Psychological Sciences conducted by Professor Nira Liberman and a team of her students. They wanted to see if “expansive thinking” improves the creative output of 6-to-9-year-olds.

Their experiment was straightforward. The researchers gave the kids a series of photographs displaying nearby objects (a pencil on a desk) and distant objects (a picture of the Milky Way galaxy). Here’s the important part: half of the kids started with the nearby objects and progressed to more distant ones (expansive mindset); the other half saw the photos in reverse order (contractive mindset).

Next, the kids tackled several creativity tests in which they were given an object and asked to name as many different uses for it as possible. The tasks were designed to test “outside of the box” thinking. For example, if the object was a paper clip, an unimaginative response would be “to hold papers together.” More creative answers, on the other hand, would be “a bookmark” or “a Christmas tree decoration.”

Liberman found that kids in the expansive mindset scored significantly better on all measures of creativity. They came up with a greater number of uses, and more creative uses, for the objects. Why? According to Liberman, “spatial distance, as opposed to spatial proximity, was clearly shown to enhance creative performance…. [and] psychological distance can help to foster creativity because it encourages us to think abstractly.”

Two important findings come out of Liberman’s research. The first is that creativity can be taught. David Kelley makes this point precisely in a recent TED talk. Drawing upon personal experience and years of research, Kelley puts it this way:

Don’t let people divide the world into the creative and the non-creative like it’s some God given thing…. People [should] realize they are naturally creative and… these people should let their ideas fly. They should achieve… self-efficacy, [meaning they] should do what they set out to do… And reach a place of creative confidence.

The second point brings me back to London. One way to kill creativity and abstract thinking – two cognitive attributes vital in the 21st century economy – is to maintain a “here and now” perspective. London steered me away from this mindset; it influenced me to adopt a more open-minded perspective.

To be sure, my leisurely strolls through the British Museum didn’t make me smarter, and by no means was I “culturally transformed” upon hearing that ‘soccer’ was actually ‘football’. But it’s remarkable what you can learn by sitting in an English pub for a few hours. For starters, pints are two pounds, not two pence.

Jonah Lehrer and the New Science of Creativity

The following is my latest article, originally published on ScientificAmerican.com. It is a review of Jonah Lehrer’s latest book, Imagine: How Creativity Works, which was released March 19th.

Bob Dylan was stuck. At the tail end of a grueling tour that took him across the United States and through England he told his manager that he was quitting music. He was physically drained – insomnia and drugs had taken their toll – and unsatisfied with his career. He was sick of performing “Blowin’ in the Wind” and answering the same questions from reporters. After finishing a series of shows at a sold-out Royal Albert Hall in London, he escaped to a cabin in Woodstock, New York, to rethink his creative direction.

What came next would change rock ‘n’ roll forever. As soon as Dylan settled into his new home he grabbed a pencil and started writing whatever came to his mind. Most of it was a mindless stream of consciousness. “I found myself writing this song, this story, this long piece of vomit, twenty pages long,” he once told interviewers. The song he was writing started like any children’s book – “Once upon a time” – but what emerged was a tour de force that left people like Bruce Springsteen and John Lennon in awe. A few months later, “Like a Rolling Stone” was released to critical acclaim.

Creativity in the 21st Century

Every creative journey begins with a problem. For Dylan it was the predictability and shallowness of his previous songs. He wasn’t challenging his listeners enough; they were too comfortable. What Dylan really wanted to do was replace the expected with the unexpected. He wanted to push boundaries and avoid appealing to the norm; he wanted to reinvent himself.

For most of human history, the creative process was associated with higher powers; it was about channeling the muses or harnessing one’s inner Apollonian and Dionysian; it was otherworldly. Science had barely touched creativity. In fact, in the second half of the 20th century less than 1 percent of psychology papers investigated aspects of the creative process. This changed in the last decade. Creativity is now one of the most popular topics in cognitive science.

The latest installment is Jonah Lehrer’s Imagine: How Creativity Works – released today. With grandiose style à la Proust Was a Neuroscientist and How We Decide, Lehrer tells stories of scientific invention and tales of artistic breakthroughs – including Dylan’s – while weaving in findings from psychology and neuroscience. What emerges from his chronicles is a clearer picture of what happens in the brain when we exercise – successfully or unsuccessfully – our creative faculties. The question is: what are the secrets to creativity?

How To Think

There’s nothing fun about creativity. Breakthroughs are usually the tail end of frustration, sweat and repeated failure. Consider the story of Swiffer. Back in the 1980s Procter & Gamble hired the design firm Continuum to study how people cleaned their floors. The team “visited people’s homes and watched dozens of them engage in the tedious ritual of floor cleaning. [They] took detailed notes on the vacuuming of carpets and the sweeping of kitchens. When the notes weren’t enough, they set up video cameras in living rooms.” The leader of the team, Harry West, described the footage as the most boring stuff imaginable. After months of poring through the tapes, he and his team knew as much about how people cleaned their floors as anybody else – very little.

But they stuck with it, and eventually landed on a key insight: people spent more time cleaning their mops than they did cleaning the floor. That’s when they realized that a paper towel could be used as a disposable cleaning surface. Swiffer launched in the spring of 1999, and by the end of the year it had generated more than $500 million in sales.

Discovery and invention require relentless work and focus. But when we’re searching for an insight, stepping back from a problem and relaxing is also vital; the unconscious mind needs time to mull it over before the insight happens – what Steven Berlin Johnson calls the “incubation period.” This is the story of Arthur Fry, which Lehrer charmingly brings to life.

In 1974 Fry attended a seminar given by his 3M colleague Spencer Silver about a new adhesive. It was a weak paste, not even strong enough to hold two pieces of paper together. Fry tried to think of an application but eventually gave up.

Later in the year he found himself singing in his church’s choir. He was frustrated with the makeshift bookmarks he fashioned to mark the pages in his hymnal; they either fell out or got caught in the seams. What he really needed was glue strong enough that his bookmarks would stick to the page but weak enough that they wouldn’t rip the paper when he removed them. That’s when he had his moment of insight: why not use Silver’s adhesive for the bookmark? The result became the Post-it Note.

Fry’s story fits well with tales of insight throughout history. Henri Poincaré is famous for thinking up non-Euclidean geometry while boarding a bus, and then there’s Newton’s apple-induced revelation about the law of gravity. Lehrer delves into the relevant research to make sense of these stories at the neurological level. Fascinating studies from Mark Jung-Beeman, John Kounios and Joy Bhattacharya give us good reason to take Lehrer’s advice: “Rather than relentlessly focusing, take a warm shower, or play some Ping-Pong, or walk on the beach.”

When it comes to the creative process, then, it’s important to balance repose with Red Bull. As Lehrer explains: “the insight process… is a delicate mental balancing act. At first, the brain lavishes the scarce resource of attention on a single problem. But, once the brain is sufficiently focused, the cortex needs to relax in order to seek out the more remote association in the right hemisphere, which will provide the insight.”

Other People

The flip side of the creative process is other people. The world’s great ideas are as much about our peers as they are about the individual who makes it into the textbook. To explore how the people around us influence our ideas Lehrer explains the research of Brian Uzzi who, a few years ago, set out to answer this question: what determines the success of a Broadway musical?

With his colleague Jarrett Spiro, Uzzi thoroughly examined a data set that included 2,092 people who worked on 474 musicals from 1945 to 1989. They considered metrics such as reviews and financial success and controlled for talent and any economic or geographic advantages – big New York City musicals would otherwise skew the data. They found that productions failed for two reasons. The first was too much like-mindedness: “When the artists were so close that they all thought in similar ways… theatrical innovation [was crushed].” On the other hand, when “the artists didn’t know one another, they struggled to work together and exchange ideas.” Successful productions, in contrast, struck a balance between novelty and familiarity among their members. This is why West Side Story was such a hit: it balanced new blood with industry veterans.

This is what InnoCentive.com teaches us. InnoCentive is a website where, as Matt Ridley would suggest, ideas go to have sex. The framework is simple: “seekers” go to the website to post their problems for “solvers.” The problems aren’t trivial, but the rewards are lucrative. For example, the Sandler-Kenner Foundation is currently offering a $10,000 reward for anybody who can create “diagnostic tools for identification of adenocarcinoma and neuroendocrine pancreatic cancer at early stages of development.” Another company is offering $8,000 to anyone who can prevent ice formation inside packages of frozen foods.

What’s remarkable about InnoCentive is that it works. Karim Lakhani, a professor at Harvard Business School, conducted a study that found that about 40 percent of the difficult problems posted on InnoCentive were solved within 6 months. A handful of the problems were even solved within days. “Think, for a moment,” Lehrer says, “about how strange this is: a disparate network of strangers managed to solve challenges that Fortune 500 companies like Eli Lilly, Kraft Foods, SAP, Dow Chemical, and General Electric—companies with research budgets in the billions of dollars—had been unable to solve.”

The secret was outside thinking:

The problem solvers on InnoCentive were most effective when working at the margins of their fields. In other words, chemists didn’t solve chemistry problems, they solved molecular biology problems, just as molecular biologists solved chemistry problems. While these people were close enough to understand the challenges, they weren’t so close that their knowledge held them back and caused them to run into the same stumbling blocks as the corporate scientists.

This is the lesson from West Side Story: great ideas flourish under the right balance of minds. John Donne was right: no man is an island.

Conclusion

There are so many wonderful nuggets to take away from Imagine, and Lehrer does an excellent job of gathering stories from history to bring the relevant psychological research to life. The few stories and studies I’ve mentioned here are just the tip of the iceberg.

When I asked him what the takeaway of his book is (if there could be just one) he said:

The larger lesson is that creativity is a catchall term for a bundle of distinct processes. If you really want to solve the hardest problems you will need all these little hacks in order to solve these problems. This is why what the cognitive sciences are saying about creativity is so important.

He’s right. We tend to think of creativity as a single thing, and of people as either creative or not, but the empirical research Lehrer discusses tells a different story. Creativity engages multiple cognitive processes that anybody can access.

This is why Dylan’s story is so important: it’s the story of a musician growing discontent with his creative direction, having a moment of insight, and working tirelessly to bring that insight and new sounds to life, ultimately changing the norm. Dylan’s genius isn’t about a specific skill that nobody else possessed. It’s about his ability to navigate the creative process by using the right parts of the brain at the right times.

Not everybody can be Dylan, but Imagine reminds us that as mysterious and magical as creativity seems, “for the first time in human history, it’s possible to learn how the imagination actually works.”

Why The Future of Neuroscience Will Be Emotionless

In Phaedrus, Plato likens the mind to a charioteer who commands two horses, one irrational and crazed, the other noble and of good stock. The job of the charioteer is to control the horses and proceed towards enlightenment and the truth.

Plato’s allegory sparked an idea that perpetuated throughout the next several millennia of western thought: emotion gets in the way of reason. This makes sense to us. When people act out of order, they’re irrational. No one was ever accused of being too reasonable. Around the 17th and 18th centuries, however, thinkers began to challenge this idea. David Hume turned the tables on Plato: reason, Hume said, was the slave of the passions. Psychological research of the last few decades not only confirms this view; some of it suggests that emotion is the better decision-maker.

We know a lot more about how the brain works than the ancient Greeks did, but a decade into the 21st century researchers are still debating which of Plato’s horses is in control, and which one we should listen to.

A couple of recent studies are shedding new light on this age-old discourse. The first comes from Michael Pham and his team at Columbia Business School. The researchers asked participants to make predictions about eight different outcomes, ranging from the American Idol finals, to the winners of the 2008 Democratic primary, to the winner of the BCS championship game. They also forecast the Dow Jones average.

Pham created two groups. He told the first group to go with their guts and the second to think it through. The results were telling. With the American Idol predictions, for example, the first group correctly picked the winner 41 percent of the time whereas the second group was correct only 24 percent of the time. The high-trust-in-feeling subjects even predicted the stock market better.

Pham and his team conclude the following:

Results from eight studies show that individuals who had higher trust in their feelings were better able to predict the outcome of a wide variety of future events than individuals who had lower trust in their feelings…. The fact that this phenomenon was observed in eight different studies and with a variety of prediction contexts suggests that this emotional oracle effect is a reliable and generalizable phenomenon. In addition, the fact that the phenomenon was observed both when people were experimentally induced to trust or not trust their feelings and when their chronic tendency to trust or not trust their feelings was simply measured suggests that the findings are not due to any peculiarity of the main manipulation.

Does this mean we should always trust our intuition? It depends. A recent study by Maarten Bos and his team identified an important nuance when it comes to trusting our feelings. They asked one hundred and fifty-six students to abstain from eating or drinking (water excepted) for three hours before the study. When they arrived, Bos divided his participants into two groups: one that consumed a sugary can of 7-Up and another that drank a sugar-free alternative.

After waiting a few minutes to let the sugar reach the brain, the students assessed four cars and four jobs, each with 12 key aspects that made them more or less appealing. (Bos designed the study so that an optimal choice was clear, giving him a measure of how well participants decided.) Next, half of the subjects in each group spent four minutes thinking about the jobs and cars (the conscious thought condition), while the other half watched a wildlife film (to prevent them from consciously thinking about the jobs and cars).

Here’s the BPS Research Digest on the results:

For the participants with low sugar, their ratings were more astute if they were in the unconscious thought condition, distracted by the second nature film. By contrast, the participants who’d had the benefit of the sugar hit showed more astute ratings if they were in the conscious thought condition and had had the chance to think deliberately for four minutes. ‘We found that when we have enough energy, conscious deliberation enables us to make good decisions,’ the researchers said. ‘The unconscious on the other hand seems to operate fine with low energy.’

So go with your gut if your energy is low. Otherwise, listen to your rational horse.

Here’s where things get difficult. By now the debate over the role reason and emotion play in decision-making is well documented. Psychologists have written thousands of papers on the subject. It shows in the popular literature as well. From Antonio Damasio’s Descartes’ Error to Daniel Kahneman’s Thinking, Fast and Slow, the lay audience knows about both the power of thinking without thinking and their predictable irrationalities.

But what exactly is being debated? What do psychologists mean when they talk about emotion and reason? Joseph LeDoux, author of popular neuroscience books including The Emotional Brain and Synaptic Self, recently published a paper in the journal Neuron that flips the whole debate on its head. “There is little consensus about what emotion is and how it differs from other aspects of mind and behavior, in spite of discussion and debate that dates back to the earliest days in modern biology and psychology.” Yes, what we call emotion roughly correlates with certain parts of the brain; it is usually associated with activity in the amygdala and related systems. But we might be playing a language game, and neuroscientists are reaching a point where an understanding of the brain requires more sophisticated language.

As LeDoux sees it, “If we don’t have an agreed-upon definition of emotion that allows us to say what emotion is… how can we study emotion in animals or humans, and how can we make comparisons between species?” The short answer, according to the NYU professor, is “we fake it.”

With this in mind LeDoux introduces a new term to replace emotion: survival circuits. Here’s how he explains it:

The survival circuit concept provides a conceptualization of an important set of phenomena that are often studied under the rubric of emotion—those phenomena that reflect circuits and functions that are conserved across mammals. Included are circuits responsible for defense, energy/nutrition management, fluid balance, thermoregulation, and procreation, among others. With this approach, key phenomena relevant to the topic of emotion can be accounted for without assuming that the phenomena in question are fundamentally the same or even similar to the phenomena people refer to when they use emotion words to characterize subjective emotional feelings (like feeling afraid, angry, or sad). This approach shifts the focus away from questions about whether emotions that humans consciously experience (feel) are also present in other mammals, and toward questions about the extent to which circuits and corresponding functions that are relevant to the field of emotion and that are present in other mammals are also present in humans. And by reassembling ideas about emotion, motivation, reinforcement, and arousal in the context of survival circuits, hypotheses emerge about how organisms negotiate behavioral interactions with the environment in process of dealing with challenges and opportunities in daily life.

Needless to say, LeDoux’s paper changes things. Because emotion is an unworkable term for science, neuroscientists and psychologists will have to understand the brain on new terms. And when it comes to the reason-emotion debate – which of Plato’s horses we should trust – they will have to rethink certain assumptions and claims. The difficult part is that we humans, by our very nature, cannot help but resort to folk psychology to explain the brain. We deploy terms like soul, intellect, reason, intuition and emotion but these words describe very little. Can we understand the brain even though our words may never suffice? The future of cognitive science might depend on it.


Why Intellectual Diversity Is Important

Below is my latest column at The Creativity Post in its entirety. I argue that good ideas benefit from intellectual diversity. Incidentally, I came across this wonderful NYTimes article on the same subject at Farnam Street blog this morning. It discusses Scott Page’s The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools and Societies.

A few years ago Brian Uzzi of Northwestern University and Jarrett Spiro of Stanford University set out (pdf) to answer this question: What determines the success of a Broadway musical? Uzzi and Spiro began by poring through a data set that included 2,092 people who worked on 474 musicals from 1945 to 1989. To determine how good each production was they considered metrics such as reviews and financial success. They also controlled for things like talent and economic and geographic conditions to ensure that the big New York City musicals didn’t skew the data.

What they found was that successful productions relied on two components: “The ratio of new blood versus industry veterans, and the degree to which incumbents involved their former collaborators and served as brokers for new combinations of production teams.” In other words, productions that worked found a balance between strong social ties and weak ones, rookies and veterans, familiarity and novelty. They weren’t flooded with a group of like-minded people, but neither was everyone a stranger to everyone else. Uzzi and Spiro hypothesized that intellectual diversity mattered because “small world networks that help to create success or failure in Broadway musicals… face liabilities in the realms of innovation and collaboration that impede their creating new, successful musical hits… too much small-worldliness can undermine the very benefits it creates at more moderate levels, due to a decrease in artists’ ability to innovate and break convention.”
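To make “small-worldliness” concrete, here is a minimal sketch, assuming the networkx library, of the network property the study turns on – the toy graph sizes are hypothetical and this is not Uzzi and Spiro’s actual analysis. A small-world collaboration network is highly clustered (your collaborators also know each other) yet still has short paths between any two members:

```python
# Compare a small-world collaboration graph to a fully random one.
# Hypothetical toy sizes: 30 artists, each with roughly 4 collaborators.
import networkx as nx

n, k = 30, 4
# Moderately rewired ring lattice: clustered, but with shortcuts.
G = nx.connected_watts_strogatz_graph(n, k, p=0.2, seed=1)
# Fully rewired (random) graph of the same size, for comparison.
R = nx.connected_watts_strogatz_graph(n, k, p=1.0, seed=1)

for name, g in [("small world", G), ("random", R)]:
    C = nx.average_clustering(g)             # local cliquishness
    L = nx.average_shortest_path_length(g)   # global reachability
    print(f"{name:>11}: clustering={C:.2f}, avg path length={L:.2f}")
```

On Uzzi and Spiro’s reading, creative output peaks at moderate small-worldliness: enough clustering for teams to work together smoothly, enough shortcuts to keep new ideas flowing in.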

What’s alarming about their conclusions is that a wealth of psychological data suggests most of us balk when given the chance to connect with people unlike ourselves. Consider a study (pdf) done back in 2007 by Paul Ingram and Michael Morris at Columbia University. The psychologists gathered a group of executives at a cocktail mixer and encouraged them to exchange ideas, network and meet new people. Like good behavioral scientists, Ingram and Morris sneaked microphones onto all the name tags to record what was said. Before the mixing began, the executives said they wanted to “meet as many different people as possible” or “expand their social network,” but Ingram and Morris found just the opposite. “Do people mix at mixers?” they asked in the concluding remarks of their study. “The answer is no… our results show that guests at a mixer tend to spend the time talking to the few other guests whom they already know well.” Or, as Jonah Lehrer somewhat sarcastically puts it in a recent post, “investment bankers chatted with other investment bankers, and marketers talked with other marketers, and accountants interacted with other accountants.”

Ingram and Morris’s study should be taken as a warning: if we want to broaden our intellectual horizons, it’s important to remember our natural tendency to drift towards, and eventually connect with, only like-minded people. Stories of innovation and discovery throughout history illustrate how important this point is. My favorite, which doesn’t get told enough, is the discovery of the cosmic microwave background (CMB), a key piece of evidence that changed our understanding of the origin of the universe forever.

The story begins in Holmdel, New Jersey, at Bell Labs, where Arno Penzias and Robert Wilson were experimenting with a horn antenna originally built to detect radio waves that bounced off Echo balloon satellites. After spending some time with the antenna they ran into a problem: a mysterious hissing noise – like static on the radio – that persisted all over the sky, day and night. The duo went to great lengths to eliminate the hiss – they even washed bird droppings off the dish – but it was all to no avail. Meanwhile, at Princeton University just 60 miles down the road, Robert Dicke, Jim Peebles and David Wilkinson were trying to find evidence for the Big Bang in the form of microwave radiation. They predicted that if the Big Bang did in fact take place, it must have scattered an enormous blast of radiation throughout the universe, much as a rock thrown into a lake creates ripples that spread outward. With the right instrumentation, they believed, this radiation could be detected all over the sky, day and night.

It was only a matter of time before serendipity set in and a mutual friend, MIT professor of physics Bernard F. Burke, told Penzias about what the researchers at Princeton were looking for. After that, the two teams exchanged ideas and realized the implications of their work. It turned out that the hiss Penzias and Wilson were trying so hard to get rid of was precisely the radiation the Princeton team was looking for. A few calculations and a published paper later, Penzias and Wilson landed the 1978 Nobel Prize in Physics; the rest of us are still reaping the benefits of a more complete understanding of the universe.

The story of CMB reminds us that when it comes to solving difficult problems, a fresh set of eyes, even one from a different field, is vital. The same pattern shows up in one form or another many times throughout history. The world’s great ideas are as much about other people as they are about the individual who makes it into the textbook. As Matt Ridley explains in a TED lecture in a slightly different context, “what’s relevant to a society is how well people are communicating their ideas and how well they are cooperating not how clever the individuals are… it’s the interchange of ideas, the meeting and mating of ideas between [them] that [causes]… innovation.”

There is a wonderful website called InnoCentive.com that facilitates what Ridley calls the meeting and mating of ideas. The framework of InnoCentive is quite simple: “seekers” go to the website to post their problems for “solvers.” Problems range from the “Recovery of Bacillus Spore from Swabs” to “Blueprints for a Medical Transportation Device for Combat Rescue,” and multibillion-dollar companies like General Electric and Procter & Gamble often post them with cash prizes of up to $1 million.

The amazing part is that it works. A study (pdf) by researchers at Harvard Business School found that about 33 percent of problems posted on InnoCentive were solved on time. Why does InnoCentive work? For the same reason that successful Broadway plays do, and that the CMB was discovered: intellectual diversity. If an organic chemistry problem attracted only organic chemists, it tended to stay unsolved. However, if a biologist got involved with that same problem, the chances were greater that it would be solved. The implication should make you think: solvers were at their best when they were at the margins of their fields of expertise.

Maybe it sounds obvious to suggest that a proper mixture of minds is important for accomplishing tasks, but remember the lesson from Ingram and Morris’s cocktail mixer study: it’s really hard not to surround yourself with people like you. Don’t hang out with too many opposites, though; we don’t want another Spider-Man: Turn Off the Dark.

Cognitive Obstacles: Why Distractions Can Improve Creativity and Problem-Solving


Getting things done takes focus. When it comes to studying for exams or preparing for presentations we strive to get “in the zone,” that magical state where time seems to stop and we gain a sense of complete control while becoming totally absorbed in what needs to be done (what the Hungarian psychologist Mihály Csíkszentmihályi terms flow). Auguste Rodin’s The Thinker captures this visually: hunched over, chin resting on hand, the statue is the very image of deep concentration.

When it comes to getting things done, distractions and obstacles are thought to be bad. But a few recent studies paint a different picture. To begin, consider Shane Frederick’s “Cognitive Reflection Test.” In it, participants are asked to answer a series of word problems, including the following:

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

  • 100 minutes
  • 5 minutes

In a lake, there is a patch of lily pads. Every day, that patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

  • 24 days
  • 47 days

Frederick had two groups answer the puzzles. The first saw them in a “small font in washed-out gray print,” and the other saw them in a “normal font.” The hard-to-read versions were designed to induce “cognitive strain.” Frederick found that “90 percent of the students who saw the Cognitive Reflection Test in normal font made at least one mistake in the test, but the proportion dropped to 35 percent when the font was barely legible.” In other words, participants performed better when the puzzles were harder to see.

What explains this?

The easier a problem is, the less time we are going to spend on it. This makes sense most of the time; we complete simple math problems on autopilot because they don’t demand very much cognitive energy. When we encounter a difficult problem we spend more time on it because it requires more from our neurons. In other words, we know that the chances of making a mistake are higher, so we pay more attention. But sometimes difficult problems are disguised as easy ones, and we run into trouble when we give the former only the effort the latter deserves. Frederick’s study reminds us of this. (The answers, by the way, are 5 minutes and 47 days.)
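Since the whole point is that the intuitive answers are wrong, it is worth checking the arithmetic. Here is a minimal sketch in Python – my own illustration, obviously not part of Frederick’s test – that works out both answers:

```python
# Widgets: 5 machines make 5 widgets in 5 minutes, i.e. each machine
# makes one widget every 5 minutes; the rate scales with the machines.
rate = 5 / 5 / 5                       # widgets per machine per minute
print(100 / (100 * rate))              # 5.0 minutes -- not 100

# Lily pads: the patch doubles daily and covers the lake on day 48,
# so step one doubling backwards to find the half-covered day.
coverage, day = 1.0, 48                # whole lake covered on day 48
coverage, day = coverage / 2, day - 1  # undo one doubling
print(day)                             # 47 -- not 24
```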

Let’s consider two more experiments, which Jonah Lehrer highlighted on his blog the other day. Both come from a study led by Janina Marguc out of the University of Amsterdam. In one, Marguc and her team had participants complete a computer maze game. She created two groups: one had to solve the maze with a blocking obstacle, which made it harder to find the escape route, and the other had to solve the maze without one. Then both groups were given a remote association test. Here’s an example: what word connects “envy,” “golf,” and “beans”? (Green.) Marguc found that the group exposed to the blocking obstacle solved 40 percent more remote associate puzzles. This means, as Lehrer explains, that “the constraint had forced them into a creative mindset; their imaginations benefited from the struggle.”

In the second experiment Marguc asked participants to solve anagrams. Again, she created two groups: one was forced to listen to a neutral voice repeating words while they worked and the other was not. Next, Marguc had them complete a test that assessed global versus local thinking. “A more global thought process is,” as Lehrer explains, “generally ideal for coming up with truly creative solutions, as it makes people more likely to notice cross-cutting connections.” To do this she asked them to complete a Navon letter task, which asks participants to respond quickly to images of large letters composed of smaller letters – imagine the letter E written with a bunch of little A’s, and the letter A written with a bunch of little E’s. She found that those who listened to the neutral voice perceived the letters holistically while those who did not saw the particular letters. That is, the “neutral voice” participants saw the big E as an E while the other group saw it as a collection of A’s.
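If it is hard to picture a Navon figure, this toy snippet prints a crude one – the letter shape is my own hypothetical stand-in, not the study’s actual stimuli. Global processing reads the big E; local processing fixates on the little A’s:

```python
# Render a crude Navon figure: the global shape is an E, the local
# elements are A's. The 5x5 mask below is a hypothetical letter E.
E_SHAPE = ["XXXXX",
           "X....",
           "XXXX.",
           "X....",
           "XXXXX"]

def navon(shape, local_letter):
    # fill each marked cell with the local letter, the rest with spaces
    return "\n".join("".join(local_letter if c == "X" else " " for c in row)
                     for row in shape)

print(navon(E_SHAPE, "A"))
```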

Taken together, these two experiments, along with the other experiments in the study, “suggest that obstacles trigger an ‘if obstacle, then start global processing’ response, primarily when people are inclined to stay engaged and finish ongoing activities.” Or Lehrer’s take: “this is why constraints are so important: It’s not until we encounter an unexpected hindrance – a challenge we can’t easily resolve – that the chains of cognition are loosened, giving us newfound access to the weird connections simmering in the unconscious.”

I want to connect Frederick’s work with Marguc’s. The two describe different phenomena, to be sure: cognitive strain illustrates how cognition is affected by “the current level of effort and the presence of unmet demands,” while Marguc’s study demonstrates how mental obstacles force us to think holistically and generate more creative solutions. But both turn on the same principle: the important role of mental distractions and obstacles in problem solving.


When I was in school I was taught the importance of studying in an environment with few distractions. There is more than a grain of truth to this; Facebook, text messaging, and email clearly get in the way of more important “needs.” But sometimes mental distractions and obstacles – those cognitive hiccups that force us to focus – are a good thing. The work of Frederick and Marguc suggests that when it comes to solving word problems, completing mazes, tackling a Navon letter task and the like, it is important that your brain doesn’t get too comfortable. We want to get “in the zone,” but doing so might cause us to make more mistakes than we would like.

The takeaway message is not just a reminder to remain focused during easy tasks, but also a reminder of the important role that cognitive constraints play in our creative (as Lehrer’s post suggests) and analytical (as Frederick’s study suggests) thinking. Stravinsky was right, then, to say that “the more constraints one imposes, the more one frees one’s self… the arbitrariness of the constraint serves only to obtain precision of execution.”

The Aha! Moment: How Relaxation Helps the Creative Process

I hated the SAT. It was long, difficult and very taxing. I remember spending lengthy periods of time on individual problems only to draw complete blanks. I did the test preps, the practice exams, and even studied multiple-choice strategies, but it was all to no avail. Why? One reason is that I am a horrible test taker. Some readers may take that as a euphemism for my being stupid – fair enough. But another reason is that tests stress me out enormously. And as anyone who shares my pain can tell you, stress is seriously detrimental to a decent score. Great test takers get in the zone, breeze through problems without second-guessing, and live to see the next day; I was, and still am, not one of those people.

There is something inherently mysterious about problem solving. I could never tell you why I had such a difficult time figuring out those stupid analogies, and I am sure my more intelligent counterpart would have an equally difficult time telling you why he or she had such an easy time. This is because much of what happens in our brains when we are trying to solve a problem is unconscious; our conscious selves are forced to wait patiently while the answer decides if it wants to “show up” or not. The mystery eludes neuroscientists too. In a New Yorker article a few years back, Mark Beeman, a cognitive neuroscientist at Northwestern University, told science journalist Jonah Lehrer that moments of insight “[are] one of those defining features of the human mind, and yet we have no idea how or why it happens.”

To lessen the unknowns, Beeman began studying what happens in the brain when we solve problems and have moments of insight. The first thing he did was develop a series of word puzzles that he called Compound Remote Associate Problems (CRAP, yes, that’s funny) for his participants. To solve a CRAP problem (still funny), you have to find a word that can be combined with three given words. For example, if you have “pine,” “crab,” and “sauce,” the correct answer is “apple” (pineapple, crabapple, and applesauce). While participants were busy musing over the word problems, Beeman, along with his colleague John Kounios, measured their brain activity using fMRI and EEG.
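To see how constrained the search is, here is a toy brute-force solver in the spirit of the puzzles; the candidate list and compound dictionary are my own illustrative stand-ins, not Beeman’s materials:

```python
# Solve a compound remote associate item: find the candidate word that
# forms a familiar compound with all three cue words, in either order.
CUES = ("pine", "crab", "sauce")
CANDIDATES = ["apple", "cake", "tree", "grass"]
COMPOUNDS = {"pineapple", "crabapple", "applesauce",
             "crabcake", "pinetree", "crabgrass"}

def joins(cue, word):
    return cue + word in COMPOUNDS or word + cue in COMPOUNDS

print([w for w in CANDIDATES if all(joins(c, w) for c in CUES)])  # ['apple']
```

Humans, of course, don’t grind through a dictionary this way, which is exactly what makes these puzzles useful for catching insight in the act.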

They found several things. The first was a spike in activity in the anterior superior temporal gyrus (aSTG) moments before the insight. Not much is known about the aSTG, although previous research links it to the processing of metaphors. This makes some sense: we understand metaphors by linking seemingly unrelated ideas. The second finding was less technical and more remarkable. By looking at the EEG data, which appears on a computer screen in real time, they could predict up to eight seconds in advance whether someone was going to find the answer. What tipped them off were alpha waves, electrical neural oscillations linked to times when we are most relaxed. They show up when we are lying in bed, taking a warm shower or strolling through the park. You could think of alpha waves as the quiet voice in the back of your head that subtly reminds you what the right answer is. (Funny story: I was once hooked up to an EEG cap in college as part of a neuro lab. My task was very simple. I watched a series of sentences flash up on a screen in front of me. Unfortunately, I hadn’t slept the night before, the room was dark, and I had been yearning for a nap the whole day. Naturally, I started dozing off. Just as that happened the professor stopped the experiment and sent me home. I tried to play it off, but he told me my alpha waves gave it all away – I was falling asleep and he knew just by looking at the data.)
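For readers wondering what “seeing alpha waves in the data” amounts to: alpha is roughly the 8–12 Hz band of the EEG, and its strength is a routine band-power calculation. Below is a minimal sketch on synthetic data, assuming numpy and scipy; it is not Beeman and Kounios’s actual pipeline:

```python
# Estimate alpha-band (roughly 8-12 Hz) power in an EEG trace.
import numpy as np
from scipy.signal import welch

fs = 256                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)              # ten seconds of "recording"
# synthetic trace: a 10 Hz alpha rhythm buried in noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # power spectral density
band = (freqs >= 8) & (freqs <= 12)
print(f"alpha-band power: {np.trapz(psd[band], freqs[band]):.3f}")
```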

Here’s the interesting part. Culture tells us that Red Bull, coffee and intense focus are necessary for anyone to get work done. But Beeman and Kounios’s findings paint a different picture – it is when the brain is calm and relaxed that it has those moments of insight. This was supported when they brought in a Buddhist monk to solve CRAP problems. After he failed dozens in a row, they saw his alpha waves spike and watched as he solved the next 27 in a row in no time. He was an “insight machine,” as Kounios described him. As Lehrer summarizes in his New Yorker article,

One of the surprising lessons of this research is that trying to force an insight can actually prevent the insight. While it’s commonly assumed that the best way to solve a difficult problem is to focus, minimize distractions, and pay attention only to the relevant details, this clenched state of mind may inhibit the sort of creative connections that lead to sudden breakthroughs. We suppress the very type of brain activity that we should be encouraging.

To be sure, those double-shot espressos help – just not all the time. Neuroscience research like Beeman and Kounios’s simply suggests that being calm and relaxed is just as important as being amped. So the next time I take the SAT, which will be never, maybe I should chill and let the answer find my consciousness instead of my consciousness finding it.


A Brief History of Popular Psychology: An Essay

It is unclear when the popular psychology movement started – perhaps with Malcolm Gladwell’s The Tipping Point or Steven Levitt and Stephen Dubner’s Freakonomics – or how it should be defined, but it can be described generally as the public’s growing interest in understanding people and events from a sociological, economic, psychological, or neurological point of view.

Over the last decade the New York Times bestseller list has seen a number of these books: Ariely’s Predictably Irrational (2008) and The Upside of Irrationality (2010), Gilbert’s Stumbling on Happiness (2006), Haidt’s The Happiness Hypothesis (2006), Lehrer’s How We Decide (2009), and Thaler & Sunstein’s Nudge (2008). What unites them is their attempt to “explore the hidden side of everything” by synthesizing numerous academic studies in a relatable way, drawing upon interesting real-world examples, and providing appealing suggestions for how one can better understand the world, and one’s own decisions and behaviors within it.

The popular psychology movement is the result of a massive paradigm shift, what many call the cognitive revolution, that took place in the second half of the 20th century. Although its starting point is unclear, George A. Miller’s 1956 “The Magical Number Seven, Plus or Minus Two” and Noam Chomsky’s 1959 review of B. F. Skinner’s Verbal Behavior were, among others, important publications that forced psychology to become increasingly cognitive. Whereas behaviorists – who represented the previous paradigm – considered only the external, those involved in the cognitive revolution sought to explain behavior by studying the internal; behavior was therefore thought to be dictated by the brain, not the environment.

The cognitive revolution naturally gave rise to the cognitive sciences – neuroscience, linguistics, artificial intelligence, and anthropology – all of which began to study how human brains process information. A big part of the revolution revolved around the work of the psychologists Daniel Kahneman and Amos Tversky, who developed the heuristics and biases program in the early 1970s and changed the way human judgment was understood. The program had two goals. First, it demonstrated that the mind has a series of mental shortcuts, or heuristics, that “provide subjectively compelling and often quite serviceable solutions to… judgmental problems.” And second, it suggested that underlying these heuristics were biases that “[departed from] normative rational theory.”

Kahneman and Tversky’s work was vital because it questioned the notion that judgment was an extensive exercise based on algorithmic processes. Instead, it suggested that people’s decisions and behaviors are actually influenced by “simple and efficient… [and] highly sophisticated… computations that the mind had evolved to make.”

Their work was complemented by Richard Nisbett and Lee Ross’s 1980 book Human Inference: Strategies and Shortcomings of Social Judgment, which outlined how people’s “attempts to understand, predict, and control events in their social sphere are seriously compromised by specific inferential shortcomings.” From this, a list of cognitive biases began to accumulate, including attentional bias, confirmation bias, the endowment effect, status quo bias, the gambler’s fallacy, the primacy effect, and more.

The heuristics and biases program was just one part of the cognitive revolution, however. The other equally important aspects came a bit later, when psychologists began to empirically study how unconscious processing influences behavior and conscious thought. These studies stemmed from the 1977 paper “Telling More Than We Can Know: Verbal Reports on Mental Processes,” by Richard Nisbett and Timothy Wilson. Nisbett and Wilson argued that “there may be little or no direct introspective access to higher order cognitive processes,” thereby introducing the idea that most cognition takes place automatically at the unconscious level.

Wilson continued this research in the 80s and 90s, eventually developing the concept of the “adaptive unconscious,” a term he uses to describe our ability to “size up our environments, disambiguate them, interpret them, and initiate behavior quickly and non-consciously.” He argued that the adaptive unconscious is an evolutionary adaptation for navigating the world with limited attention. This is why we are able to drive a car, type on a computer, or walk without having to think about it.

Complementing Wilson was Yale psychologist John Bargh, who contributed significantly to the study of how certain stimuli influence people’s implicit memory and behavior. In numerous experiments, Bargh demonstrated that people’s decisions and behaviors are greatly influenced by how they are “primed.” In one case, Bargh showed that people primed with rude words such as “aggressively,” “bold,” and “intrude” were on average about 4 minutes quicker to interrupt an experimenter than participants primed with polite words such as “polite,” “yield,” and “sensitively.”

Also in the 80s and 90s, neuroscientists began to understand the role of emotion in our decisions. In his 1995 book Descartes’ Error, Antonio Damasio explicates the “somatic marker hypothesis” to suggest that, contrary to traditional western thought, a “reduction in emotion may constitute an equally important source of irrational behavior.” NYU professor Joseph LeDoux was also instrumental in studying emotions. Like Wilson, Nisbett, and Bargh, LeDoux argued that an understanding of conscious emotional states requires an understanding of “underlying emotional mechanisms.”

Along with emotion and the unconscious, intuition is another topic that has been heavily researched in the past few decades. It has been identified and studied both as a way of thinking and as a talent. As a way of thinking, intuition more or less corresponds to Wilson’s adaptive unconscious; it is an evolved ability that helps people effortlessly and unconsciously disambiguate the world – the ability to easily distinguish males from females, one’s own language from another, or danger from safety.

Intuition as a talent was found to be responsible for a number of remarkable human capabilities, most notably those of experts. As Malcolm Gladwell says in his 2005 best seller Blink, intuitive judgments “don’t logically and systemically compare all available options.” Instead, they act on gut feelings and first impressions that cannot be explained rationally. And most of the time, he continues, acting on these initial feelings is just as valuable as acting on more “thought out” feelings.

By the 1990s, when the “revolution in the theory of rationality… [was] in full development,” the line between rational and irrational behavior became blurred as more and more studies made it difficult to determine what constituted rational behavior. On one hand, some (mainly economists) maintained rationality as the norm even though they knew that people deviated from it. On the other hand, individuals like Herbert Simon and Gerd Gigerenzer argued that the standards for rational behavior should be grounded in ecological and evolutionary considerations. In either case, though, rational choice theory remained the benchmark. Because of this, the 1990s saw books such as Stuart Sutherland’s Irrationality (1994), Massimo Piattelli-Palmarini’s Inevitable Illusions: How Mistakes of Reason Rule Our Minds (1996), and Thomas Gilovich’s How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (1991). Each perpetuated the idea, as the titles imply, that behavior and decision-making are to be judged against a certain standard or norm (in this case, rational choice theory).

However, when all the facets of the cognitive revolution – cognitive biases and heuristics, the unconscious, emotion, and intuition – are considered, the idea that we act rationally begins to look extremely weak; this observation has heavily influenced the popular psychology movement. Pick up any popular psychology book and you will find Kahneman, Tversky, Nisbett, Wilson, Bargh, Damasio, LeDoux, and others heavily cited in arguments that run contrary to rational actor theory.

What’s interesting, and my last post touched on this, is that each popular psychology author has something different to say: Dan Ariely pushes behavioral economics to argue that we are all predictably irrational; Damasio argues that reason requires emotion; Gladwell, David Myers, and Wilson suggest that most thought is unconscious and our intuitive abilities are just as valuable as our rational ones; Daniel Gilbert and Jonathan Haidt illustrate how our cognitive limitations affect our well-being; Barry Schwartz shows how too much choice can actually hurt us; and Jonah Lehrer draws upon neuroscience to show the relationship between emotion and reason in our decision-making.

As a result of all these assertions, the human condition has become seriously complicated!

If there is something to conclude from what I have outlined, it is this: implicit in any evaluation of behavior is the assumption that human beings have a nature or norm, and that their behavior deviates from it. The popular psychology movement, however, shows that our brains are not big enough to understand human behavior, and our tendency to summarize it so simplistically is a reflection of this. We aren’t rational, irrational, or intuitive; we are, in the words of Ke$ha, who we are.

Brains, Music, and The Bad Plus

A lot has been written about the neuroscience of music lately, including these two articles in the NYTimes, a blog post by Jonah Lehrer, books by Oliver Sacks and Daniel Levitin, and a paper in Nature Neuroscience. What are they all saying? To get a sense, meet The Bad Plus, a Minneapolis trio known for a unique brand of rock-infused avant-garde jazz. The Bad Plus have been around for just over a decade and have made a name for themselves by covering famous hits from the 80s and 90s – everything from Blondie’s “Heart of Glass” to Nirvana’s “Smells Like Teen Spirit” – as well as producing original material.

If you have ever listened to The Bad Plus you will know that their music can be a bit challenging. Instead of the standard verse-chorus, 4/4 time structure of most pop songs, Bad Plus songs are much more chaotic. Often switching from one unusual time signature to the next (5/16 to 3/4 to 10/8, for example), speeding up and slowing down the tempo, and rarely repeating previous motifs, their songs could be classified as lawless. Beneath all the disorder, however, The Bad Plus maintain a deep and steady structure that binds their songs together – and this is what makes them successful musicians.

It is their ability to combine what we expect with what we don’t that separates them from most. Sometimes this is frustrating, and other times it is confusing, but it is ultimately enjoyable. Why? It all goes back to patterns, expectations, and predictions – three things that brains love. When it comes to music, brains are focused on identifying patterns, forming expectations, and then predicting where the song will go based on the patterns and expectations they have identified and formed. Brains like it when they do this successfully and hate it when they don’t. This is one reason singers like Britney Spears and Justin Timberlake are so popular – their songs are structured by patterns and expectations that are very easy to predict. When we listen to “Baby One More Time” or “SexyBack” we know exactly what we are going to get.

However, groundbreaking musicians like The Bad Plus know that it is ultimately more enjoyable to hear a song violate an expectation than fulfill one – this is why many of their songs replace the expected with the unexpected. Unlike a pop song, a Bad Plus song challenges your brain to figure out the new pattern. Many times this is difficult, and this is probably why the average listener does not give The Bad Plus a chance. But if you give your brain enough time to figure out the new pattern it will reward you. It’s like doing a difficult math problem: at first it sucks, but it feels great to figure it out, especially if you worked hard to get there.

All of this is explained by neuroscientist Daniel J. Levitin in his 2006 book This Is Your Brain on Music:

As music unfolds, the brain constantly updates its estimates of when new beats will occur, and takes satisfaction in matching a mental beat with a real-in-the-world one, and takes delight when a skillful musician violates that expectation in an interesting way – a sort of musical joke that we’re all in on. Music breathes, speeds up, and slows down just as the real world does, and our cerebellum finds pleasure in adjusting itself to stay synchronized (Levitin, p. 191).

One of my favorite Bad Plus songs, which exemplifies what Levitin is talking about, is a cover of the Academy Award-winning theme song, “Titles,” from the 1981 hit Chariots of Fire. Below is a video of The Bad Plus performing this song live, and there are three things you should pay attention to. First, notice how the song begins with what you are familiar with – the Chariots of Fire theme. Second, listen at the 1:53 mark to how The Bad Plus deviate from what you are familiar with, and notice that you do not find this particularly appealing. Finally, if you were patient enough to listen through the entire middle section, where the band seems to get buried in its own sound, pay close attention to what happens at 5:39. Amidst a smattering of bass notes and drum crashes, pianist Ethan Iverson slowly brings back the familiar Chariots of Fire motif. The song climaxes at 6:22 when all three members strike the same chord and deliver “Titles” as you know it.

If your brain is like most, it will love this moment. As Levitin explained, it is an instance in which the brain rewards itself for understanding a pattern that had previously been violated – this is why Bad Plus songs are ultimately rewarding. They establish a pattern we know, they deviate from it, and then they reward us by bringing the pattern back to its original state. (For another good example of this, listen to their cover of the Radiohead song “Karma Police.”) I challenge you to listen to “Titles” several times – give your brain a chance to “get to know its patterns” so that it can successfully predict what comes next. After enough listens, I suspect the 6:22 mark will become extremely enjoyable.

How a musician understands and uses patterns, expectations, and prediction largely defines the quality of his or her music. Musicians who only deliver what is expected may be popular, but they certainly won’t go down in history as among the best. It is people like Dylan, who took folk electric, or the Ramones, who introduced punk, or Jay-Z, Kanye West, and Girl Talk, who sampled other songs to create new ones, who will be remembered. Though it took audiences some time to get used to these artists, their music became celebrated once brains adapted to the new patterns – electric, punk, or sampling.

I look forward to even more literature that addresses the relationship between brains and music. If the last few years are indicative, we should see many more insights.
