
Posts tagged ‘Steven Pinker’

What Believers and Atheists Can Learn From Each Other (co-written with Rabbi Geoff Mitelman)

Here's a forthcoming article for the Huffington Post's religion blog that I've written with Rabbi Geoff Mitelman, a friend and fellow cognitive science enthusiast. We discuss atheism and the psychology of belief. Check out his blog, Sinai and Synapses.

Rabbi Geoffrey Mitelman: It’s inherently challenging for believers and atheists to have productive conversations. Discussing topics such as belief and nonbelief, the potential irrationality of religion, or the limits of scientific knowledge is difficult since each side often ends up more firmly entrenched in their own worldview.

But one bright person interested in broadening the conversation is Sam McNerney, a science writer who focuses on cognitive science and an atheist interested in religion from a psychological point of view.

I found Sam through his writing on ScientificAmerican.com, and started reading his blog Why We Reason and his posts on BigThink.com. We discovered that even though we approached religion from different perspectives, we had great respect for each other.

So as two people with different religious outlooks we wondered: what can we learn from each other?

Sam McNerney: There are many things we can learn. Let’s take one: the role of authority.

A recent New York Times article points out that secular liberal atheists tend to conflate authority, loyalty and sanctity with racism, sexism and homophobia. It's not difficult to see why. Societies suffer when authority figures, motivated by sacred values and religious beliefs, forbid their citizens from challenging the status quo. But some degree of respect for authority and the principles it upholds is necessary if societies are to maintain order and justice and function properly. The primatologist Frans de Waal explains it this way: "Without agreement on rank and a certain respect for authority there can be no great sensitivity to social rules, as anyone who has tried to teach simple house rules to a cat will agree." (Haidt, 106)

Ironically, atheists’ steadfast allegiance to rationality, secular thinking and the importance of open-mindedness blinds them to important religious values including respect for authority. As a result, atheists tend to confuse authority with exploitation and evil and undervalue the vital role authority plays in a healthy society.

Geoff: You accurately bring up one aspect of why organized religion can be so complicated: it is intertwined with power. And I’m glad you note that authority and power are not inherently bad when it comes to religion. In fact, as you also say, a certain degree of authority is necessary.

To me, the real problem arises when religion adds another element into the mix: certainty. It’s a toxic combination to have religious authorities with the power to influence others claiming to “know” with 100% certainty that they’re right and everyone else is wrong.

One thing I learned from several atheists is the importance of skepticism and doubt. Indeed, while certainty leads to arrogance, uncertainty leads to humility. We open up the conversation and value diverse experiences when we approach the world with a perspective of “I’m not sure” or “I could be wrong.”

Recently, astrophysicist Adam Frank wrote a beautiful piece on NPR’s blog 13.7 about how valuable uncertainty can be:

Dig around in most of the world’s great religious traditions and you find people finding their sense of grace by embracing uncertainty rather than trying to bury it in codified dogmas…

Though I am an atheist, some of the wisest people I have met are those whose spiritual lives (some explicitly religious, some not) have forced them to continually confront uncertainty. This daily act has made them patient and forgiving, generous and inclusive. Likewise, the atheists I have met who most embody the ideals of free inquiry seem to best understand the limitations of every perspective, including their own. They encounter the ever shifting ground of their lives with humor, good will and compassion.

Certainty can be seductive, but it hurts our ability to engage with others in constructive ways. Thus when religious people talk about God, belief or faith, we have to approach the conversation with a little humility and recognize that we don’t have a monopoly on the truth. In the words of Rabbi Brad Hirschfield, we need to realize that another person doesn’t have to be wrong for us to be right.

This doesn't mean believers and atheists will agree on the role of religion in society, the validity of a particular belief system, or even the very existence of God. In fact, believers and atheists will almost certainly continue to vehemently disagree about these questions. But we have to remember that not all disagreements are bad. Some arguments are quite beneficial because they help us gain a deeper understanding of reality, encourage clearer thinking, and broaden people's perspectives.

The Rabbis even draw a distinction between two different kinds of arguments. Arguments they call "for the sake of Heaven" will always be valuable, while arguments that are only for self-aggrandizement will never be productive (Avot 5:20). So I'm not interested in arguments that devolve into mocking, ridicule, name-calling or one-upmanship. But I'd gladly participate in any discussion if we are arguing about how we make ourselves and this world better, and would actively strive to involve whoever wants to be part of that endeavor, regardless of what they may or may not believe.

Sam: You are right to point out that both atheists and believers, under the illusion of certainty, smother potentially productive dialogue with disrespectful rhetoric. What's alarming is that atheism in the United States is now more than non-belief. It's an intense and widely shared sentiment that belief in God is not only false but ridiculous. For too many, pointing out how irrational religion can be is a form of entertainment.

There's no doubt that religious beliefs can have negative behavioral consequences, so atheists are right to criticize many of religion's epistemological claims. But I've learned from believers, and from my background in cognitive psychology, that faith-based beliefs are not necessarily irrational.

Consider a clever study recently conducted by Kevin Rounding of Queen’s University in Ontario that demonstrates how religion helps increase self-control. In two experiments participants (many of whom identified as atheists) were primed with a religious mindset – they unscrambled short sentences containing words such as “God,” “divine” and “Bible.” Compared to a control group, they were able to drink more sour juice and were more willing to accept $6 in a week instead of $5 immediately. Similar lines of research show that religious people are less likely to develop unhealthy habits like drinking, taking drugs, smoking and engaging in risky sex.

Studies also suggest that religious and spiritual people, especially those living in the developing world, are happier and live longer, on average, than non-believers. Religious people also tend to feel more connected to something beyond themselves, a sentiment that contributes significantly to well-being.

It's unclear whether these findings are correlational or causal – it's likely that many of the benefits of believing in God arise not from the beliefs per se but from the strong social ties that religious communities do such a good job of fostering. Whatever the case, this research should make atheists pause before they dismiss all religious beliefs as irrational or ridiculous.

Geoff: It’s interesting — that actually leads to another area where atheists have pushed believers in important ways, namely, to focus less on the beliefs themselves, and more on how those beliefs manifest themselves in actions. And to paraphrase Steven Pinker, the actions that religious people need to focus on are less about “saving souls,” and more about “improving lives.”

For much of human history the goal of religion was to get people to believe a certain ideology or join a certain community. “Being religious” was a value in and of itself, and was often simply a given, but today, we live in a world where people are free to choose what they believe in. So now, the goal of religion should be to help people find more fulfillment in their own lives and to help people make a positive impact on others’ lives.

It’s important to note that people certainly do not need religion to act morally or find fulfillment. But as Jonathan Haidt writes in his new book The Righteous Mind, religion can certainly make it easier.

Haidt argues that our mind is like a rider who sits atop an elephant: our moral deliberations (the rider) are post-hoc rationalizations of our moral intuitions (the elephant). The key to his metaphor is that intuitions come first (and are much more powerful) and strategic reasoning comes afterward.

We need our rider because it allows us to think critically. But our elephant is also important because it motivates us to connect with others who share a moral vision. Ultimately, if we are striving to build communities and strengthen our morals, we cannot rely exclusively on either the rider or the elephant; we need both. As Haidt explains:

If you live in a religious community, you are enmeshed in a set of norms, institutions and relationships that work primarily on the elephant to influence your behavior. But if you are an atheist living in a looser community with a less binding moral matrix, you might have to rely somewhat more on an internal moral compass, read by the rider. That might sound appealing to rationalists, but it is also a recipe for…a society that no longer has a shared moral order. [And w]e evolved to live, trade and trust within shared moral matrices. (Haidt, 269)

Since religion is a human construct, with its “norms, institutions and relationships,” it can be used in a variety of different ways. It can obviously be used to shut down critical thinking and oppress others. But as you mention, religion has positive effects on well-being, and religious beliefs correlate with a sense of fulfillment. Perhaps the job of religion, then, should be giving us a common language, rituals, and communities that reinforce and strengthen our ability to become better human beings and find joy and meaning in our lives.

Ultimately, we don’t have to agree with someone in order to learn from them. As Ben Zoma, a 2nd century Jewish sage, reminds us: “Who is wise? The person who learns from all people.” (Avot 4:1) When we are willing to open ourselves up to others, we open ourselves up to new ideas and different perspectives.

Indeed, I have come to believe that our purpose as human beings – whether we identify as a believer, an atheist, or anything in between – is to better ourselves and our world. And any source of knowledge that leads us to that goal is worth pursuing.

Why I’m Optimistic About The Future

The history of Earth is mostly a rocky and lifeless story. The first signs of life emerged about a billion years after our planet's creation. They weren't much, either: mostly single-celled organisms that resembled today's bacteria. Land animals emerged from the oceans as recently as 500 million years ago, and the genus Homo came onto the scene a mere 2.5 million years ago. Complex life on Earth is the new kid on the block; natural selection spent most of its time keeping species the same, not changing them.

We humans are a different story. 200,000 years ago a few tens of thousands of us dotted the African plains. But then something happened. We spread across the globe, creating cities and villages along the way. Language evolved, and with it culture and societies. We began living longer and healthier lives, and our population skyrocketed as a result.

What's peculiar about the rise of humans is that, biologically speaking, nothing changed: the same genes that constituted our hunter-gatherer ancestors constitute us. But somewhere along the line a small change led to profound differences in our behavior within a short period of time. Whereas Homo erectus and the Neanderthals spent hundreds of thousands of years making the same tools over and over again, we were able to understand and organize the world better.

Whatever the genetic change was, we eventually gained the ability to learn from others. This was hugely important. Anthropologists call this cultural or social learning, and it not only describes our tendency to copy and imitate by watching others, it also highlights our unique ability to recognize the best of a number of alternatives and attempt to improve on it. Many animals can learn, but only humans can learn and improve. As evolutionary biologist Mark Pagel explains, "even if there were a chimpanzee-Einstein, its ideas would almost certainly die with it, because others would be no more likely to copy it than a chimpanzee-dunce."

What's more, we have the ability to put ourselves inside the minds of others – what philosophers term a theory of mind. It helps us assign reason, purpose and intentionality to objects and people, which in turn allows us to understand things as part of a bigger picture. Without a theory of mind we would probably still be using the same tools we did 200,000 years ago.

In addition, theory of mind gives rise to emotions like empathy and sympathy, which give us the capacity to cooperate with people and groups outside our kin. The other great apes – chimps, gorillas and orangutans, with the possible exception of bonobos – do not exhibit this type of behavior. To borrow a thought experiment from the anthropologist Sarah Hrdy, imagine you were on a 747 filled with chimps and a baby started to cry. In Hrdy's words: "any one of us would be lucky to disembark with all their fingers and toes still attached, with the baby still breathing and unmaimed." Recent psychological research confirms that our species' ability to cooperate is partially innate. As bleak as our current headlines are, it appears we humans are wired with at least a minimal ability to get along with each other and a sense of justice. We're not perfect, but no chimp would donate to charity, and certainly no group of chimps could set one up.

This is important for many reasons. The most obvious is that economics is impossible without the means to cooperate with strangers. This is why, according to Matt Ridley, one of the key developments in our species' history took place when we "started to do something to and with each other that in effect began to build a collective intelligence… [we] started, for the very first time, to exchange things between unrelated, unmarried individuals; to share, swap, barter and trade." The effect of trade was specialization, which gave rise to innovation, which in turn improved technologies, and so on. Well before Smith hypothesized the invisible hand and Ricardo thought about how England and Portugal could efficiently trade wine, we had already begun to understand that communities were better off when their members honed their skills, pursued their self-interest and traded with other communities.

This is a simplified and incomplete story but you get the idea: humans flourished because they were able to learn from and cooperate with each other. It’s unclear what happened biologically, but the consequences were obviously vast.

What's interesting is that the cognitive mechanisms that allowed our species to prosper on the African savannah are the same ones responsible for globalization in the 21st century. The difference is that face-to-face interaction has been replaced by communication over the web.

In a 2010 TED lecture, Chris Anderson addressed this point by exploring how web video powers global innovation. He explained:

A while after Ted Talks started taking off we noticed that speakers were starting to spend a lot more time in preparation… [the previous speakers raised] the bar for the next generation of speakers… it’s not as if [speakers] ended their talks saying ‘step your game up,’ but they might as well have… you have these cycles of improvement apparently driven by people watching web videos.

Anderson terms this phenomenon "crowd accelerated innovation," and uses it to explain not just how TED Talks are improving in quality, but how everything is. He is making the same general point as Pagel and Ridley: humans learn and innovate by watching and stealing ideas from others. But what's unique about Anderson's point is that it describes how the Internet is facilitating this ability. And the exciting part is that people will learn and imitate even faster with YouTube, Wikipedia, Google Books and the many other online services that focus on the distribution of content. As Anderson says, "this is the technology that is going to allow the rest of the world's talents to be shared digitally, thereby launching a whole new cycle of… innovation."

Whereas a famine could easily have wiped out the only community that knew how to harvest a certain crop, build a certain type of boat or make a certain type of tool – what anthropologists call random drift – the Internet not only preserves our collective knowledge, it makes it widely accessible, something the printing press wasn't able to achieve to the same degree. This is why I'm optimistic about the future: the Internet will only accelerate our ability and desire to improve upon the ideas of others.

TED lectures over the years give us plenty of concrete examples to be hopeful about: Hans Rosling illustrated the global rise in GDP and decrease in poverty over the last several decades; Steven Pinker demonstrated the drastic decline in violence; Ridley and Pagel spoke about the benefits of cultural and economic cooperation; and most recently, Peter Diamandis argued that we will be able to solve many of the problems that darken our vision of the future. And because all this research comes to us via the web, the next round of ideas will be even better. More importantly, it will inspire a generation of young Internet users who are looking to change the world for the better.

What Conspiracy Theories Teach Us About Reason

Conspiracy theories are tempting. There is something especially charming about a forged moon landing or government-backed assassination. Christopher Hitchens called them “the exhaust fumes of democracy.” Maybe he’s right: cognitive biases, after all, feast on easy access to information and free speech.

Leon Festinger carried out the first empirical study of conspiracy theorists. In 1954 the social psychologist infiltrated a UFO cult that was convinced the world would end on December 20th. In his book When Prophecy Fails, Festinger recounts how, after midnight came and went, the leader of the cult, Marian Keech, explained to her members that she had received a message via automatic writing telling her that the God of Earth had decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology.

Festinger coined the term cognitive dissonance to describe the psychological consequences of disconfirmed expectations. It is a "state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent," as two authors describe it, and "the more committed we are to a belief, the harder it is to relinquish, even in the face of overwhelming contradictory evidence."

Smokers are another good example: they smoke even though they know it kills. And after unsuccessfully trying to quit, they tend to say that "smoking isn't that bad" or that "it's worth the risk." In a related example, doctors who performed placebo surgeries on patients with osteoarthritis of the knee "found that patients who had 'sham' arthroscopic surgery reported as much relief… as patients who actually underwent the procedure." Many patients continued to report dramatic improvement even after surgeons told them the truth.

A recent experiment by Michael J. Wood, Karen M. Douglas and Robbie M. Sutton reminds us that holding inconsistent beliefs is more the norm than the exception. The researchers found that "mutually incompatible conspiracy theories are positively correlated in endorsement." Many subjects, for example, believed both that Princess Diana faked her own death and that she was killed by a rogue cell of British Intelligence, or both that the death of Osama bin Laden was a cover-up and that he is still alive. The authors conclude that many participants showed "a willingness to consider and even endorse mutually contradictory accounts as long as they stand in opposition to the officially sanctioned narrative."

The pervasiveness of cognitive dissonance helps explain why it sometimes takes societies several generations to adopt new beliefs. People do not simply change their minds, especially when there is a lot on the line. It took several centuries for slavery to be universally banned (Mauritania was the last country to do so, in 1981). In the United States, civil rights movements for women and African-Americans lasted decades. Same-sex marriage probably won't be legal in all 50 states for several more years. Our propensity to hold onto cherished beliefs also pervades science. As Max Planck said, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."

Are there ways to dilute the negative effects of cognitive dissonance? I'm afraid the Internet is part of the problem: Google makes it easy to find something that confirms almost any belief. But it is also part of the solution. History tells us that cooperation and empathy between individuals, institutions and governments increase as the exchange of information becomes easier. From the printing press to Uncle Tom's Cabin and through to the present day (when social networks are the primary means of communication for so many), people tend to consider points of view other than their own the more they are exposed to other perspectives.

Steven Pinker captures this point well in his latest book: “As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one’s own interest over others.” It shouldn’t come as a surprise then, that the rise of published books and literacy rates preceded the Enlightenment, an era that was vital in the rise of human rights.

This brings me back to Hitchens's quote. Indeed, a byproduct of democracy is the tendency for some people to believe whatever they want, even in the face of overwhelming contradictory evidence. However, Pinker reminds us that democracy is helping to relieve our hardwired propensity to look only for what confirms our beliefs. That our confirmation biases are innate suggests that they will never disappear, but the capacity to reason, facilitated by the exchange of information, paints an optimistic future.

What Motivates A Suicide Bomber?

Suicide terrorism is a peculiar business. As a means of killing civilians it is hugely efficient. Steven Pinker explains that, “it combines the ultimate in surgical weapon delivery – the precision manipulators and locomotors called hands and feet, controlled by the human eyes and brain – with the ultimate in stealth – a person who looks just like millions of other people.” The most sophisticated drone doesn’t come close.

Relative to the past few decades, it is on the rise. During the 1980s the world saw an average of about five suicide attacks per year. Between 2000 and 2005 that number skyrocketed to 180. The targets have been diverse: Israel, Iraq and Afghanistan get all the media attention, but Somalia and Sri Lanka have experienced their share of self-destruction over the past five years.

What's peculiar about suicide terrorism is that it is especially difficult to understand from a psychological point of view. Most people find it impossible to empathize with someone who walks into a crowded Jerusalem market wearing an overcoat filled with nails, ball bearings and rat poison with the intention of detonating the bomb strapped to his waist (99 percent of suicide terrorists are male). How do we make sense of this?

Secular westerners tend to understand suicide terrorists as unfortunate products of undeveloped, undereducated and economically devastated environments. This isn’t true. All the 9/11 hijackers were college educated and suffered “no discernible experience of political oppression.” As Sam Harris explains:

Economic advantages and education, in and of themselves, are insufficient remedies for the cause of religious violence. There is no doubt that many well-educated, middle-class fundamentalists are ready to kill and die for God…. Religious fundamentalism in the developing world is not, principally, a movement of the poor and uneducated.

What is a sufficient explanation? In the case of Islam, why are so many of its followers eager to turn themselves into bombs? Harris believes that it is “because the Koran makes this activity seem like a career opportunity… Subtract the Muslim belief in martyrdom and jihad, and the actions of suicide bombers become completely unintelligible.” However you interpret the Koran, Harris’ position is that faith motivates Muslim suicide terrorists and that beliefs are the key to understanding the psychology of suicide terrorism. When nineteen Muslim terrorists woke up on the morning of September 11th they believed that 72 virgins awaited them in Heaven; they believed they would be remembered as heroes; they believed that self-destruction in the name of their God was glorious. It does not take a stretch of the imagination to correctly guess what they were saying (I should say, praying) moments before their doom.

Epistemology isn’t the whole story. Action requires belief but belief is not created in a vacuum. Understanding the motives of suicide bombers demands knowledge of the community they grew up in. You need context.

This is precisely what anthropologist Scott Atran attempted to dissect. After interviewing failed and prospective suicide terrorists he published several articles outlining the psychological profile of suicide terrorists and concluded that a call to martyrdom is appealing because it offers an opportunity to join a cohesive and supportive community of like-minded persons. Here’s Atran’s testimony to a U.S. Senate subcommittee:

When you look at whom [suicide terrorists] idolize, how they organize, what bonds them and what drives them; then you see that what inspires the most lethal terrorists in the world today is not so much the Koran or religious teachings as a thrilling cause and call to action that promises glory and esteem in the eyes of friends, and through friends, eternal respect and remembrance in the wider world that they will never live to enjoy.

The work of anthropologist Richard Sosis suggests that Atran is correct. Sosis studied the history of communes in the United States in the nineteenth century. He found that twenty years after their founding, 6 percent of the secular communes still existed, compared to 39 percent of the religious communes. He also discovered that the more costly the sacrifices a religious commune demanded, the better it functioned. By requiring members to abstain from things like alcohol and to conform to dress codes, the religious communes quickly and effectively bound their members together. This is why, if the West wants to minimize suicide terrorism, Atran recommends, it should "[learn] how to minimize the receptivity of mostly ordinary people to recruiting organizations."

Thankfully, the number of suicide bombings has declined in the last few years. In Iraq, vehicle and suicide attacks dropped from 21 a day in 2007 to about 8 a day in 2010. Along with the surge of American soldiers, the decline can be attributed to an attitude shift within the Islamic community. In Pinker's latest book he explains that, "in the North-West Frontier Province in Pakistan, support for Al Qaeda plummeted from 70 percent to 4 percent in just five months in late 2007… In a 2007 ABC/BBC poll in Afghanistan, support for jihadist militants nosedived to one percent." If Atran is correct in suggesting that suicide terrorism is fueled by an appeal to community and an opportunity to gain esteem, then this is good news.

Individual beliefs and the communities they arise from help us understand the psyche of suicide bombers. But even a sufficient explanation would leave me wondering. Our DNA has one goal: replication. That natural selection has given us the means to stop this process might be one of Nature's great ironies.


“Who’s There?” Is The Self A Convenient Fiction?

For a long time people thought that the self was unified and eternal. It's easy to see why. We feel like we have an essence; we grow old, gain and lose friends, and change preferences, but we remain the same person from day one.

The idea of the unified self has had a rough few centuries, however. During the Enlightenment, Hume and Locke challenged the Platonic idea that human nature derives from an essence; in the 19th century Freud declared that the ego is "not even master in its own house"; and after decades of empirical research, neuroscience has yet to find anything that scientists would call unified. As clinical neuropsychologist Paul Broks says, "We have this deep intuition that there is a core… But neuroscience shows that there is no center in that brain where things do all come together."

One of the most dramatic demonstrations of the illusion of the unified self comes from Michael Gazzaniga, who showed that each hemisphere of the brain exercises its will independently once surgeons cut the corpus callosum. Gazzaniga discovered this with a simple experiment. When he flashed the word "WALK" to the right hemisphere of split-brain patients, they walked out of the room. But when he asked them why they walked out, they all responded with a trivial remark such as "To go to the bathroom" or "To get a Coke." Here's where things got weird. When he flashed a chicken to patients' left hemisphere (in the right visual field) and a wintry scene to their right hemisphere (in the left visual field), and asked them to select a picture that went with what they saw, he found that their left hand correctly pointed to a snow shovel and their right hand correctly pointed to a chicken. However, when the patients were asked to explain why they pointed at the pictures, they responded with something like, "That's easy. The shovel is for cleaning up the chicken."

Nietzsche was right: "We are necessarily strangers to ourselves…we are not 'men of knowledge' with respect to ourselves."

But you don't need a severed corpus callosum or a deep understanding of the Genealogy of Morals (which I don't have) to appreciate how modular our selves are. Our everyday inner monologues are telling enough. We weigh the pros and cons of fatty meats versus nutritious vegetables even though we know which is healthier. When we have the chance to procrastinate we usually take it and rationalize it as a good decision. We cheat, lie, laze about and eat Big Macs knowing full well how harmful these things are. When it comes to what we think about, what we like and what we do, Walt Whitman captured our natural hypocrisies and inconsistencies with this famous and keenly insightful remark: "Do I contradict myself? Very well then I contradict myself, (I am large, I contain multitudes.)"

That the unified self is largely an illusion is not necessarily a bad thing. The philosopher and cognitive scientist Dan Dennett suggests that it is a convenient fiction, and I think he's right. With it we are able to maintain stories and narratives that help us make sense of the world and our place in it. This is a popular conviction nowadays. As the prominent evolutionary psychologist Steven Pinker explains in one of his bestsellers, "each of us feels that there is a single 'I' in control. But that is an illusion that the brain works hard to produce." In fact, without the illusion of selfhood we might all suffer the same fate as Phineas Gage, who was, as anyone who has taken an introductory psychology course might remember, "no longer Gage" after a tragic railroad accident turned his ventromedial prefrontal cortex into a jumbled stew of disconnected neurons.

However, according to the British philosopher Julian Baggini in a recent TED lecture, the self might not be an illusion after all. The question Baggini asks is whether a person should think of himself as a thing that has a bunch of different experiences or as a collection of experiences. This is an important distinction. Baggini explains that "the fact that we are a very complex collection of things does not mean we are not real." He invites the audience to consider the metaphor of a waterfall. In many ways a waterfall is like the self: it is not permanent, it is always changing and it is different at every single instant. But this doesn't mean that a waterfall is an illusion or that it is not real. What it means is that we have to understand it as a history, as having certain things that remain the same, and as a process.

Baggini is trying to save the self from neuroscience, which is admirable considering that neuroscience continues to show how convoluted our brains are. I am not sure if he is successful – argument by metaphor can only go so far, and empirical data wins at the end of the day – but I like the idea that personal and neurological change and inconsistency do not imply that identity is an illusion. In this age of cognitive science it's easy to subscribe to Whitman's doctrine – that we are constituted by multitudes; it takes a brave intellect, on the other hand, to hang on to what Freud called our "naïve self-love."

Shakespeare opened Hamlet with the huge and beautifully complex query, "Who's there?" Four hundred years later Baggini has an answer, but many of us are still scratching our heads.


Does Pinker’s “Better Angels” Undermine Religious Morality?

Pinker at the Strand bookstore in Manhattan last week

It is often argued that religion makes individuals and the world more just and moral, that it builds character and provides a foundation from which we understand right from wrong and good from evil; if it weren't for religion, apologists say, the world would fall into a Hobbesian state of nature where violence prevails and moral codes fail. To reinforce this contention, they point out that Stalin, Hitler and Mao were atheists, forcing an illogical causal connection between what those men did and what they believed.

One way to answer the question of whether religion makes people and the world more moral and better off is to look at the history books. For that, I draw upon Steven Pinker's latest, The Better Angels of Our Nature, an 800-page giant that examines the decline of violence from prehistoric hunter-gatherer societies to the present. Pinker opens his book with the following: "Believe it or not – and I know that most people do not – violence has declined over long stretches of time, and today we may be living in the most peaceable era in our species' existence. The decline, to be sure, has not been smooth; it has not brought violence down to zero; and it is not guaranteed to continue. But it is an unmistakable development, visible on scales from millennia to years, from the waging of wars to the spanking of children." Whether you're familiar with Better Angels or not, it's worth reviewing its arguments. Let's run through three sections of Pinker's book – the Pacification Process, the Civilizing Process, and the Humanitarian Revolution – to see how violence declined. Doing so will allow us to judge whether history has anything to say about religion being a credible source of moral good at the individual and global level.

The Pacification Process describes the shift from hunter-gatherer societies to state-run societies. Comparing data from hunter-gatherer societies to modern states reveals two different worlds. For example, the percentage of deaths due to violent trauma (we know this from archaeological studies) in hunter-gatherer societies was on average about 15 percent, with the Crow Creek Native Americans of South Dakota (circa 1325 CE) topping the list at just below 65 percent and a prehistoric site in Nubia (circa 12,000-10,000 BCE) at the bottom at just below 10 percent. By comparison, in 2005 the figure was less than a tenth of one percent; people just aren't killing each other like they used to, in other words. Another metric for comparing hunter-gatherer societies to state-run societies is war deaths per 100,000 people per year. In hunter-gatherer societies the average was 524. In contrast, consider the two most violent state-run societies of the modern era: Germany in the 20th century, which was involved in two world wars, comes in at 135, and Russia, which was involved in two world wars and a major revolution, at 130. The world as a whole in the 20th century saw around 60 war deaths per 100,000 people per year. Taken together, then, Hobbes got it right when he said that life in the state of nature was "solitary, poor, nasty, brutish and short."

The Civilizing Process describes the decline of violence in Europe from the Middle Ages, beginning around 1200, to the modern era. One way to see this is to look at homicides per 100,000 people per year in England over the course of the last 800 years. Between 1200 and 1400, roughly 20 to 30 of every 100,000 English people were murdered each year. Compare this to the year 2000, when the number was less than one. This means, as Pinker says, that "a contemporary Englishman has a 50 fold less chance of being murdered than his compatriot in the Middle Ages." The same story holds across Europe, where murder rates declined in a nearly identical fashion. In Italy, for example, the murder rate dropped from about 90 homicides per 100,000 per year in 1300 to between one and two per 100,000 in 2000, and in the Netherlands it dropped from about 80 to a similar level over the same period. Indeed, as Pinker remarks, "from the 14th century on, the European homicide rate sank steadily." The United States saw similar trends, though obviously not over the same period of time. Here's one example: homicides per 100,000 per year in California fell from a bit over a hundred in 1850 to less than ten in 1910; it truly was the Wild West.

The Humanitarian Revolution describes the rise of human rights, individualism, and liberal ideals over the last few centuries. There are several ways to examine this; one is the abolition of judicial torture. From just before 1700 to just after 1850, every major European country officially abolished every form of judicial torture, including "breaking at the wheel, burning at the stake, sawing in half, impalement, and clawing." In addition, England abolished the death penalty for non-lethal crimes including "poaching, counterfeiting, robbing a rabbit warren and being in the company of Gypsies." By the turn of the 21st century, the death penalty had been abolished outright in nearly every European country (Russia and Belarus excepted). The United States saw similar trends. In the 17th and 18th centuries, it abolished capital punishment for crimes including "theft, sodomy, bestiality, adultery, witchcraft, concealing birth, burglary, slave revolt, and counterfeiting." Capital punishment remains legal, though only about 50 people per year are executed. Describing the Humanitarian Revolution would be incomplete without mentioning the abolition of slavery, which spread rapidly through the 19th century in many countries around the world; Mauritania was the last country to abolish slavery, in 1981. It is also worth noting that the number of countries with policies that discriminate against ethnic minorities fell from 44 in 1950 to under 20 in 2003; the number of peacekeepers rose from zero just after World War Two to somewhere in the tens of thousands; and over 90 countries in the world are now democracies, compared to fewer than 20 autocracies.

Pinker describes two more processes – the Long Peace and the New Peace – which describe similar trends in the 20th century. In brief, pick any metric having to do with violence and it's a safe bet it has gone down over the last century. However, there are a few details regarding social issues in the United States worth mentioning. First, we saw a reduction in hate crimes and domestic violence: lynchings dropped from about 150 per year in 1880 to zero by 1960, and assaults by intimate partners fell from about 1,000 (female victims) and about 200 (male victims) to about 400 and about 50, respectively. We also saw changes in sentiments towards minorities and women. The percentage of white people who "would move if a black family moved in next door" fell over the past six decades from 50 percent to nearly zero; the percentage of white people who believed that "black and white students should go to separate schools" fell similarly; and approval of a husband slapping his wife dropped steadily throughout the second half of the 20th century. In addition, gay rights have risen dramatically, animal rights have increased and hate crimes have declined.

By now, the decline of violence should be clear (if you're not sold, read Pinker's book). What's uncertain are its causes. This brings me back to religion and the claim that it provides a necessary moral foundation for the individual and for society. It's my contention that, considering the data Pinker assembles, there is little evidence to support this assertion. That is, religion is not responsible for the moral progress of the last few centuries, or for humanity pulling itself out of its former Hobbesian state. As Pinker himself asserts, "the theory that religion is a force for peace, often heard among the religious right and its allies today, does not fit the facts of history."

If not religion, then what? The more accurate picture is that humans are inclined towards both violence and peace. Douglas Kenrick's research, which Pinker cites, shows that most people (male and female) occasionally fantasize about killing another person, and a trip to the movies or a hockey game will probably confirm it. Paul Bloom's research, on the other hand, shows that babies as young as six months have a moral sense of good and bad. It is therefore much more fruitful to ask what historical circumstances bring out what Abraham Lincoln called our "better angels."

Pinker identifies four "better angels" – self-control, empathy, a moral sense and reason – and four historical circumstances, or "pacifying forces," that favor them over our "inner demons." The first is the "Leviathan," or the state. As the Pacification Process and the Civilizing Process illustrated, state-run societies are much more peaceful than hunter-gatherer societies. There are a number of reasons for this. Most obvious is the fact that it is impossible to enforce laws under anarchy; it is only in a state-run society that laws against physical abuse or murder can be enforced. In addition, whereas hunter-gatherers were often forced to fight over food and territory, citizens of states tended to be more secure.

The second is "gentle commerce." This describes the process by which individuals realized that engaging in trade can result in a win-win. It's Adam Smith's Wealth of Nations: a society benefits when its citizens are allowed to trade freely and form their own businesses. The McDonald's theory – the observation that no two countries with McDonald's have ever gone to war with each other – highlights how gentle commerce benefits society on a global scale.

The third is the idea of the "expanding circle," which describes our growing tendency to be kind and empathetic towards strangers. Whereas hunter-gatherers and citizens of early states cared only for their kin, citizens of today's world are much more helpful, forgiving, and caring towards strangers. This helps explain why we often give money to people we've never met even when there is no return, as is the case with charities or tipping. (In the famous ultimatum experiment, in which people are given $20 and the choice to either keep all of it, split it $18/$2, or split it $10/$10, most split it evenly.) Indeed, institutions like the Red Cross and UNICEF are predicated on the idea that humans are willing to give to others more in need. What expanded the circle? Pinker points to increased cosmopolitanism, which research shows encourages people to adopt the perspective of others.

The fourth is the "escalator of reason." Pinker says it best: "As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one's own interest over others. It replaces a morality based on tribalism, authority and puritanism with a morality based on fairness and universal rules. It encourages people to recognize the futility of cycles of violence and to see violence as a problem rather than a contest to be won." It shouldn't come as a surprise, then, that the rise of published books and literacy rates preceded the Enlightenment, an era that was vital to the rise of human rights.

These are the four pacifying forces that favor our "better angels." Reviewing them calls into question the claim that religion is a necessary moral foundation and that the world is better because of religion. If these two claims were true, it would be difficult to explain why the decline of violence and the rise of humanitarian rights occurred so many years after the inception of the Abrahamic religions. If religion does bring out our better angels, it was late to the game. While apologists were busy trying to prove the existence of God and justify scriptures that preach "genocide, rape, slavery and the execution of nonconformists," the age of reason allowed Europeans to realize that understanding what was morally right, and what contributed most to human flourishing, did not require religious texts.

This is not to ignore the fact that good things have happened on behalf of religion. The Quakers, to their credit, supported the abolition of slavery in the United States long before most; figures like Desmond Tutu have been instrumental in reducing global and national conflicts; and positive psychology research tells us that religion is a significant source of personal happiness. But it is to deny the claim that religion is a necessary moral foundation and the claim that the world would fall into moral anarchy without religion. People assume that a moral sense or code, an understanding of right and wrong, requires religion. Is this true? Reviewing the data outlined in The Better Angels of Our Nature, it is apparent that religion played at best a minimal role. It seems more plausible to explain the decline of violence through the other historical circumstances and events I've outlined here.

Taken together, then, it's probably most accurate to say that religion has been along for the ride, but it certainly hasn't been in the driver's seat. Waves of violence have come and gone – thankfully, most of them have gone – and humanitarian rights are at an all-time high at the hands of other historical forces. People who believe that religion provides a necessary moral foundation are merely paying "lip service [to the Bible] as a symbol of morality, while getting their actual morality from more modern principles."

The Evolution of Water

In a recent TED Talk, MIT cognitive scientist Deb Roy gave a presentation entitled "The Birth of a Word," based on a remarkable study he conducted over the last three years. Roy's study was motivated by his interest in language acquisition. Specifically, he wanted to know how infants learn individual words over the course of their development. To do so, Roy wired his house with audio and video equipment and recorded 90,000 hours of video and 140,000 hours of audio (200 terabytes) to track how his son acquired English. Here is a clip that compiles all the instances of his son trying to say the word "water" over a six-month stretch (listen for the gradual yet sudden transition from "gaga" to "water").

Roy's work is part of a growing body of research that tries to understand language acquisition, a hotly debated topic still in its infancy. In my last post I briefly touched on it by explaining why language is so metaphoric and interconnected. However, if we want to understand how language is acquired, it is fruitless to study syntax alone. It is like trying to understand planetary motion without mathematical equations: it's easy to say that planets travel around the sun, but it is entirely different to explain why they do so.

Unfortunately, there isn't a Kepler for language acquisition, so I can't offer anything novel here. However, it is worth contextualizing the two contemporary theories from which the language acquisition debate has spawned. The first comes from Steven Pinker, who, in his 1994 book The Language Instinct, suggests that the ability to acquire language is genetic. He doesn't explicitly state that there is a gene for language – "for any grammar gene that exists in every human being, there is currently no way to verify its existence" (322) – but he does say that language is "a distinct piece of the biological makeup of our brains" (18). So he is advocating that language is genetic in the same way that puberty or yawning is; there certainly isn't "a gene" for either, but they are part of our genetic code nonetheless.

Part of Pinker's project aims to figure out whether all human languages are unified by some sort of "universal grammar" (as Chomsky calls it), which holds that all "children must innately be equipped with a plan common to the grammars of all languages… that tells them how to distill the syntactic patterns out of the speech of their parents" (Pinker, 22). The universal grammar debate is key for the language instinct hypothesis because if language really is instinctual, then you would expect it to manifest similarly regardless of culture. Chomsky and Pinker have gone to great lengths to prove just that, but I will not get into the details for the sake of space (read Pinker's book if you're really interested).

In contrast, Stephen Jay Gould believed that thinking was an exaptation (a trait evolved for one purpose that later serves another) that led to language. In his words, "natural selection made the human brain big, but most of our mental properties and potentials may be spandrels – that is, nonadaptive side consequences of building a device with such structural complexity" (The Pleasures of Pluralism, p. 11). Evolutionarily speaking, Gould's theory fits a bit better than Pinker's, which is faced with the Wallace problem: how could the Neolithic Revolution have happened if the brain had achieved its modern size roughly a million years ago? Simply put, language is too recent a phenomenon to be explained by natural selection, which is a much more gradual process. Gould accounts for this by saying that language is a byproduct of cognitive abilities that already existed.

Who is correct? Like so many debates, the Pinker-Gould debate falls victim to dichotomy fever, i.e., our tendency to categorize things as either this or that. But that is not to say it isn't helpful. As V.S. Ramachandran explains, "neither of them is [correct], although there is a grain of truth to each… the competence to acquire [language] rules is innate, but exposure is needed to pick up the actual rules" (171). Like others, Rama believes that brains are language-ready but do not have language built in. This capacity has been referred to as the still-unidentified "language acquisition device," and Ramachandran uses it to resolve Pinker's and Gould's contrasting views.

The next challenge is to understand exactly how genetics and experience work together in language acquisition. This is part of Deb Roy's project, which I think will turn out to be vitally important in the years to come. It is also a goal of many neuroscience labs today. And because those labs keep showing the brain to be more and more dynamic, it appears that an understanding of language acquisition will require an understanding of the brain from a bottom-up perspective. Unfortunately, that is a long way off.
