Tag Archives: science

“Those talking heads…are full of shit.” Political bias and irrationality.

Political Dissonance

Joe Keohane has a fascinating summary of our political biases in the Boston Globe Ideas section this weekend. It’s probably not surprising that voters aren’t rational agents, but it’s always a little depressing to realize just how irrational we are. (And it’s worth pointing out that this irrationality applies to both sides of the political spectrum.) We cling to mistaken beliefs and ignore salient facts. We cherry-pick our information and vote for people based on an inexplicable stew of superficial hunches, stubborn ideologies and cultural trends. From the perspective of the human brain, it’s a miracle that democracy works at all. Here’s Keohane:

A striking recent example was a study done in 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. In the experiment, more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

In How We Decide, I discuss the mental mechanisms behind these flaws, which are ultimately rooted in cognitive dissonance:

Partisan voters are convinced that they're rational – only the other side is irrational – but we're actually rationalizers. The Princeton political scientist Larry Bartels analyzed survey data from the 1990s to prove this point. During the first term of Bill Clinton's presidency, the budget deficit declined by more than 90 percent. However, when Republican voters were asked in 1996 what happened to the deficit under Clinton, more than 55 percent said that it had increased. What's interesting about this data is that so-called "high-information" voters – these are the Republicans who read the newspaper, watch cable news and can identify their representatives in Congress – weren't better informed than "low-information" voters. According to Bartels, the reason knowing more about politics doesn't erase partisan bias is that voters tend to assimilate only those facts that confirm what they already believe. If a piece of information doesn't follow Republican talking points – and Clinton's deficit reduction didn't fit the "tax and spend liberal" stereotype – then the information is conveniently ignored. "Voters think that they're thinking," Bartels says, "but what they're really doing is inventing facts or ignoring facts so that they can rationalize decisions they've already made." Once we identify with a political party, the world is edited so that it fits with our ideology.

At such moments, rationality actually becomes a liability, since it allows us to justify practically any belief. We use our fancy brains as an information filter, a way to block out disagreeable points of view. Consider this experiment, which was done in the late 1960s by the cognitive psychologists Timothy Brock and Joe Balloun. They played a group of people a tape-recorded message attacking Christianity. Half of the subjects were regular churchgoers while the other half were committed atheists. To make the experiment more interesting, Brock and Balloun added an annoying amount of static – a crackle of white noise – to the recording. However, they allowed listeners to reduce the static by pressing a button, so that the message suddenly became easier to understand. Their results were utterly predictable and rather depressing: the non-believers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun demonstrated a similar effect with smokers listening to a speech on the link between smoking and cancer. We silence the cognitive dissonance through self-imposed ignorance.

There is no cure for this ideological irrationality – it’s simply the way we’re built. Nevertheless, I think a few simple fixes could dramatically improve our political culture. We should begin by minimizing our exposure to political pundits. The problem with pundits is best illustrated by the classic work of Philip Tetlock, a psychologist at UC-Berkeley. (I’ve written about this before on this blog.) Starting in the early 1980s, Tetlock picked two hundred and eighty-four people who made their living “commenting or offering advice on political and economic trends” and began asking them to make predictions about future events. He had a long list of questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds. By the end of the study, Tetlock had quantified 82,361 different predictions.

After Tetlock tallied up the data, the predictive failures of the pundits became obvious. Although they were paid for their keen insights into world affairs, they tended to perform worse than random chance. Most of Tetlock’s questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals.

So those talking heads on television are full of shit. Probably not surprising. What's much more troubling, however, is that they've become our model of political discourse. We now associate political interest with partisan blowhards on cable TV, these pundits and consultants and former politicians who trade facile talking points. Instead of engaging with contrary facts, the discourse has become one big study in cognitive dissonance. And this is why the predictions of pundits are so consistently inaccurate. Unless we engage with those uncomfortable data points – those stats which suggest that George W. Bush wasn't all bad, or that Obama isn't such a leftist radical – our beliefs will never improve. (It doesn't help, of course, that our news sources are increasingly segregated along ideological lines.) So here's my theorem: The value of a political pundit is directly correlated with his or her willingness to admit past error. And when was the last time you heard Karl Rove admit that he was wrong?

via scienceblogs.com

Once again, Jonah Lehrer nails it.

Jesus wept….

…and it wasn’t about climate change.

Unbelievable tripe in the National Post today.  I’m embarrassed to say that I attended the United Church for a few years.

Mardi Tindal, the newly elected moderator of the United Church of Canada, returned from last month’s climate change summit in Copenhagen with a deep malaise. Not a true clinical depression, but an anxious despair that reduced her to weeping.

“The difference between depression and what I was experiencing is that I wasn’t suppressing or finding myself in a place of isolation,” she said in an interview about her “lament,” and how it helped her to see “the truth about the condition of my own soul.”

She was so disappointed by the meeting’s failure to reach a binding deal that she broke down in the car one day as her husband drove toward their home church in Brantford, Ont.

“I simply wept. My tears were quiet, but I spoke through them, and I was being listened to. My husband said, ‘There is great power in what you have just said, and it is a powerful message that makes clear why you are weeping.’”

“And I said, ‘Doug, I’m weeping for the millions of lives that have been lost as a result of what did and did not happen in Copenhagen,’” Ms. Tindal said. “My experience was that I had a place to go with my tears and my lament…. It’s an expression of pain for the world’s suffering.”

Don’t bother reading the rest.

Book Review: Experimental Man

David Duncan is a journalist and Director of the Center for Life Science Policy at UC Berkeley.  He’s written a fascinating book called Experimental Man: What One Man’s Body Reveals about His Future, Your Health and Our Toxic World, about his foray into the world of the body, specifically the new technologies that allow us to test and assess our risks for disease and disorder.  Using himself as a subject, he submits to batteries of tests that examine his genes (for markers of disease), his blood (for toxins as well as naturally occurring molecules like cholesterol), his brain (for structural and functional attributes), and some other miscellaneous items (for example, he is one of the 25% of people who cannot taste bitterness).

With intriguing section titles like “Idyllic childhood in Kansas, except for the toxic waste dump” (which explores his high levels of PBDEs) and “Greed, gambling, and why my brain loves Dodgeball, the movie” (which examines his MRI and fMRI brain scans, the latter taken while he performed gambling tasks), the book is very readable and a great introduction to the technologies that may fundamentally alter the way medicine is practiced.

With their full consent, Duncan also includes family members in some of the genetic testing, looking for shared traits and, in particular, information about his brother’s congenital bone disorder (osteogenesis imperfecta).

There is a website that interacts with the book (and can also stand alone), The Experimental Man Project, as well as a blog, both of which provide updates to the book, which was published in 2009.  I highly recommend this book to anyone interested in these sorts of technologies and how, for better or for worse, they will affect our quality (and quantity) of life in the future.