One theme that I keep revisiting again and again is not so much a question of the science behind medical therapies (although certainly I discuss that issue arguably more than any other) but rather a question of why. Why do so many people cling so tenaciously to pseudoscience, quackery, and, frequently, the conspiracy theories they invoke to explain why mainstream science and medicine reject their favored pseudoscience and quackery? Certainly, I’ve touched on this issue before on several occasions, for example, with respect to the anti-vaccine movement, the claim that abortion causes breast cancer, and how we as humans crave certainty.
It turns out that science and science-based medicine are hard for humans to accept because they often conflict with what our senses perceive, which our brains then interpret as irrefutable evidence. The pattern-seeking function of our brain, when evaluating questions of causation in medicine, frequently betrays us. For instance, when a parent sees her child regress into autism sometime not long after being vaccinated, the easiest, most instinctive, and most emotionally compelling conclusion is that the vaccine must have had something to do with it. When scientists tell her that, no, in large studies looking at hundreds of thousands of children, there is no good evidence that vaccination confers an increased risk of autism and a lot of evidence that it does not, it’s a very hard message to believe, because it goes against how the parent interprets what she’s seen with her own eyes. Indeed, how often have we seen believers in the vaccine-autism link pour derision on the concept that, when something like autistic regression happens in close temporal proximity to vaccination, the correlation does not necessarily equal causation? “Coincidence?” they will spit. “I don’t believe it.” Similarly, believers in “alternative medicine” who experience improvement in their symptoms also pour derision on the observation, explained so well by R. Barker Bausell in Snake Oil Science, that people frequently take remedies when their symptoms are at their worst, leading them to attribute natural regression to the mean to whatever nostrum they started taking at the time.
These issues have come to the fore again, thanks to an article by a former ScienceBlogger, Chris Mooney. The article appeared in a recent issue of Mother Jones and was entitled, rather ironically, The Science of Why We Don’t Believe Science. Chris made his name as an author primarily in writing about the science of anthropogenic global warming and the political battles over policies intended to mitigate it and, to a lesser extent, over creationism and evolution denial. Of late he has written about the anti-vaccine movement as an anti-science movement, leading predictably to his being attacked by the likes of J.B. Handley. Also of note, although he was widely praised for The Republican War on Science and Storm World, more recently Mooney has been widely criticized for being too critical of “new atheists” and for lack of substance. In his current article, he discusses some of the science thus far about why people can cling to beliefs that science doesn’t just cast doubt upon but shows convincingly are totally wrong.
In his article, Mooney sets the stage with a very famous example studied by Stanford University psychologist Leon Festinger in the 1950s: the Seekers. The Seekers were an apocalyptic cult in the Chicago area led by a Dianetics enthusiast named Dorothy Martin. Its members believed that they were communicating with aliens, one of whom, named “Sananda,” was supposedly the astral incarnation of Jesus Christ. Martin also taught her followers that Sananda had told her the precise date of a world-ending cataclysm: December 21, 1954. As a result, some of Martin’s followers quit their jobs and sold their homes because they expected that a spaceship would rescue them right before the earth split open and the sea swallowed much of the United States. In fact, Martin’s followers even went so far as to rid themselves of all traces of metal, even removing underwire bras and taking the zippers out of their clothes, because they were told that such metal would pose a danger to the spaceships. Here’s Mooney’s account of what happened when December 21, 1954 came and went and, as those of us living today know, no cataclysm occurred:
At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!
From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.
In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
I’ve actually written about motivated reasoning before, a couple of years ago. At the time, I used a then-recent study that examined how impervious to evidence certain political beliefs were, specifically the belief that Saddam Hussein had been involved in planning 9/11, conspiring with Al Qaeda to destroy the World Trade Center twin towers. In this study, even President George W. Bush’s own words stating that Hussein was not involved in planning 9/11 were not enough to convince believers. Another study cited used similar methodology regarding Saddam Hussein’s lack of weapons of mass destruction. In that study, there was a “backfire” effect, in which those exposed to information correcting the misperception actually became more likely to believe it. Also discussed were the belief that President Barack Obama was not born in the United States and is therefore not eligible to be President (the “Birther” movement, which recently suffered a bit of a setback) and the belief that there were “death panels” written into the recently passed Patient Protection and Affordable Care Act. In the study I discussed, the authors based their analysis of motivated reasoning on its being driven primarily by cognitive dissonance, the feeling we have when we are forced to become aware that we are holding two contradictory thoughts at the same time. The strength of the dissonance depends upon the importance of the subject to an individual, how sharply the dissonant thoughts conflict, and how much the conflict can be rationalized away. Cognitive dissonance theory thus posits that, when faced with evidence or occurrences that challenge their beliefs, people will tend to minimize the dissonance any way they can without giving up those beliefs.
To the list of examples provided by the authors, I also added the example of someone well-known to this blog, namely Andrew Wakefield, the (in)famous British gastroenterologist who in 1998 published a study in The Lancet that claimed to find a link between the MMR vaccine and “autistic enterocolitis.” When revelations of Wakefield’s financial fraud came to light, however, his fans in the anti-vaccine movement were motivated to cling all the more tightly to him, circling the wagons and attacking anyone who had the temerity to point out his fraud, bad science, bad medicine, and massive conflicts of interest. For example, just last month, in response to criticism of Andrew Wakefield, J.B. Handley, the founder of the anti-vaccine group Generation Rescue, pointed out that people like him view Andrew Wakefield as “Nelson Mandela and Jesus Christ rolled up into one,” an assertion I couldn’t help but comment on, likening the anti-vaccine movement to a religion. Never mind that, scientifically speaking, Wakefield is just as discredited in his science as Dorothy Martin was in her predictions of global destruction. In the same article, anti-vaccine activist Michelle Guppy warned the reporter direly, “Be nice to him, or we will hurt you.” As you can see, the drip, drip, drip of allegations and evidence showing Andrew Wakefield to be a horrible scientist and even a research fraud has not had much of an effect on committed activists. I would argue, however, that it did have a significant effect on the media and the fence-sitters.
For the most part, most scientifically literate people know what cognitive dissonance is, but what is “motivated reasoning”? According to Mooney, to understand motivated reasoning, you first have to understand that what we humans call “reasoning” is not a cold, emotionless, Mr. Spock-like process. The way we human beings reason is actually suffused with emotion, or affect:
Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds–fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
We’re not driven only by emotions, of course–we also reason, deliberate. But reasoning comes later, works slower–and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.
As a result, if this hypothesis is accurate, it can be expected that people will almost always respond to scientific or technical evidence in a way that justifies their preexisting beliefs. Examples of evidence that support this hypothesis are listed, including the study I discussed two years ago using the example of the persistent belief that Saddam Hussein had a hand in engineering 9/11. Also discussed was a classic study from 1979 in which pro- and anti-death penalty advocates were exposed to two fake studies, one supporting and one refuting the hypothesis that the death penalty deters violent crime. In addition, they were also shown detailed scientific critiques of each study that indicated that neither study was methodologically stronger than the other. In each case, advocates were more likely to find the study that supported their bias more convincing and to be more critical of the one that did not.
To anyone who understands human nature, this is not particularly surprising. After all, as Simon & Garfunkel sang in their 1970 song The Boxer (one of my all time favorite songs), “a man hears what he wants to hear and disregards the rest.” That’s not quite motivated reasoning, but close. Motivated reasoning would be more along the lines of saying, “a man pays attention to information that supports his beliefs and values and finds ways to disregard or discount the rest.” This principle, more than anything else, probably explains why believers in alt-med and anti-vaccine activists are immune to disconfirming evidence. Not just immune, either, they actively seek out confirming evidence and avoid disconfirming evidence, a task made much easier by the Internet and multiple different news outlets catering to different ideologies:
Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information–through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”
We see this in the CAM movement. An entire network of websites and blogs has sprouted up over the last decade or so. CAM believers, if they wish, can peruse sites like NaturalNews.com, Mercola.com, and Whale.to, watch television shows like The Dr. Oz Show, and never see a single piece of information or study that challenges their world view that because it’s natural it must be better, that conventional, scientific medicine is hopelessly in the thrall of big pharma, and that modalities that are nothing more than magical thinking can cure disease. Similarly, anti-vaccine activists have their own set of websites, including Generation Rescue, Age of Autism, the NVIC, the Orwellian-named International Medical Council on Vaccination (formerly “Medical Voices,” and discussed by Mark Crislip), SafeMinds, and many others. These CAM and anti-vaccine sites also have their own scientific-seeming meetings, such as Autism One (which, by the way, is fast approaching again) and the AANP.
Wrapped safely in such a cocoon, believers seldom encounter arguments against their cherished beliefs, much less strong ones. No wonder they’re often so poor at defending their favorite woo when they dare to stray out of the safe confines of their little world. However, one interpretation of motivated reasoning that I’ve come up with is this: you don’t actually have to be good at producing arguments that convince other people; you just have to be good enough at cherry picking arguments to convince yourself.
Politics, CAM, and the anti-vaccine movement
While Mooney’s summary of the evidence for motivated reasoning is compelling, he stumbles a bit in trying to ascribe different forms of motivated reasoning to the right and the left. It is clear that certain forms of anti-science tend to cluster on the right or the left (for example, anthropogenic global warming denialism is definitely far more common on the right). But if motivated reasoning is a valid hypothesis, one that accurately describes how human beings react to information challenging their belief systems, with values mattering more (at least initially) than facts and science, then we would expect certain scientific findings to be viewed more hostilely by the right and others to be viewed more hostilely by the left. Unfortunately, one of the examples Mooney picks is fairly dubious:
So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and numerous Hollywood celebrities (most notably Jenny McCarthy and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus, notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.
It’s hard not to note right here that the founder of Whole Foods, John Mackey, is an anti-union Libertarian and admirer of Ayn Rand. In any case, I really hate it when people like Mooney try to pin anti-vaccine views as being mainly “on the left.” True, left-leaning crunchy types are the primary face of anti-vaccine views, but there is an entire underground on the right that is virulently anti-vaccine. These include General Bert Stubblebine III‘s Natural Solutions Foundation, far right libertarians, and others who want to protect their “purity of essence.” In addition, FOX News isn’t above pushing anti-vaccine nonsense. For example, of late the FOX and Friends crew has been doing sympathetic pieces on Andrew Wakefield, interviews with Dr. Bob Sears, SafeMinds’ anti-vaccine PSA campaign, and Louise Kuo Habakus (who is virulently anti-vaccine herself and politically active in New Jersey pushing for “philosophical exemption” laws). Politically, some of the most rabid anti-vaccine activists in government are conservative, for instance Representative Dan Burton. Moreover, conservative fundamentalist religion is not uncommonly a motivation for anti-vaccine views. Not surprisingly, Mooney’s example ignited a rather intense debate in the blogosphere, which included Mike the Mad Biologist, Razib Khan, Joshua Rosenau, Andrew Sullivan, David Frum, and Kevin Drum, among others.
This debate didn’t go very far in either direction because there aren’t actually many good data examining whether there is a correlation between political affiliation and anti-vaccine views. Ultimately, Mooney followed up with a post on his blog in which he did the best he could with the available polling data on the politics of vaccine resistance. Reanalyzing a 2009 poll that asked how many people were aware of Jenny McCarthy’s anti-vaccine views and how many were more or less likely to agree with them as a result, Brendan Nyhan and Chris Mooney found:
So here are the results: Liberals (41% not aware, 38% aware but not more likely, 21% aware and more likely); Moderates (48% not aware, 28% aware but not more likely, 24% aware and more likely); Conservatives (49% not aware, 28% aware but not more likely, 23% aware and more likely).
These results basically suggest that there’s little or no political divide in terms of who falls for Jenny McCarthy’s misinformation. Notably, liberals were somewhat more aware of her claims and yet were the least likely to listen to them. But not by a huge margin or anything.
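As a quick sanity check on those numbers (my own back-of-the-envelope arithmetic, not part of Nyhan and Mooney's analysis), one can condition on awareness: among respondents who had actually heard of McCarthy's claims, what fraction said they were more likely to agree? A minimal sketch in Python:

```python
# Percentages from the 2009 poll quoted above; the structure and
# variable names here are mine, for illustration only.
poll = {
    # group: (not aware, aware but not more likely, aware and more likely)
    "Liberals":      (41, 38, 21),
    "Moderates":     (48, 28, 24),
    "Conservatives": (49, 28, 23),
}

for group, (not_aware, aware_not_more, aware_more) in poll.items():
    # Among the "aware" respondents, what share was more likely to agree?
    aware_total = aware_not_more + aware_more
    pct_agree = 100 * aware_more / aware_total
    print(f"{group}: {pct_agree:.1f}% of aware respondents more likely to agree")
```

Run as written, this yields roughly 35.6% for liberals, 46.2% for moderates, and 45.1% for conservatives: differences too small and too inconsistent to support a clean left/right story, which is consistent with the conclusion above.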
Mooney also noted another poll done by Pew regarding whether vaccines should be mandatory:
What’s interesting here is that Pew also provided a political breakdown of the results, and there was simply no difference between Democrats and Republicans. 71% of members of both parties said childhood vaccinations should be required, while 26% of Republicans and 27% of Democrats said parents should decide. (Independents were slightly worse: 67% said vaccinations should be required, while 30% favored parental choice.)
Bottom line: There’s no evidence here to suggest that vaccine denial (and specifically, believing that childhood vaccines cause autism) is a distinctly left wing or liberal phenomenon. However, I will reiterate that we don’t really have good surveys at this point that are clearly designed to get at this question.
Even though the evidence is admittedly weak and more studies and surveys would definitely be in order, Mooney’s conclusion is nonetheless in line with my experience. I’ve said many times before that anti-vaccine views are the woo that knows no political boundaries. Although I don’t have hard scientific data to support this contention, and therefore can’t definitively discount the possibility that my observations represent confirmation bias, I’ve noticed that right wing anti-vaccine activists tend to be suspicious of the government, appeal to “health freedom” as a reason for their resistance to vaccination, and tend to eschew any societal obligation to contribute to herd immunity. Left wing anti-vaccine activists tend to be suspicious of big pharma and believe that vaccines are somehow “unnatural.” I realize my interpretation might be biased, but until better data are available it’s all I have to work with. Similarly, alternative medicine use tends not to fall into an easy left-right dichotomy either. My favorite example to illustrate this point is that, even though alternative medicine is viewed as a crunchy, “New Age” phenomenon more prevalent on the left, the Nazi regime actively promoted naturopathy and various other “volkish” alternative medicine modalities. I trust that now someone will invoke Godwin’s law, but forgive me; I intentionally used an extreme example to illustrate my point that all parts of the political spectrum can be prone to quackery.
Finally, Mooney makes another point that I quibble with:
Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy–for example, abortion–if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right–and so their minds become harder to change.
I would quibble somewhat with whether, in the case of science and medicine at least, the apparently “sophisticated” understanding of the issues possessed by ideologues is actually as sophisticated as it appears on the surface. In some cases it might be, but far more often it’s a superficial understanding with little depth, mainly because few lay people have the detailed scientific and medical background needed to apply the information. It’s often a matter of knowing facts, but not having the scientific experience, understanding of mechanisms, or sophistication to put them in context or apply them to the situation properly. Thus, the arguments of, for instance, anti-vaccine advocates often have the veneer of scientific sophistication, but to those knowledgeable about vaccines they are easily identified as utter poppycock. Examples abound, and include this “review” article by a man named David Thrower and every “scientific review” published by, for example, Age of Autism.
I can’t remember how many times, while “debating” in misc.health.alternative, someone would quote a study to me as supporting an antivaccination or other alternative medicine viewpoint, and when I actually took the trouble to look up the study and download the PDF of the actual article rather than just reading the abstract (which is all most lay people have access to and therefore all they read), I would find a far more nuanced and reasonable point, or even that the article didn’t support what the altie was saying. Another aspect that often comes into play is an extreme distrust of conventional medicine and/or the government, such that the few individual studies questioning the safety of vaccines are given far more weight in believers’ minds than the many more studies, and large meta-analyses, showing vaccines to be extraordinarily safe. Certainly this is one reason why the infamous Wakefield study, despite being shoddily designed and now thoroughly discredited, keeps rearing its ugly head again and again and continues to be cited by antivaccination activists as strong evidence that the MMR vaccine causes autism. Basically, what is happening here is that highly intelligent and motivated people can construct arguments that seem convincing to the uninformed.
One thing we as skeptics and supporters of science-based medicine must remember about motivated reasoning is that, as human beings, we are by no means immune to it. Indeed, as Mooney points out, citing recent research, it’s quite possible that reasoning is a better tool for winning arguments than for finding the truth, and when motivated reasoning combines with the echo chamber effect of modern social groups bound together by the Internet and like-minded media, the result can be disastrous for science:
But individuals, or groups that are very like-minded, may go off the rails when using reasoning. The confirmation bias, which makes us so good at seeing evidence to support our views, also leads us to ignore contrary evidence. Motivated reasoning, which lets us quickly pull together the arguments and views that support what we already believe, makes us impervious to changing our minds. And groups where everyone agrees are known to become more extreme in their views after “deliberating”; this is the problem with much of the blogosphere.
Actually, when I’m writing one of these logorrheic gems of analytic brilliance, I’m constantly asking myself whether I really am being analytically brilliant or merely being selectively brilliant in order to bolster my preexisting beliefs and values. In other words, am I doing from the other viewpoint the same things that anti-vaccine zealots, for example, do when they cherry pick and misrepresent studies in order to support their belief that vaccines cause autism? Of course, that’s where our readers come in, as does the fact that I (and, I have no doubt, every other SBM blogger) frequently ask myself that very question. As Richard Feynman famously said, “The first principle is that you must not fool yourself – and you are the easiest person to fool.” Science is simply a method for minimizing the chance that you will fool yourself. To say “I saw it with my own eyes” is not enough, but that is what our brains are hard-wired to believe.
That’s one reason why I’m far less concerned about winning over committed ideologues. Although such a task is possible and people do change their minds, sometimes even about things very important to them, for the most part expecting to win over someone like J.B. Handley, Jenny McCarthy, or Barbara Loe Fisher is a fool’s errand. The people who need to be educated are the ones who are either on the fence or otherwise susceptible to pseudoskeptical, sophisticated-sounding arguments from denialists because they do not understand science or the issues. Although it will by no means be easy, such a goal is at least achievable.