Earlier this week, I discussed the prevalence of what I called “reasonable” apologists for the antivaccine movement. This description also applied to the same sort of “reasonable” apologists for those promoting antimask, anti-“lockdown,” and, of course, antivaccine COVID-19 disinformation, because the Venn diagram of prepandemic antivaxxers and COVID-19 cranks is damned near a circle. Today, I plan on being more general and discussing how the concept of the “reasonable” apologist applies to the entire spectrum of science denial, of which the antivaccine movement and the anti-public health movement energized by the pandemic are currently only the loudest and most visible components.
What inspired me, if you will, to generalize a bit on this topic was an article by Tom Chivers entitled How not to talk to a science denier. Basically, as is so often the case with “reasonable” apologists of science denial, Chivers starts with a seemingly reasonable premise, namely that demonization of one’s opponents can be counterproductive, and naïvely runs straight off the rails with it. In doing so, he falls prey to exactly the same problem that I discussed with respect to “reasonable” apologists just the other day. The wrinkle is how he invokes Bayesian reasoning—very badly—to justify his stance compared to the apparently “absolutist” thinking of Popperians.
Let’s start with Chivers’ reasonable criticism, which doesn’t take long at all to morph into “reasonable” apologia for science denial and an assertion of his own moral and intellectual superiority because he is not so “absolute”:
Imagine you bought a book with the title How to Talk to A Contemptible Idiot Who Is Kind of Evil. You open the book, and read the author earnestly telling you how important it is that you listen, and show empathy, and acknowledge why the people you’re talking to might believe the things they believe. If you want to persuade them, he says, you need to treat them with respect! But all the way through the book, the author continues to refer to the people he wants to persuade as “contemptible idiots who are kind of evil”.
At one stage he even says: “When speaking to a contemptible idiot who is kind of evil, don’t call them a contemptible idiot who is kind of evil! Many contemptible idiots find that language insulting.” But he continues to do it, and frequently segues into lengthy digressions about how stupid and harmful the idiots’ beliefs are. Presumably you would not feel that the author had really taken his own advice on board
This is very much how I feel about How to Talk to A Science Denier, by the Harvard philosopher Lee McIntyre.
I will admit that I have not read McIntyre’s book and thus cannot comment on whether Chivers’ characterization of McIntyre’s central premise is accurate—let’s just say that I suspect it’s…exaggerated—but, based on Chivers’ opprobrium, I think that perhaps I should look into acquiring a copy and reading it. He sounds like my kind of guy, and I also suspect that McIntyre was engaging in a bit of humor to make his point, a snarky approach that Chivers either didn’t get or didn’t like (or both). Still, let’s consider the germ of a reasonable point in what Chivers is saying. Yes, you’re not going to change anyone’s mind by dismissing them as evil. Then again, I don’t know any science communicator who believes otherwise, much less one who seriously argues that we should approach the job of communicating science to those who have bought into the conspiracy theories of science denial by insulting them. On the other hand, as I pointed out before, the key is to recognize whose mind is and is not open to evidence, counterarguments, and reason. To reiterate what I emphasized the other day, hard core antivaxxers have internalized their antivaccine belief system to the point where it is every bit as much a part of their identity as their religion and political ideology. Think of how difficult it is to persuade someone to abandon their religion in favor of another (or of no religion). It’s possible, but only with great effort, and then only in those who were already receptive to change anyway.
To be honest, though, from the blurb describing McIntyre’s book, I’m not sure why Chivers is so hostile to him:
Drawing on his own experience—including a visit to a Flat Earth convention—as well as academic research, McIntyre outlines the common themes of science denialism, present in misinformation campaigns ranging from tobacco companies’ denial in the 1950s that smoking causes lung cancer to today’s anti-vaxxers. He describes attempts to use his persuasive powers as a philosopher to convert Flat Earthers; surprising discussions with coal miners; and conversations with a scientist friend about genetically modified organisms in food. McIntyre offers tools and techniques for communicating the truth and values of science, emphasizing that the most important way to reach science deniers is to talk to them calmly and respectfully—to put ourselves out there, and meet them face to face.
This sounds almost exactly like what Chivers is arguing. So where’s the problem? Here’s where the “reasonable apologia” comes in. First, Chivers notes:
McIntyre wants to help us change people’s minds. Specifically, to help us change the minds of these strange, incomprehensible people called “science deniers”. He addresses five main groups of “deniers”: flat earthers; climate deniers; anti-vaxxers; GMO sceptics; and Covid deniers.
These are, of course, all science deniers. Granted, among the entire group, arguably the least harmful are the flat-earthers, a tiny group quite correctly viewed by the vast majority of the public as cranks. To borrow how Douglas Adams’ The Hitchhiker’s Guide to the Galaxy characterized Earth, flat-earthers are, as science deniers go, “mostly harmless,” if not entirely harmless. In marked contrast to the other specific types of science deniers, though, flat-earthers are so obviously wrong that no one other than another flat-earther would think they are anything but cranks.
And that is where Chivers goes off the rails. He doesn’t like the other kinds of science deniers being lumped in with flat-earthers. Oh, no. He doesn’t like it at all, which leads him to opine:
The framing is that McIntyre goes and meets representatives of these groups and tries to persuade them out of their wrong beliefs. He goes armed with social-psychology research about how best to persuade people. His big trick (which I think is a good, if limited, one) is asking: what evidence would it take to make you change your mind?
But the whole book is premised on one idea: McIntyre is right, and the people he is “talking to” are wrong.
And it’s true that all five groups are wrong, or at least their central claims are. The earth is in fact an oblate spheroid; the climate is warming, due to human influence, and will likely have severe negative impacts; vaccines work; GMOs are safe; and Covid is real.
The trouble is that by using these groups, McIntyre is playing on easy mode. When your example of a “science denier” is a literal flat-earther, it’s easy to say “look over there at the crazy deniers”.
Note the weasel words in Chivers’ characterization of the claims of the five groups. They’re wrong or “at least their central claims are” wrong. That qualification gives away his game as a “reasonable” apologist for deniers of climate science, vaccine science, GMOs, and the reality and severity of COVID-19.
Don’t believe me? Then behold, as Chivers dives straight into the fallacy of the middle ground, painting the less extreme science deniers as so very “reasonable”:
Even with climate change scepticism, sure, there are people who literally don’t believe that anthropogenic greenhouse gases are warming the planet. But those people are relatively rare. People who believe that anthropogenic greenhouse gases are warming the planet, but that the emissions are going to be hard to stop because of economic growth in the developing world and it would make more sense to concentrate on adaptation rather than mitigation, are much more common. Are they “deniers”? Certainly they’re often called deniers. But McIntyre himself acknowledges that China is by far the largest emitter of greenhouse gases and that the IPCC says the sweeping global changes required to cut emissions sufficiently to avoid a 1.5°C warming are unprecedented.
What Chivers fails to understand about this particular variety of climate science denier is that this “adaptation not mitigation” variety is a result of goalpost moving. Climate science deniers shifted to that position as the evidence that anthropogenic global climate change is happening and is largely caused by humans became incontrovertible. Indeed, there’s even a name for advocates of this form of climate science denial: lukewarmers. Unlike hard core deniers, lukewarmers do not deny the physics of greenhouse gases and the now overwhelming mountain of evidence supporting the conclusion that CO2 generated by humans burning fossil fuels is warming the climate. Their game is cleverer. They use the same techniques that climate science denialists use (distortion and cherry picking of evidence coupled with ideological arguments) in order to minimize the perception of the threat posed by climate change and thereby argue that the risks of climate change are not large enough to justify strong and urgent action by governments.
As is the case with a lot of science denial, the goal is ideological. Lukewarmers don’t like the government intervention and the disruptions to current industry models that will be necessary to decrease CO2 emissions sufficiently to mitigate the worst effects of climate change; so, instead of denying that climate change is happening or a problem, they say we can just adapt to it, which of course involves ignoring what are admittedly worst case scenarios in which climate change is so severe that it threatens the survival of human civilization. In this, the lukewarmers are a lot like COVID-19 deniers, who use a very similar gambit when discussing mitigation measures to slow the spread of the disease. Unlike the crankier COVID-19 deniers who deny that SARS-CoV-2, the virus that causes the disease, exists (and sometimes even deny germ theory), this subtype of COVID-19 deniers (or, as I often call them, minimizers) cite, distort, and cherry pick science and evidence to argue that the disease is not as huge a threat as public health officials warn. The logical endgame of this argument is that mitigation measures (e.g., mask and vaccine mandates, lockdowns, and other interventions) are unnecessary overreactions that cause more harm than good, just as lukewarmers argue that global climate change is an insufficient threat to warrant strong action and that such action causes more harm than good. That Chivers can’t or won’t see this is rather odd, given that it’s an obvious characteristic of science denial movements; they subconsciously craft their denial to serve their ideologies.
What really irritates Chivers the most, though, is how McIntyre correctly understands the central role of conspiracy theories in science denial:
McIntyre constantly wants to make a clean distinction between “science deniers” and non-deniers. So, for instance, he says that there are five “common reasoning errors made by all science deniers” [my emphasis]. They are: cherrypicking, a belief in conspiracy theories, a reliance on fake experts, illogical reasoning and an insistence that science must be perfect. If you don’t make all five of those errors, you’re not an official McIntyre-accredited science denier.
Hang on, though. A “belief in conspiracy theories”? McIntyre spends a lot of time talking about the tobacco firms who manufactured doubt in the smoking/lung cancer link, and the oil firms who did the same with the fossil fuel/climate change link. He says that the spread of Covid denialism through the US government was driven by Republican desire to keep the economy open and win the election. Aren’t these conspiracy theories?
There is only one appropriate reaction to such rhetorical questions:
I was tempted to include multiple other memes featuring the facepalm, but thankfully I restrained myself.
Assuming Chivers is correctly describing McIntyre’s argument, it is McIntyre who is correct, not Chivers, who appears to be disingenuously conflating genuine conspiracies with the concept of conspiracy theory in order to attack the idea that science denial is rooted in conspiracy theory. This isn’t even subtle, and his rhetorical questions about tobacco and fossil fuel companies’ efforts and the efforts of right wing populists that have infected the Republican Party to promote specific forms of misinformation are, quite simply, either disingenuous or evidence of black hole level ignorance that threatens to suck all science, knowledge, and reason into it.
Not that he doesn’t try to justify this conflation:
Ah, but for McIntyre these aren’t conspiracy theories, they’re conspiracies. The distinction is “between actual conspiracies (for which there should be some evidence) and conspiracy theories (which customarily have no credible evidence).”
So, since some anti-vaxx conspiracy theories like the polio vaccine giving children polio, or the CIA using fake vaccination stations to take people’s DNA, are true, does that mean anti-vaxxers don’t believe in “conspiracy theories” but “conspiracies”?
Obviously not. But the point is that there’s not some clear line between “real conspiracies” and “conspiracy theories”. When Alex Jones says that chemicals in the water are turning frogs gay, he’s referring to real claims that endocrine disruptors are affecting sexual development in lots of animals. It’s not easy to draw a line between real and fake, evidence-based and not evidence-based.
Chivers is making a really facepalm-worthy, cringe-inducing argument here. For one thing, none of the incidents that he cites were actual “conspiracies.” For example, let’s consider the first historical “conspiracy” that he cites, the Cutter incident. Regular readers know that the Cutter incident was a catastrophe in the history of vaccines that marred the very first mass vaccination campaign against polio in 1955. In brief, a manufacturer’s process failed to completely inactivate the poliovirus being used to manufacture its polio vaccine. As described in the link, more than 200,000 children in five western and midwestern states received this defective polio vaccine that still contained some live virus; within days there were reports of paralysis, and within a month the first mass vaccination program against polio had to be abandoned. Although a tragedy, the Cutter incident was not a “conspiracy,” as a key component of conspiracy theories is that there must be someone powerful “suppressing the Truth” and that the conspiracy to do so must be ongoing. Going against this definition is the fact (as documented by no less a vaccine advocate than Dr. Paul Offit!) that the Cutter incident was immediately and thoroughly investigated, an investigation that led to laws and regulations that made vaccines among the safest of medical products. As I’ve said many times, if the Cutter incident was “covered up,” it was the most incompetent coverup in the history of conspiracies.
That was just the most ridiculous example. Yes, the CIA did run a fake vaccination campaign to try to get DNA to identify Osama bin Laden. However, although that real conspiracy happened, it is not the same as the conspiracy theories woven by antivaxxers, who point to it, without actual evidence, as an “example” justifying their claims about vaccines being a plot for “control.” As bad as this incident was, it was done for a very specific purpose, to identify and confirm the location where Osama bin Laden was hiding, and was not part of a more general “conspiracy” of the type promoted by antivaxxers. As for that last part referencing Alex Jones, I couldn’t help myself; another facepalm was indicated:
Here, Chivers appears to be conflating the incidents, pieces of scientific evidence, or claims that conspiracy theorists like Alex Jones reference to bolster their conspiracy theories with the conspiracy theories themselves. One can’t help but wonder if Chivers buys the central antivaccine conspiracy theory behind, for example, VAXXED, the antivaccine propaganda-fest disguised as a documentary, simply because there was an actual heated disagreement among the CDC investigators who did the study at the center of the conspiracy theory over which parts of the data to include in the final paper. Maybe he thinks that COVID-19 conspiracy theorists cherry picking various scientific reports that scientists denounce as poor science (or at least debatable) is the same thing as a conspiracy.
Chivers wouldn’t really like my point of view, I’m sure. Remember, I’ve argued (and backed up my argument) that antivax beliefs are a conspiracy theory at their core, as is all science denial. Whatever the form of science denial you examine, there will be a conspiracy theory behind it. Always. No one has yet been able to show me a type of science denial without a conspiracy theory at its core, and I keep looking, because I do test my ideas against reality.
Perhaps what irritates me the most about Chivers’ article is the part near the end, when he goes all “only a Sith deals in absolutes” on McIntyre:
I think the basic problem is that McIntyre is a Popperian. That is, in hugely oversimplified terms, he believes that no amount of evidence can confirm a theory: but evidence can falsify it. “If we find only evidence that fits our theory, then it might be true,” he writes. “But if we find any evidence that disconfirms our theory, it must be ruled out.”
I, on the other hand, am a Bayesian. I have some prior belief and I assign some level of probability to it: “climate change is real and dangerous”: 90%; “the world is flat”, 0.1%. And then each new piece of evidence shifts my belief a little: if next year NASA say “we got new photos in, looks like Earth is sitting on the back of a turtle”, then I’ll upgrade my belief in a flat earth to, I dunno, 1.5% (but also upgrade my belief in there being mad people at NASA to 95%).
So I don’t need to draw a bright line between “denial” and “reality”. I can say: “I think it’s likely that tobacco firms conspired over lung cancer, but I think it’s pretty unlikely that NASA faked the moon landings.” And I can update my beliefs as new evidence comes in. I don’t have to “rule anything out”, I can simply downgrade how likely it is.
McIntyre, though, is stuck with two categories: things that might be true; and things which have been “disconfirmed”. If you believe things that have been disconfirmed, then you must be a “denier”. And so he needs to find ways of explaining why these “deniers” are so different from the rest of us.
Chivers is, of course, simplifying the idea of Bayesian thinking to a level that made me cringe. His point that scientific conclusions can be viewed in terms of probabilities that they are true is more or less accurate, but his examples reveal that he’s a pretty piss-poor Bayesian. Let’s consider one of his examples. Chivers suggests that the probability that the earth is flat is currently 0.1%. (Obviously, I’m ignoring the other part of his example about the giant turtle.) How many orders of magnitude too high is that quoted probability? Let’s put it this way. There are some conclusions in science that are so settled, so supported by overwhelming mountains of evidence from many lines of study and many different scientific disciplines, that there is no practical difference between Bayesian and Popperian thinking about them. The conclusion that the earth is not flat is one of them. A key part of Bayesian reasoning is assigning prior probabilities to one’s hypotheses that are as accurate as possible. Failing to do that leads to vastly overestimating or underestimating the probability that a conclusion is valid, and Chivers did both, ridiculously overestimating the probability of a flat earth and severely underestimating the probability that human-induced climate change is real and leading to potentially catastrophic effects.
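The arithmetic makes the point concretely. In odds form, Bayes’ rule just multiplies prior odds by a likelihood ratio, so an inflated prior contaminates every subsequent update. Here’s a minimal sketch, with purely illustrative numbers of my own choosing (the likelihood ratio of 10 and the 1e-15 prior are my assumptions, not anything from Chivers or McIntyre):

```python
# A minimal sketch of Bayesian updating in odds form.
# All numbers are illustrative assumptions, not measured quantities.

def update(prior_prob, likelihood_ratio):
    """Return the posterior probability after one piece of evidence.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | not hypothesis)
    """
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Chivers' stated prior that the earth is flat: 0.1% (1e-3).
# A single ambiguous "NASA photo" with a modest likelihood ratio of 10
# already pushes that prior to roughly 1%:
print(update(1e-3, 10))   # ~0.0099

# A prior actually calibrated against the mountain of contrary evidence
# (say, 1e-15, itself only a stand-in for "astronomically small") barely
# moves in absolute terms after the very same piece of evidence:
print(update(1e-15, 10))  # ~1e-14
```

With a well-calibrated prior, no realistic single observation can rescue a flat earth, which is exactly why, for settled science, the Bayesian and the Popperian end up in the same place.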
I like to use the example of homeopathy to illustrate this principle. Yes, it is just barely possible that water might have memory and that diluting a remedy to the point where not a single molecule is likely to be left might produce an effective medical remedy. However, for this contention to be possible, the central well-established laws and theories of physics, chemistry, physiology, and biochemistry would have to be not just wrong but spectacularly wrong. How likely is it that these laws and theories will be found so wrong that homeopathy becomes possible? About as likely as the earth being flat; in other words, so low as to be, for all intents and purposes, impossible.