Monday, March 1, 2021

The fork in the road: science versus denialism and conspiracy theories

The world is awash in information. Never before have people had as much access to humanity’s collective knowledge as we do today. You want to know when the Normans conquered England? How many people use Weibo? Or what Machu Picchu would have looked like in its glory days? Simply pull out your phone and ask Siri. 

 

This cornucopia of knowledge should mean that people are in a position to make the best decisions possible, from choosing the insurance plans that best fit their needs to voting for candidates or political parties whose policies return optimal outcomes for individuals and society as a whole. Beyond individuals, this wealth of information should mean that evidence-based policy is easy to pursue and that outcomes for nations continually improve.

 

However, this is clearly not the case. The availability of knowledge doesn’t mean that evidence, fact and truth are utilized. Preconceived belief and ideology are important filters through which evidence is evaluated. Yet what is really disheartening about the use of knowledge and evidence is how others (individuals and organizations) with political and economic agendas filter and manipulate what is channelled to various audiences.

 

While we might naively refer to the modern era as one based on information and the democratization of evidence and knowledge, the reality is that we live in the era of disinformation. Disinformation is the active and knowing creation and spread of false information, like politicians saying a fair election was stolen. Misinformation is the cancerous offspring of disinformation, where this false information is shared by those unaware of its nefarious origins. Disinformation and misinformation have the power to derail robust democracies and motivate atrocities.

 

The study of the origins and valuation of knowledge is a complex, convoluted and challenging area, to say the least. But it is neither esoteric nor merely academic. Knowledge and understanding are the cornerstone of societal well-being and technological development, and ultimately underpin democracy. Public policy driven by misinformation and dismissal of basic facts is simply ill-equipped to deal with many of the problems we face. This is easily showcased by the dismal, and frankly embarrassing, chaotic COVID-19 response in the United States, a clear failure for proponents of evidence-based policy.

 

Knowledge and belief arise through a number of different endeavours that span social influences, logic and reasoning, and, importantly, the empirical claims of science. Science is the process by which we assess testable claims about the world. Scientists use accumulated knowledge and evidence to formulate questions or predictions and then ultimately assess these against experiments and observation. We commonly equate science with the scientific method, but what scientists actually do, and how they go about developing and testing explanations, is quite a bit more complicated. Philosophers of science, from Popper to Kuhn to Lakatos and on to Laudan, have argued about what demarcates science from other knowledge-gaining exercises, and these debates have, in some ways, been mired in a reliance on a scientific method that may or may not exist (see Lee McIntyre’s The Scientific Attitude for a wonderful overview).

 

The best way to think about science is to follow McIntyre’s lead, where science is both a process and a worldview. It is a process because it has rules in place to guide how we assess claims about the world. Perhaps more importantly, as a worldview that scientists subscribe to, we are willing to test our explanations against fair and unbiased evidence and are willing to alter our beliefs in light of countervailing evidence. Explanation and belief are constantly assessed and refined, or in some cases completely dropped, because we allow the real world to correct us. I’ve certainly gone through this process and have changed my thinking about the theories that I work on. More than once.

 

[Figure: the multiple avenues to gaining knowledge, with empirical science as one of them]

As the figure above indicates, there are multiple avenues to gain knowledge, and empirical science is one of them. I take a broad view of science, so that it includes a lot of what is done in the social sciences. Economics, for example, can draw on more than 80 years of empirical evidence from neoliberal policies to answer whether tax cuts or infrastructure investment results in greater economic growth (it’s the latter).

 

Science is one route to knowledge, insight and introspection about ourselves and our place in the universe. However, on matters of the observable world, it is the most important. Science starts with testable questions, which necessitate the collection and assessment of evidence (‘facts’), but this is where something can go wrong. People who don’t follow the rules of science (like objectivity, honesty and transparency) and who have a pre-ordained conclusion can simply use only the evidence that confirms their belief (confirmation bias) while downplaying damning evidence that shoots their theory full of holes (refutation bias). Once we hit this fork, we go down the path to denialism, pseudoscience and conspiracy theory.

 

We throw around these last three terms a lot when talking about anti-science and anti-fact movements like QAnon, anti-vaccine movements and flat-Earth proponents, but they are not actually synonyms, though they are clearly interrelated, and many irrational movements invoke all three.

 

Denialism refers to the refusal to believe empirical evidence that casts doubt on one’s belief or ideology. No amount of negative evidence can change the mind of an adherent. Positive evidence is given extremely high weight, often without critically examining its origins. But evidence is often not an important ingredient at all; it is merely convenient when it reinforces belief.

 

Pseudoscience uses the language of science and even purports to use empirical evidence and experimentation. However, the preferred explanation is assumed to be true, and all that is required is evidence to support it. Opposing explanations are assumed to be wrong, regardless of empirical support. A classic example was the shift from young-Earth creationism (which usually fell firmly in the denialism camp) to intelligent design (ID). ID attempted to avoid the language of creationism and instead used technical-sounding concepts like ‘irreducible complexity’ to conclude that a creator was a necessary ingredient to explain life. Unfortunately for ID, proponents’ claims have not been able to withstand rigorous testing, yet proponents still cling to fragile evidence to support their beliefs.

 

Finally, conspiracy theory has much in common with denialism, and it can be argued that you need to be a denialist in order to truly be a conspiracy theorist. However, in order to support their claims, conspiracy theorists go a step further and see a vast collusion of nefarious actors whose primary agenda is to undermine the ‘truth’. Take, for example, the recent claims of election fraud in the USA. Adherents to this conspiracy theory are willing to believe that dead dictators, Democratic leaders and a vast network of thousands of election volunteers were all part of an organized attempt to change the outcome of an election. Without e-mails. Or social media posts. Or any other evidence. Compare this to the fact that average people could easily figure out the identities of members of the mob that stormed Congress because of extensive social media threads and verbal communication with friends and neighbours. This strange juxtaposition can only lead us to one of two conclusions: either there was no election-rigging conspiracy, or those who stormed the Capitol are idiots and the thousands of election stealers are just that much smarter.

 

In all three of these cases, some form of authority or ideology is given more weight than reality. I have a couple of hypotheses about why this happens, especially in the USA, where nationalistic hubris creates a large gap between how great people believe they are and their reality. Instead of accepting that reality, feelings and scapegoats trump fact.

 

The dismissal of evidence has become commonplace in political life. No one said it better than Newt Gingrich. He basically argued that conservative voters believe America is more dangerous today than in the past, and when an interviewer confronted him with the fact that crime has been on a downward trajectory for a long time and that we are statistically safer today than a couple of decades ago, he responded that ‘liberals’ might have facts that are theoretically true, but his facts are true too. Remember Kellyanne Conway’s ‘alternative facts’, offered in defence of Sean Spicer’s claims? This thinking has been around for a while. Have conclusion, need fact.

 

On Christmas Day 2020, Wisconsin pharmacist Steven Brandenburg purposely destroyed hundreds of doses of the Moderna COVID-19 vaccine. It turns out that Mr. Brandenburg believes that the world is flat and that the Moderna vaccine was designed to harm people and also includes microchips for tracking. While we might chuckle at the absurdity of these beliefs, there is a deeper, more troubling issue at play. Mr. Brandenburg is a pharmacist, meaning that he not only has scientific training but also needs to make evidence-based decisions to help patients. As a supposedly scientifically literate person, he could have easily devised ways of testing his claims. For example, take a plane to Asia, then another to Europe, and then back to the USA. There: the world is not flat. As for microchips in vaccines, a simple compound microscope ought to be enough to observe these.

 

So, if a pharmacist is not willing to put the effort into testing easily refutable claims, why would we expect it of our bank teller, auto mechanic or Ted Cruz? This goes to the core of the problem. Given the politics of truth and fact, science and scientists no longer have any authority for many people. In fact, just being a scientist might be enough to get you dismissed as an agenda peddler or a member of some number of absurd conspiracy theories.

 

There is no doubt that vaccines have saved more lives than almost any other medical technology. Yet no other medical treatment or intervention has elicited more skepticism and outright rage than vaccines. There is no rational reason for this; the evidence is very clear. But there is a denialist and alarmist narrative that plays on parents’ anxiety about the health of their children and mistrust of science.

 

In 1998, Andrew Wakefield published a paper in the prestigious journal The Lancet, in which he reported a link between MMR vaccines and autism. This paper should never have been published. It was based on a sample of just 12 children, and there was evidence that Wakefield altered data and records. The paper was retracted by the journal, which is pretty much the worst public humiliation a scientist can experience. It is a recognition that you broke the sacred rules of science, and it is a shame you wear for the rest of your career. Despite this public shaming, non-scientific audiences gravitated to his messaging in books and paid lectures.

 

Today, many thousands of people believe that vaccines are bad for children and might cause autism. Of course, these same people would probably have no problem taking antibiotics for an infection, receiving chemotherapy for cancer or eating a hotdog when hungry, despite the fact that they probably can’t tell you exactly what is in these. Why vaccines? That is an interesting question. Maybe it’s just serendipity that this was the target of Wakefield’s fraud, or maybe it’s because of the violation of having a needle pierce your skin, or maybe it is because of the undeniable success of vaccines, which has made the diseases they prevent all but invisible.

 

This vaccine denialism not only resulted in the re-emergence of nearly eradicated childhood diseases in places like Paris and Los Angeles, but it also wasted money and time that could have been put to better use researching new therapies. The response required ever-increasing numbers of studies to show that there were no links between vaccines and autism. In one of the largest assessments, Anders Hviid and colleagues examined and analyzed the health records of more than 650,000 Danish children over more than 10 years, and they simply did not find any link between MMR vaccines and autism.

 

If you happen to be someone who doubts the safety and efficacy of vaccines, ask yourself why, and where you are getting your information from. Then ask yourself: if you were, unfortunately, diagnosed with cancer, would you trust your doctor’s recommendation that you start radiation or chemotherapy? If so, despite not really understanding what constitutes ‘chemotherapy’, you’d be trusting your doctor’s knowledge and expertise. Why would you dismiss this same doctor when it came to vaccine advice? You can’t have it both ways; that is irrational.

 

So, where does this leave us? In a quagmire, for sure. But it also means that those of us who practice, use or teach empirical science have the knowledge and scientific understanding to engage in dialogues about important issues, whether about climate change or vaccines. It doesn’t mean we need to be political (though we should engage with political structures), and we don’t need to be dismissive. We can ask questions to understand people’s mistrust or where they are getting their information from. I find that the best way to engage is to be affirmational and dispassionate (which can be hard for me). I recently had a conversation with someone who wasn’t going to get a COVID vaccine: I asked a bunch of ‘why’ questions, started my statements with phrases like “I can understand why you’d be unsure…”, and then laid out the medical and public health facts about vaccines.

 

The only way to counter disinformation is with the light of evidence. Not everyone will abandon their conspiracy theories, but many have been fed misinformation, and scientific understanding and fact can really help people make better decisions for themselves.
