=====
We live in the Misinformation Age.
Open your news feed, Facebook, or your inbox full of emails from That One Uncle we all have, and you’re going to be tangled in wild and contradictory claims about science. Left, Right, and Center… if humans are involved, then people are going to be spinning B.S. It’s enough to drive many to despair - nothing is real, everything is dumb - but hear me out, there’s hope!
A side perk of taking a science course is that it should add a few new tools to your Field Kit for Critical Thinking. As you stride forward into the strange days ahead, I hope you’ll wield them to cut through this dense forest of B.S. and keep yourself and yours out of trouble. The tools work together, but here they are in rough order of use:
TOOL 1: Check your dang sources! What is the claim, is it accurately reported, and what evidence supports it?
On the face of it, this is the easiest tool to use, and it doesn’t have to be particularly time-consuming. Unless you’re a practicing scientist in a narrow field, you probably don’t sit down with a cup of coffee and a scientific journal to read hardcore primary research. Everything else is filtered through some other outlet: a university press release, a newspaper, a dedicated science aggregator (like sciencenews.org), or entertainment media. The further you get from the original science, the more likely you are to encounter outright errors or severe distortion of the research findings. So, what do we do?
First, demand a source if one isn’t provided. Lots of claims can be immediately exposed as, er, “creative narrative” with no basis in reality. To be charitable, there are some fanciful imaginations out there! Everyone is entitled to their own opinion, but not their own facts. Despite many recent arguments to the contrary, facts are still a thing (convenient or otherwise) and we can objectively learn about the universe using reason and evidence. Willfully ignoring facts because you don’t like them is magical thinking, is prevalent throughout the political spectrum, and is extremely dangerous.
Next, quickly check out the source. Is it from a remotely credible outlet [more on this in Tools 2 + 3]? If not, can you cross-check the claim with other, more trustworthy sources? For something sufficiently important, or if you’re embroiled in an argument with someone, there’s no substitute for going to the original research… except that much of the time, it’s behind a paywall or completely opaque to non-experts. Do different sources report the claim differently, with numbers that don’t match up, conclusions spun in different directions, and so on?
If you take one thing away from this, it’s this maxim from the legendary Carl Sagan: Extraordinary claims require extraordinary evidence. Despite what typical, formulaic science reporting would have you believe, scientists aren’t constantly “baffled” and don’t “have to rewrite the textbooks” three times a week. Science very seldom moves forward in huge leaps! Mature fields have a good idea of what’s going on in their little corner of science, and new evidence mostly refines that understanding or fills in gaps. Completely overturning a robust explanation of something requires pretty darn strong evidence to justify it. For example, we have a deep understanding of how viruses in general work and respond to treatment, so it would take an awful lot of careful studies with strong conclusions to substantially change that model. By contrast, our understanding of new fields (like the biology and treatment of COVID-19, a novel virus) can be modified quickly based on decent evidence. If we don’t yet firmly grasp what’s going on, lots of little nudges in the same direction can bring about large changes in our understanding.
Most people live their daily lives this way: if you have a very good reason to believe something, it takes a lot to convince you otherwise; but if you aren’t sure, much less will do. Science is fundamentally different from matters of faith because, if the evidence is strong enough, good scientists are obligated to change their minds. In short, “keep an open mind, but not so open that your brain falls out.”
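If you like seeing that idea written down, Bayes’ theorem captures it neatly (to be clear, this little formalization is my own illustration, not something from Sagan):

\[
P(\text{claim} \mid \text{evidence}) \,=\, \frac{P(\text{evidence} \mid \text{claim}) \; P(\text{claim})}{P(\text{evidence})}
\]

How believable a claim is after you see the evidence depends both on how well the evidence fits the claim and on how plausible the claim was before you looked. If the prior, P(claim), is tiny because the claim would overturn a well-tested model, the evidence has to be extraordinarily strong before the result climbs anywhere near “probably true.” If the prior is genuinely uncertain, as in a brand-new field, modest evidence can shift it a long way.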
TOOL 2: Evaluate expertise! Is this person really qualified to make that claim?
Now we’re getting into the tougher stuff. Who is making the claim in your (hopefully) credible source? In descending order, here’s who you should trust to accurately report a claim:
1) The scientists who did the original research (the claim will be undistorted, but seek corroboration of its significance [see Tool 3 below]);
2) Other scientists in that same narrow field;
3) Non-partisan governmental organizations responsible for overseeing research and making recommendations, like the CDC;
4) Relatively unbiased and accurate media outlets reporting news, NOT Opinion articles that are easily confused with news, even in otherwise legitimate outlets. Here is a link to a widely used media bias chart that can help you assess sources.
5) Everyone else.
Number 5 is where the real danger comes in. To see why, let’s pause to talk about expertise.
Expertise means having true mastery of a skill or knowledge. This isn’t something you get in a day, a week, or a year; expertise is earned over many dedicated years of study and practice. Consider a master electrician, a physician, or someone who has run a restaurant for 30 years. Expertise is real, powerful, and very narrow in scope. Just as you don’t want a dentist installing new gas lines in your home, or a skilled carpenter taking out your gall bladder, those with expertise need to stay in their lane. Unfortunately, there’s a trap here. It’s all too easy for those with expertise in one specific field to incorrectly assume that they are also experts in other fields that may or may not be closely related to their area of mastery! “I am the world’s expert in how viruses replicate their genomes, and therefore I can confidently talk about how viruses spread in a population.” Uh, no. You really can’t. These “appeals to authority” are a very common informal logical fallacy, and unfortunately, lots of otherwise brilliant people get themselves into trouble this way. It’s particularly visible among older Nobel laureates, who sometimes start spouting nonsense using the exposure won through their past successes.
Here’s the best way I’ve found to help people avoid falling into this trap, particularly if they’re older, confident adults with expertise of their own (for example, relatives at Thanksgiving…). Say your uncle spent his career designing jet engines. Ask him how much he thinks the general public understands about designing jet engines, versus his personal expertise. He’ll laugh and say, “not a darn thing!” Then gently point out that this is how the experts in any field feel about EVERYONE ELSE outside of that field. Experts in vaccine development recognize that you know “not a darn thing!” about vaccine safety, just as they know nothing of designing jet engines. Intellectually honest people should really stop and reflect on that for a while.
Those who don’t absorb this lesson exemplify what’s called the Dunning-Kruger effect (and look this up, it’s wild). In short, people with the least knowledge about a subject most strongly overestimate their knowledge of it! It’s easy to understand why: they don’t know enough to know what they don’t know. Yeah, read that again. If you’ve ever walked into a test feeling like a rockstar, but then completely bombed it because you didn’t realize that you were unprepared… that’s Dunning-Kruger. We’ve all been there.
Clearly, no one can be an expert on everything. Your professors are typically world experts in one very specific topic, and they can (with varying degrees of success) extend that expertise to teach you about the broader field. Hopefully they are honest about that and will tell you, when they start to get outside of their lane, “That’s a great question! I don’t know, but I’ll find out!” In my courses, particularly when we covered something so truly bizarre or horrifying that it stretched belief, I made a point of encouraging you to fact-check me. Some of you would do that, and return with haunted eyes. That’s implied in every lesson of every class you will ever take: trust, but verify.
Why am I belaboring this explanation of expertise? I warned you that there’s a danger here, and it lies in confusing experts in a field who can credibly comment on a claim (#2 above) with experts in different fields who aren’t staying in their lanes but are treated as authorities because of their unrelated expertise and credentials (#5). As examples, let’s talk about two prominent media personalities who have been back in the news lately: Dr. Oz and Dr. Phil.
Dr. Mehmet Oz is an accomplished cardiac surgeon on the faculty at Columbia University. He also hosts a television program in which he makes dubious medical claims about supplements and other interventions. Independent evaluators have demonstrated that at least half of these are either inaccurate or totally baseless. In fact, he was summoned to testify before Congress, and an attempt was made at formal censure by the American Medical Association. Why would he do this? We’ll think about that below in Tool 3. Recently, although he has zero expertise in epidemiology or virology, he has regularly appeared on TV programs making claims about COVID-19 that are again contradicted by the best available evidence.
For more on the highly questionable practices of Dr. Oz, check out this article in the American Medical Association’s Journal of Ethics.
Dr. Phil McGraw, another television personality raised to prominence by Oprah Winfrey, is a clinical psychologist… not even a physician and not licensed to practice (check out his Wikipedia biography for some sordid details). Like Dr. Oz, he has also been back on the media circuit making claims about COVID-19, and his celebrity has granted him a platform to talk about something that is very far indeed from his area of expertise. So, uh. Why do people do this?
TOOL 3: Evaluate motivations! What are they trying to sell you… and what are you trying to sell to yourself?
When someone makes a claim, always ask yourself: what are they trying to sell me? Everyone is trying to sell you something… otherwise, they wouldn’t be engaging with you in the first place. Much of the time, this will be some kind of political/policy stance, meant to influence you to their side (and reinforce their own opinions too). They may be trying to sell you a physical product, with convenient links on their website. They may even just be sowing chaos for “fun,” like typical online trolls. Some do all of these at once! Alex Jones, the prominent host of InfoWars, has built an empire selling bogus conspiracy theories and physical products. The cornerstone of his legal defense during a custody trial for his children (which he lost) was that he is a “performance artist” and his InfoWars personality is a dramatized character not meant to be taken seriously. This is important context for his legions of dedicated fans, particularly as he continues to insert himself into national politics… as a self-proclaimed cartoon character.
Fear is an incredibly powerful motivator, driving people to want clear-cut and simple solutions (or denials) of their problems. Bad actors thrive in this environment, making it more important than ever to tread lightly and use your tools. If everyone is trying to sell you something, find the “products” that are most legitimately fact-based, helpful, and transparent. If you’re looking at an accurately reported claim [Tool 1], made by a governmental body like the CDC that is typically apolitical and charged with tracking the progress of a pandemic and the nation’s response [Tool 2], then they are most likely trying to “sell” you a fact-based set of recommendations on how to keep yourself safe and improve national outcomes. If you’re looking at a medical claim without cited supporting evidence [Tool 1], by a TV personality with a very poor track record of providing accurate advice [Tool 2], and they receive promotional consideration ($) for touting that product while maintaining their national brand [Tool 3], it’s probably B.S. In his response to a formal complaint by other physicians, Dr. Oz said he presents his claims “without conflict of interest.” Well sure, if you totally ignore the money and airtime; the estimated net worth of Dr. Oz is disputed, but in the tens of millions of dollars at the low end of the range. The most famous huckster in history, P.T. Barnum, recognized that when people are given a choice between attractive “humbug” and cold, hard reality… reality doesn’t stand a chance.
That is a deeply important lesson, and one that brings us to our last point. What are you trying to sell to yourself?
Everyone suffers from cognitive biases… it’s just how humans work. We want certain things to be true because they reinforce our values or our self-image, or simply because they’re attractive or reassuring. In many cases, bright people with expertise can become even more susceptible to B.S. because they are skilled at motivated reasoning, coming up with justifications for false beliefs that sound plausible but don’t stand up to scrutiny. If you really want something to be true, take a step back and carefully reassess why you want it to be true and the extent to which that influences your view of the claim. Oh man, does this take practice. Motivated reasoning has never been more of an attractive nuisance than it is today, in the Misinformation Age of narrowcast media. It is easier than ever to get trapped in an echo chamber of identical, self-reinforcing viewpoints that may or may not correlate with reality. Navigating this strange world, just like success in science, requires an awful lot of humility. Recognize that your expertise is limited, that you are probably wrong about many things in ways both large and small, and that the search for Truth is going to require you to cut through that forest of B.S. seeded daily by swindlers, politicians, and otherwise well-meaning but confused people. Hidden in this shady grove are still some good folks trying to help.
It’s wild out there. Good luck - I believe in you.
-NA