
Why Misinformation Is About Who You Trust, Not What You Think

Two philosophers of science diagnose our age of fake news.


“I can’t see them. Therefore they’re not real.” From which century was this quote drawn? Not a medieval one. The utterance emerged on Sunday from Fox & Friends presenter Pete Hegseth, who was referring to … germs. The former Princeton University undergraduate and Afghanistan counterinsurgency instructor said, to the mirth of his co-hosts, that he hadn’t washed his hands in a decade. Naturally this germ of misinformation went viral on social media.

The next day, as serendipity would have it, the authors of The Misinformation Age: How False Beliefs Spread—philosophers of science Cailin O’Connor and James Owen Weatherall—sat down with Nautilus. In their book, O’Connor and Weatherall, both professors at the University of California, Irvine, illustrate mathematical models of how information spreads—and how consensus on truth or falsity manages or fails to take hold—in society, but particularly in social networks of scientists. The coauthors argue “we cannot understand changes in our political situation by focusing only on individuals. We also need to understand how our networks of social interaction have changed, and why those changes have affected our ability, as a group, to form reliable beliefs.”
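The models the book draws on come from the network-epistemology tradition of the economists Venkatesh Bala and Sanjeev Goyal: simulated agents choose between two actions, gather noisy evidence about which works better, share that evidence with their neighbors in a social network, and update their credences on everything they see. As a rough illustration only (this is our sketch, not the authors' code: the complete-network assumption, the two-hypothesis simplification, and all parameter values here are ours), a minimal version might look like this in Python:

```python
import random

def likelihood_ratio(k, n):
    # Ratio of P(k successes in n trials | action B succeeds with prob 0.6)
    # to P(same data | B succeeds with prob 0.4). Action A is known to pay 0.5,
    # so "B is better" and "B is worse" are the two rival hypotheses.
    return (0.6 ** k * 0.4 ** (n - k)) / (0.4 ** k * 0.6 ** (n - k))

def simulate(n_agents=10, trials=10, rounds=50, true_p=0.6, seed=1):
    rng = random.Random(seed)
    # Each agent's credence that B is the better action.
    credences = [rng.random() for _ in range(n_agents)]
    for _ in range(rounds):
        # Agents who think B is better try it; skeptics stick with A
        # and so generate no evidence about B at all.
        evidence = [
            (sum(rng.random() < true_p for _ in range(trials)), trials)
            for c in credences
            if c > 0.5
        ]
        # Complete network: everyone sees every result and updates by Bayes' rule.
        updated = []
        for c in credences:
            odds = c / (1.0 - c)
            for k, n in evidence:
                odds = min(odds * likelihood_ratio(k, n), 1e12)  # cap to avoid overflow
            p = odds / (1.0 + odds)
            updated.append(min(max(p, 1e-12), 1.0 - 1e-12))  # keep odds finite next round
        credences = updated
    return credences

if __name__ == "__main__":
    final = simulate()
    believers = sum(c > 0.99 for c in final)
    print(f"{believers}/{len(final)} agents ended up believing the true hypothesis")
```

Runs like this usually converge on the truth, but not always: if no agent starts out willing to test the better action, no evidence about it is ever generated and the group locks into the false belief permanently. On sparser networks, or with noisier evidence, such lock-in becomes more common, which is the kind of failure the authors study.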


O’Connor and Weatherall, who are married, are deft communicators of complex ideas. Our conversation ranged from the tobacco industry’s wiles to social media’s complicity in bad data. We discussed how science is subtly manipulated and how the public should make sense of contradictory studies. The science philosophers also had a sharp tip or two for science journalists.

Fact Checkers: “We’re philosophers of science and felt the manipulation of science is immediately relevant to our culture and really should be understood,” says James Weatherall (right), about why he and Cailin O’Connor (left) wrote The Misinformation Age.


What do you think of a commentator on a TV show with an audience of about 1.5 million people saying germs aren’t real?

Cailin O’Connor: [laughs] We disagree!

James Weatherall: We’re against it.


O’Connor: In fact, there’s a long history of people holding wacky false beliefs. People believed there were animal-plant hybrids—and these were naturalists. People have believed all sorts of crazy things about the human body. If you understand beliefs from this social perspective, where people pass them from person to person, and we have to trust each other and can’t verify things for ourselves, it’s not unexpected that we would have some wacky beliefs. But I don’t know about a person who says germs aren’t real in this day and age!

Weatherall: This is a perfect example of what we’re talking about. Acting as if germs don’t exist is going to lead to a lot of bad outcomes. You’re going to get sicker. You’re not going to treat surgical sites the right way. But it’s also something where you can’t really check yourself. Most of us don’t have microscopes to see germs. It’s the same with climate change. You can freely go around saying either the climate isn’t changing or that anthropogenic sources had nothing to do with it. Without getting any immediate feedback, without anything going wrong in your life, you can form these kinds of beliefs.

What inspired two philosophers of science to wade into misinformation?

O’Connor: I’ve been worried about climate change since I was 5 years old, and here we are 30 years later, still not doing anything about it. This is absolutely insane. It’s clear the marketplace of ideas isn’t working. We’ve allowed ourselves to be influenced by big oil and gas for over 30 years. But it was the 2016 election that prompted us. We started writing right after the election. We sat down and said, What can we do, given our research skills, to address this public crisis of false belief?


When it comes to misinformation, ’twas always thus. What’s changed now?

O’Connor: It’s always been the case that humans have been dependent on social ties to gain knowledge and belief. There’s been misinformation and propaganda for hundreds of years. If you’re a governing body, you have interests you’re trying to protect. You want to control what people believe. What’s changed is social media and the structure of communication between people. Now people have tremendous ability to shape who they interact with. Say you’re an anti-vaxxer. You find people online who are also anti-vaxxers and communicate with them rather than people who challenge your beliefs.

The other important thing is that this new structure means all sorts of influencers—the Russian government, various industry groups, other government groups—have direct access to people. They can communicate with people in a much more personal way. They can pose on Twitter and Facebook as a normal person you might want to interact with. If you look at Facebook in the lead-up to the 2016 election, the Russian Internet Research Agency created animal-lovers groups, Black Lives Matter groups, gun-rights groups, and anti-immigrant groups. They could build trust with people who would naturally be part of these groups. And once they had established that trust, they could influence them by getting them not to vote or by driving polarization, pushing rhetoric toward the extremes. They can make other people trust them in ways that would have been very difficult without social media.



Weatherall: People tend to trust their friends, their family, people they share other affinities with. So if a message can look like it’s coming from those people, it can be very effective. Another thing that’s become widespread is the ability to produce easily shareable visual media. The memes we see on Twitter or Facebook don’t really say anything; they conjure up an emotion—an emotion associated with an ideology or belief you might have. It’s a type of misinformation that supports your beliefs without ever explicitly asserting anything, true or false.

How does misinformation spread through science?

Weatherall: The philosopher of science Bennett Holman argues that the right way to think about the relationship between industry and science is as an arms race over epistemic standards. Do you want to get a drug approved? It’s got to be a randomized clinical trial. Previously there were lower standards for the evidence needed to demonstrate a drug’s efficacy and safety. But as scientists and regulators come up with new standards for dealing with possible misuse of evidence, groups who want to influence public belief or public policy come up with more sophisticated ways of getting around them.

What’s a good example of an industry using sophisticated techniques to manipulate science?


Weatherall: In the 1960s, the scientific consensus had become clear that there was a link between tobacco products and cancer. The tobacco industry recognized a number of things very quickly. One, this was almost certainly true—their own scientists were getting the same results. Two, it was disastrous for them. Three, they were not going to be able to come up with a compelling evidence-based argument that tobacco was safe or beneficial. But they realized they didn’t need to. All they needed to do was emphasize the uncertainty present in any kind of scientific endeavor. They made the case that the evidence wasn’t in yet and that it was rash to act. It was too soon for individuals to give up smoking, too soon for the government to intervene.

O’Connor: There’s this naive view that industry pays off scientists and the scientists start saying tobacco is safe. In fact, the industry used all these subtle and insidious methods that are not fraudulent and don’t subvert the norms of science. In the tobacco case, they went out and found all the studies where mice painted with tobacco tar didn’t get cancer. There were a bunch of those studies, done by independent scientists. The industry then shared all of those. That’s not fraudulent. They haven’t done anything wrong. But it’s misdirection.

Weatherall: The tobacco industry also funded good research into mesothelioma, the cancer caused by asbestos. They did this because they wanted to go into court and say, “Yes, these people have lung cancer, but there are other environmental factors besides cigarettes that could explain the rise of lung cancer over this period.”


O’Connor: Bennett Holman and Justin Bruner give a great example involving heart arrhythmias. When people were first studying anti-arrhythmic drugs, one question was, “Are these going to reduce heart attacks?” Other scientists asked, “Do they reduce arrhythmia?” Big Pharma funded the latter group. It poured money into scientists asking whether these drugs reduced arrhythmia. In fact, they did. But they also increased heart attacks and were responsible for upward of 100,000 premature deaths from heart attacks. So, again, independent researchers were doing exactly what they had been doing before. It was just that some of them now had a lot more money, and that shaped the evidence.

Weatherall: Whenever there’s an economic incentive to get people to believe something, you’re going to find organizations doing their best to get out the evidence that supports their case. But they may not think of themselves as propagandists. They may simply be engaging in the kind of motivated reasoning that all of us engage in. They’re finding the evidence that happens to support the beliefs they already have. They want whatever it is that they believe to be true. They don’t want to feel like they’re bad people. They’re trying to get the best information out there.

O’Connor: Well, in some cases, they’re more cynical than that.

Weatherall: In some cases, they’re more cynical. I don’t mean to say that they’re all just fine. I just want to emphasize that it can be subtle.


O’Connor: One of the things we recognize, coming from this philosophical perspective, is that scientists are people too. Of course scientists are fallible, and of course scientists have political and social beliefs. But that’s normal. Everyone has beliefs. The problem is that industry has weaponized what’s normal to its advantage. For example, an ex-manager at DuPont accused the scientists working on CFCs and the ozone hole of not being objective because they had political motives. Well, yes, they had a political motive: to protect us from ultraviolet radiation. That was used against them but in no way undermined the actual evidence they were gathering.

Weatherall: Another thing weaponized along similar lines is the fact that scientists disagree. They ought to disagree. If they weren’t criticizing one another and disagreeing with one another, we wouldn’t have the grounds to trust the results of science the way we do. But in cases where it looks as if scientists are disagreeing, it’s very easy for someone to say the jury is still out, or the evidence isn’t clear. What often happens is that debates in the scientific literature, in peer-reviewed journals, get settled. But then the debate moves to the newspapers and gets explored on op-ed pages, sometimes written by the scientist who’s doing the disagreeing. There’s an illegitimacy to that. It doesn’t reflect sincere differences between people who are treating the evidence in the same way; it reflects a person who is no longer producing work that can meaningfully convince their peers of anything. So now they’re trying to convince people who are less equipped to evaluate it.


So should the public be skeptical of scientists making their case in op-ed pages?


O’Connor: Not necessarily. In a lot of cases, scientists communicate with the public, and that can be a really good thing. What the public should be skeptical of are scientists who seem to be trying to argue for things in op-ed pages that they’re no longer able to publish in real journals. They should also be skeptical of scientists from some other field publishing an op-ed about a field they’re not part of.

Public trust in science wavers because of competing studies. One day coffee is good for you, the next it’s not. How should the public know what studies to trust?

O’Connor: If you’re a consumer, you should be looking for scientific articles that aren’t one-offs but instead synthesize data from many studies. The same goes for journalists. There’s tremendous incentive to publish things that are surprising or novel because that’s how you get likes and clicks. But the standard shouldn’t be popular articles about individual studies. When you’re writing about a topic, the piece ought to draw on a combination of good studies that show how the science has been progressing. That will give a much less misleading picture of the science.

Weatherall: We should say the incentive to publish surprising or novel studies applies to scientists too. They’re probably less interested in likes, but it’s how you get citations.


O’Connor: Right, there’s a huge novelty bias. When you look at social media, people share fake news more because it’s novel and exciting. They also share studies that fail to replicate much more than studies that do replicate, probably because those studies are more surprising, right? So resisting findings that seem shocking, weird, or novel is something that can maybe protect you from adopting a false scientific belief.

You write cultural beliefs often shape the problems that scientists work on. What’s a good example?

O’Connor: I teach a class on how gender values move into biology. In the 1970s, people did studies on the hormones of menopausal women, but they excluded from the study any women who worked outside the home. The assumption was they must have abnormal hormones if they were working outside the home. They must be “man-women” or something. So there you go. Cultural beliefs, which now seem kind of wacky, then seemed not so unreasonable, and influenced science.

Weatherall: In fact, there are cases where cultural beliefs affect whole communities of scientists over a long period of time. So it’s interesting to reflect on how that changes. And it invariably changes because the community changes, and sometimes it changes just because old people die, and younger scientists come in and realize, “Hold on, why are we assuming this?” And they make their career by criticizing something that used to be widely held and show that it was wrong. In other cases, things change because the community of scientists diversifies. For instance, and tell me if this is wrong, Cailin, more women started working in a field.


O’Connor: That’s true in many cases. There’s a famous example in primatology. Early science on primate social behavior focused largely on the behavior of male primates, especially aggression in social hierarchies. When women grad students started moving into the field, they focused on the behavior of female primates. That diversity revolutionized the study of primate behavior.

What can scientists do to prevent their work from being propagandized?

O’Connor: That’s really tricky, because a lot of it is out of their hands. Once you produce something, people can use it however they want. But what needs to happen is a big-scale change: Industry has to stop being able to choose whom it funds. As long as industry controls who gets funded, even if it doesn’t corrupt the scientists, it can corrupt the science.



How should science be funded?

O’Connor: Through the government or some kind of body held to very high standards of not being influenced by industry.

Weatherall: I think there’s a case to be made for a tax on industries that would otherwise be contributing money to scientific research. They recognize the importance of science for their products. So you might take the money they would be spending and redirect it to an organization that selects who gets funded independently.

There was a recent study of Wikipedia showing that the most accurate, highest-quality articles were produced by ideologically heterogeneous, diverse sets of editors and writers. Does that square with your findings?


Weatherall: Yes, it’s consistent with the idea that science is best understood as a process that benefits from diversity. There’s another side to that, though. In a marketplace of ideas, which means a lot to us culturally, we think there’s nothing morally problematic about having whatever opinions you have and expressing them. We tend to think that’s OK because true things are going to win out in time. In fact, they often don’t. If someone is monopolizing information flow and interfering with what kind of information gets out there, that’s going to affect the efficiency of the marketplace of ideas. That’s what influencers, propagandists, and industrial groups are doing. Ideas aren’t spreading properly from one community to another. So we get enclaves. This is what polarization looks like—a failure of reliable beliefs to spread from one community into another.

O’Connor: Because the marketplace of ideas doesn’t work, we often end up voting on whether a matter of fact is true. We vote for someone who doesn’t believe in climate change and then act as if climate change isn’t real. That vote doesn’t change whether it’s true, and it doesn’t change whether we’re going to face the consequences of climate change. So the problem here is that matters of fact shouldn’t be settled by public vote. They should be settled by gathering evidence and feeding that evidence into our best tools for figuring out what’s true.

What are the best tools for good information?

O’Connor: Maybe we should have something like a ministry of information to decide what’s true.


Weatherall: I had a fascinating conversation with policymakers in the European Union about their ability to engage critically with science. What they said was, Look, we agree that a certain kind of critical reasoning is essential to having true and reliable beliefs. Unfortunately, we’re elected to represent particular groups and particular interests, and so we don’t get to question certain assumptions, because our constituents don’t question those assumptions and wouldn’t vote for us. We wouldn’t be doing our representative job if we questioned those assumptions.

Isn’t that a cop-out?

Weatherall: Yes, but let’s look at our institutions. Look at the way that they’re failing. I think we could still have democracy with institutions that are better engineered, that are developed in response to the ways in which our current institutions are failing. We have some states that have direct voting on referenda and ballot measures. We need to find democratic institutions that are sufficiently representative, that are responsive to citizens but aren’t simply aggregating the opinions and beliefs of the large group.

How can democratic institutions avoid aggregating the beliefs of the large group?


O’Connor: The thing we suggest, though who knows how you implement this, is having people vote on the things they value. Say I value public safety. Or I value environmentalism. Or I value freedom from government intervention. So you’re voting on the kinds of things you prefer to have in your society rather than voting on actual matters of fact. Then your government should be implementing your values to create a better society, given the things that you want, but using the best evidence and facts to do that.

How can we intervene in social networks to direct people toward truth and facts?

O’Connor: All social media sites should be employing teams to fight active misinformation and disinformation. Those teams should constantly adapt to whatever the new sources of misinformation from Russia or industry are, and try to fight them. On an individual level, it’s trickier. People just don’t trust others who hold different beliefs. But from a broader perspective, there are things that could be effective. What could reach a vaccine skeptic is somebody who shares other beliefs and a sense of identity with them. Somebody who can say, Look, I understand why you feel afraid of vaccines or why you’re skeptical of them. Here are the ways I am like you, and I understand your skepticism. Having built that common ground, here are the reasons why I changed my mind, and you could, too.

Can systemic changes really overturn false beliefs?


O’Connor: Of course, we’re always going to have some false beliefs, because we’re social learners and it’s easy for false beliefs to propagate from person to person. But that doesn’t mean we’re always going to have the same degree of false belief. If you look at cultural evolution, we’ve developed cultural systems that help us do better with our brains. We’ve developed amazing learning systems that help little kids learn more effectively than in the past. We can likewise develop systems that allow us to do the best we can with the brains that we’ve got. It’s not that we should just give up as hopeless. If we have some sort of regulation about what sorts of news people can publish, for example, we can protect ourselves from misinformation.

Weatherall: We can learn what sorts of things we can trust, what sorts of things are reliable. We have to hope that we’re going to become more successful, more effective, more sophisticated about responding to misinformation that’s spread online. I think there’s evidence this is happening. Fake news is shared more often by older people than by younger people. Deliberate misinformation is shared much more often by older people. There are a lot of possible explanations for that. One has to do with media sophistication. Younger people are more native to digital media; they’re better at navigating it.

O’Connor: Younger people are more savvy about identifying fake news, able to look at different aspects of some website and say, Oh, this probably isn’t real, and so are less likely to share it.


Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @BSGallagher.

Kevin Berger is Nautilus’ editor.
