This post is the final instalment of a four-part series on polarised discussions in science and how to deal with misinformation. You can find Part 1 introducing the topic here, Part 2 on climate change here, and Part 3 on evolutionary biology here.
In the preceding two posts we gave a very brief survey of two areas that are the subject of intense public debate, and that see a great deal of distortion or outright denial of factual knowledge to fit preconceived ideas. The problem is not limited to these areas, however: we currently find ourselves amidst a storm of misinformation, fake news and alternative facts. In this final post, we draw attention to a number of recent books that will help readers think more clearly, logically and rationally, and give them the tools to see through spin and hyperbole.
Several prominent sceptics have written accessible books on a wide range of pseudoscientific ideas, such as Skeptic: Viewing the World with a Rational Eye (Shermer, 2016), Nonsense on Stilts: How to Tell Science from Bunk (Pigliucci, 2010), and Bad Science (Goldacre, 2008). In recent years, however, reason seems to have been increasingly abandoned.
Part of the problem is that, as alluded to in the post on anthropogenic climate change, a lot of scientific research is funded by groups with particular interests, which can lead to flawed results when those funders already have in mind what they want the science to show. This is discussed at length in Tainted: How Philosophy of Science Can Expose Bad Science (Shrader-Frechette, 2016). Worse still is when such groups purposefully create the appearance of controversy to confuse and mislead the public and protect industry interests, such as the decades-long campaign by the tobacco industry to create the impression that there was no scientific consensus on the harmful effects of smoking. David Harker has written the first book-length analysis of this phenomenon, Creating Scientific Controversies: Uncertainty and Bias in Science and Society (2015), which should help readers understand and evaluate such cases, and respond to them. Politicians are no less guilty of this, as Dave Levitan argues in Not a Scientist: How Politicians Mistake, Misrepresent, and Utterly Mangle Science (2017).
According to books such as The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters (Nichols, 2017) and Respecting Truth: Willful Ignorance in the Internet Age (McIntyre, 2015), another part of the problem is the internet. In the opinion of these authors, easy access to information and egalitarian platforms such as weblogs, where everyone can have their say, are among the factors that have bred a generation of opinionated, poorly informed people who think they know enough about a topic after a quick skim of Wikipedia. This is accompanied by an undercurrent of feeling that expertise is synonymous with elitism, leading to distrust of any form of authority. In his pithy book Are We All Scientific Experts Now? (2014), Harry Collins provocatively puts forth the notion that not everyone's opinion counts equally. Or, as Robert Dorit wrote in American Scientist in 1997 when reviewing Darwin's Black Box, '[…] opinions should not be mistaken for expertise'.
As Julian Baggini explains in The Edge of Reason: A Rational Skeptic in an Irrational World (2016), this is not about stifling dissenters or stamping out opposition. Science thrives on scepticism and reasonable debate. But the key word here is reasonable. The current wave of anti-expertise sentiment is not just attacking scientific knowledge; it is attacking the very framework that generates that knowledge. As Michael Specter put it in his 2010 TED Talk The Danger of Science Denial, 'you are entitled to your own opinion, but you are not entitled to your own facts'. And, as Prothero argues in Reality Check: How Science Deniers Threaten Our Future (2013), this matters to society at large. Whether we are talking about addressing climate change or the return of nearly eradicated diseases because more and more people refuse to vaccinate their children, the ill-informed opinions of some can affect us all, especially once they enter the voting booth.
We believe this gives us a responsibility, as academics, educators and librarians, to speak out, to communicate why what we do matters, and to teach critical thinking. That is what makes recent books such as Critical Thinking: Tools for Evaluating Research (Nardi, 2017), Making Sense of Science: Separating Substance from Spin (Dean, 2017), A Survival Guide to the Misinformation Age: Scientific Habits of Mind (Helfand, 2016), and Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking (Kida, 2006) so important. It will also require us to become excellent communicators: the media likes to simplify things and deal in snappy sound bites, whereas scientists have to communicate complicated ideas that carry great degrees of uncertainty. And, as many of the interviewees in Olson's documentary Flock of Dodos agreed in its conclusion, with some notable exceptions scientists at large are poor communicators. Am I Making Myself Clear?: A Scientist's Guide to Talking to the Public (Dean, 2009) could well be considered an essential part of the academic toolkit. But, as Jo Fidgen concludes around the 38-minute mark of the BBC Radio 4 podcast we referred to in our opening paragraph, 'cold facts are not enough, they are much more convincing when they are part of a story'. So add Houston, We Have a Narrative: Why Science Needs Story (Olson, 2015) to your toolkit as well.
To end on a sober note, we must not forget that science is a human endeavour, and as such is prone to all the failures and follies of man. In our search for a deeper understanding of the world around us we stumble, we falter, and we fail (as a side note, this is not all bad, but a necessary part of scientific progress, as Stuart Firestein lays out in Failure: Why Science is So Successful (2015)). Worrying, too, is the 2015 Science paper reporting that many published research findings cannot be replicated (though see this follow-up critique, and a rebuttal of that critique). And although that paper dealt specifically with psychology research, a commentary in New Scientist highlighted how other disciplines suffer from the same problem, something explored in more depth in Stepping in the Same River Twice: Replication in Biological Research (Shavit & Ellison, 2017). But none of this is a reason to discard the scientific process. Science may have its failings, but science can also fix them.