Baloney Detection Kit

In the news we often hear stories of ‘scientific studies’ that make claims such as ‘the most addictive food is pizza’ or ‘the MMR vaccine causes autism’ [1,2].  Those same news outlets will also report on studies that have found new evidence of climate change, or on the effects of cannabis on opioid deaths [3]. Often the same studies are discussed across multiple news outlets regardless of the credibility of the study.  When credible news sources share less credible information, how do we separate the wheat from the chaff? In his bestselling book The Demon-Haunted World: Science as a Candle in the Dark, Carl Sagan presents what he calls the ‘Fine Art of Baloney Detection’ [4].  The kit contains a set of ideas that can be employed when examining information to determine how credible it is.  The following bullets are Sagan’s own words, each followed by my analysis:

• Wherever possible there must be independent confirmation of the 'facts'.


This is done through the peer-review process.  Having multiple groups of people come to the same conclusion and verify each other’s research methods helps reduce confirmation bias and provides additional insight into flaws in the data that may have been missed.  We can follow a similar process at home by searching for independent sources of information when trying to verify a claim.  If all the information stems from a single source and no other independent sources can be found, be skeptical.

• Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.


It is nearly unheard of for a genuine scientific breakthrough to be met with immediate enthusiasm from the majority of scientists in the relevant field.  The heliocentric solar system, continental drift, genetic inheritance, and many other ideas were initially rejected or ignored before being accepted as true [5].  Scientists are human and have a tendency to want to be right. When science is properly debated, the resulting scientific argument should change the minds of scientists who did not initially believe the study’s conclusion.


• Arguments from authority carry little weight - 'authorities' have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.


The difference between an authority and an expert is often subtle, and it has less to do with the person in question and more to do with what they are saying and how we analyze that information.  An authority argues that because they have studied something, it must be correct. This is the argument-from-authority fallacy, and even experts can fall victim to it.  An expert is simply someone with proven experience in the subject they are talking about.  The title of expert is given as a way to narrow down whom we pay attention to. Everyone has opinions, but we don’t have time to listen to them all, especially when they come from someone without experience studying the relevant material.  Experts are those who have shown that they deserve our attention on pertinent matters. That is not to say that all experts in a given field agree on everything; on the contrary, experts tend to disagree on quite a bit. These disagreements are worthy of our attention and critical examination.

• Spin more than one hypothesis. If there's something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among 'multiple working hypotheses', has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.* * This is a problem that affects jury trials. Retrospective studies show that some jurors make up their minds very early - perhaps during opening arguments - and then retain the evidence that seems to support their initial impressions and reject the contrary evidence. The method of alternative working hypotheses is not running in their heads.


This is the method of falsification discussed in the article 'Scientific Argumentation'.  Rather than prove a single hypothesis correct, disprove as many competing hypotheses as possible.  This is how we progress toward an emergent truth.


• Try not to get overly attached to a hypothesis just because it's yours. It's only a way-station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don't, others will.


Be self-aware in order to avoid confirmation bias.  Being objective when analyzing your own beliefs is critical and requires a certain level of intellectual humility (one of the Laws of Critical Thinking).  If a conclusion is reached prematurely because those involved had a personal attachment to it rather than supporting evidence, eventually another scientist or group of scientists will find the fault and correct it.  That is the idea behind the ‘error-correcting mechanism of science’.

• Quantify. If whatever it is you're explaining has some measure, some numerical quantity attached to it, you'll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.


I work as an analytical food chemist.  Most of my day is spent quantifying the chemical composition of various solutions.  If a sensory analyst tastes a flavor that shouldn’t be there, that is a very subjective experience; another taster may not detect the off-flavor at all. Being able to quantify chemical composition helps give merit to one hypothesis over the other (that there is or isn’t an off-flavor).
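The idea above can be sketched in a few lines of code. This is a minimal illustration, not a real assay: the compound, its concentration, and the flavor-detection threshold are all hypothetical numbers chosen to show how a measurement can discriminate between two competing hypotheses where subjective reports conflict.

```python
# Two hypotheses: "there is an off-flavor" vs. "there isn't".
# Subjective taste reports conflict, so they cannot discriminate:
subjective_reports = ["off-flavor", "fine", "fine", "off-flavor"]

# Assumed flavor-detection threshold for a hypothetical compound (mg/L):
# below this level, tasters cannot reliably perceive it.
DETECTION_THRESHOLD_MG_L = 0.5

def off_flavor_supported(concentration_mg_l, threshold_mg_l):
    """Quantitative check: is the compound present above the
    level at which people can actually taste it?"""
    return concentration_mg_l >= threshold_mg_l

measured_mg_l = 0.8  # hypothetical measured concentration

print(off_flavor_supported(measured_mg_l, DETECTION_THRESHOLD_MG_L))  # True
```

A number like 0.8 mg/L against a 0.5 mg/L threshold settles the question in a way that four conflicting taste reports never could; that is the point of quantifying.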

• If there's a chain of argument, every link in the chain must work (including the premise) - not just most of them.


Scientific explanations need to follow either an inductive or a deductive structure.  If an explanation follows neither, it leaves no reasonable path for analysis. Furthermore, logical relations such as the transitive property must be respected if we are to trust our conclusion. For example, consider the following argument:

P1: Michael is taller than Kevin.

P2: Kevin is taller than Jake.

C: Therefore, Michael is taller than Jake.


In this argument, both premises are necessary to support the conclusion.  Remove either premise and the conclusion no longer follows.
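The chain above can be checked mechanically. This sketch uses hypothetical heights (in cm) for Michael, Kevin, and Jake: whenever both premises hold, transitivity guarantees the conclusion, but a single broken link (P2 false) leaves the conclusion unsupported.

```python
import random

def premises_and_conclusion(michael, kevin, jake):
    """Evaluate the argument's links for given (hypothetical) heights."""
    p1 = michael > kevin  # P1: Michael is taller than Kevin
    p2 = kevin > jake     # P2: Kevin is taller than Jake
    c = michael > jake    # C:  Michael is taller than Jake
    return p1, p2, c

# Transitivity: over many random height assignments, whenever BOTH
# premises hold, the conclusion must also hold.
for _ in range(1000):
    m, k, j = (random.randint(150, 200) for _ in range(3))
    p1, p2, c = premises_and_conclusion(m, k, j)
    if p1 and p2:
        assert c  # every link works, so the chain goes through

# Break one link: P1 holds but P2 fails, and the conclusion no
# longer follows (here it is actually false).
print(premises_and_conclusion(180, 170, 190))  # (True, False, False)
```

Note that a broken link does not prove the conclusion false; it only means this argument no longer establishes it.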


• Occam's Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.


Occam’s Razor is often misunderstood because of the word ‘simpler’.  A more formal statement is that one should not make more assumptions than the minimum needed [6].  For more information on Occam’s Razor and other philosophical razors, refer to Philosophical Razors.


• Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle - an electron, say - in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result. 


There are many more razors than just Occam’s Razor.  One of my personal favorites is affectionately called ‘Newton’s Flaming Laser Sword’, or ‘Alder’s Razor’: the idea that what cannot be settled by experiment is not worth debating. Alder’s criticism was aimed at attempts to tackle purely philosophical problems scientifically.  For example, his response to the question, “what happens when an unstoppable force meets an immovable object?” was, roughly, that either the immovable object moves or it doesn’t, but there is no point in debating it if we can never observe such a phenomenon [7].



Final Thoughts


The Baloney Detection Kit is a great place to start when learning to become a critical thinker.  It provides a solid set of analytical tools to add to your thinking belt. For more analytical tools, check out Intelligent Speculation’s other articles on critical thinking.  Remember, as critical thinkers it is just as important to examine our own beliefs as it is to question the illogical beliefs of others.

“If you want to change the world, start with yourself.” - Mohandas Gandhi


***If you are interested in purchasing The Demon-Haunted World: Science as a Candle in the Dark, head over to our book store here.


References

  1. Glass, Jeremy. “Science Proves Pizza is the Most Addictive Food.” Thrillist, 2015.

  2. Weiss, Chris. “5 Bad Scientific Studies That Fooled Millions.” Mic, 2013.

  3. Shover, Chelsea L., et al. “Association between medical cannabis laws and opioid overdose mortality has reversed over time.” Proceedings of the National Academy of Sciences of the United States of America, 2019.

  4. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. Headline Book Publishing, London, 1996.

  5. The Doc. “Famous Scientists: The Art of Genius.”

  6. Heylighen, F. “Occam’s Razor.” Principia Cybernetica, 1995.

  7. “Mike Alder.” Wikipedia.