Effacing the Preface Paradox - by Kenny Collins

07/18/2022

The Preface Paradox goes as follows:

An author has written a book wherein he makes many claims. He has read over the book many times, sent each idea to reviewers/editors, and cross-referenced everything with existing research. Thus, he has reason to believe that all the claims he makes in the book are true.

However, he knows (from experience or otherwise) that in spite of his hard work and rigorous examination, there are likely still errors in the book. Thus, he has reason to believe that not all the claims in his book are true (or in other words, that at least one of the claims in his book is false).

And thus, he includes a preface at the beginning of the book with a message acknowledging this (e.g. "There are likely some errors in this book and they are my sole responsibility.").

This sounds pretty normal - you can probably find or think of a book that has this kind of preface or addendum regarding the accuracy of its claims. However, maybe you noticed something strange about the above scenario.

The author clearly went through a lot of effort to verify that the claims he wrote were true. If he believed a claim were wrong in some way, he simply would not have included it. (Assuming a good-faith author.) So how can he simultaneously include the preface expressing his belief that there are errors in the book?

There's good reason behind that belief as well - tons of books include errors, and with a big enough book, it can be hard for the author to fully verify the accuracy of every claim. But again - he does believe in the veracity of each claim. We could go through the book asking him "Do you believe in this claim?" and of course he would answer "Yes" each time. Then we could ask "Do you believe that at least one of your claims is wrong?", and he might say "Yes" as well, which seems to create an illogical result.

We can extend this scenario to our own beliefs, treating our brain as the aforementioned book. We trust in many ideas in our head - general knowledge, historical facts, memorized song lyrics, domain/field/career-specific information, etc. And yet, if someone asked you "Do you think everything you believe is correct?", it seems natural to say no, which recreates the contradiction. Why would you believe something you don't think is correct? You wouldn't. So why would you believe that anything you believe is incorrect?

Ponder the paradox a bit before reading the explanation.


One way to explain away this scenario is with a slight change in our thinking.

Previously, we were assuming that any belief could be assigned a dichotomous value judgment - Yes/No, True/False, 100%/0% sure, and so on. Instead, let's consider that beliefs can be held with different degrees of certainty. You might be totally sure that you can hold your breath for 5 seconds, but not so sure whether you can hold your breath for 45 seconds. Hence, we can assign probabilities to these beliefs (e.g. "I am 75% sure that..."). In fact, it makes sense to never hold any belief with 100% certainty, since almost nothing can be verified beyond all possible doubt. And for simplicity, we can assume that certainties above 50% map to Yes/True, and certainties below 50% map to No/False.
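To make that mapping concrete, here's a minimal sketch in Python. The >50% threshold is the simplifying assumption from above, and the certainty values are just illustrative numbers:

```python
def verdict(certainty: float) -> str:
    """Map a degree of certainty in [0, 1] to a dichotomous answer,
    using the simple >50% threshold assumed above."""
    return "Yes" if certainty > 0.5 else "No"

print(verdict(0.99))  # Yes - "I can hold my breath for 5 seconds"
print(verdict(0.25))  # No  - "I can hold my breath for 45 seconds"
```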

What do these probabilities mean? Well, based on the information and knowledge we hold, that probability measures how likely it is that the claim is correct. For example, if you think Team A is going to win a sports game, that belief will likely become stronger if you see them score a goal. What's important is that the probability can change simply because your perspective or information changes.
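To make "belief strengthens with new information" concrete, here's a small sketch using Bayes' rule. All the numbers are invented purely for illustration: a 60% prior that Team A wins, and a judgment that the goal is twice as likely in the worlds where they go on to win.

```python
def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the updated probability of a claim after seeing evidence,
    via Bayes' rule: P(claim | evidence)."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Hypothetical numbers: 60% prior that Team A wins; the goal is judged
# twice as likely if they are going to win (0.6) than if not (0.3).
print(bayes_update(0.60, 0.6, 0.3))  # 0.75 - the belief got stronger
```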

Once we shift to this model, some basic probability resolves the paradox. Let's say the author makes just 15 claims in the book, and that he's 95% certain about each individual one being true. If you went up to him and asked "Do you think your first idea is true?", he'd most likely say "Yes" with a high degree (95%) of certainty. However, assuming for simplicity that the claims are independent, the probability that all 15 ideas are true is (95%)¹⁵ = (0.95)¹⁵ ≈ 0.463 ≈ 46%. That maps to No, which is why he includes the preface (and believes that there is an error in the book somewhere).

The more the author verifies each individual claim, the more certain he becomes of it. If he reads over his ideas many times and consults other editors and writers, maybe his certainty in each idea gets bumped from 95% to 99%. That way, after 15 ideas, he can still have (99%)¹⁵ = 0.99¹⁵ ≈ 86% confidence that all the ideas in his book are correct.

And the more claims the author includes in his book, the less certain he can be about the overall book being totally correct. If he had 30 ideas in his book instead (each with 95% certainty), he would only have (95%)³⁰ = 0.95³⁰ ≈ 21% certainty that his book was error-free.
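Here's the arithmetic from the last three paragraphs in a few lines of Python. The key simplifying assumption, as above, is that each claim is true independently of the others:

```python
def prob_all_true(certainty: float, num_claims: int) -> float:
    """Probability that every claim is true, assuming each claim is
    independently true with the given certainty."""
    return certainty ** num_claims

print(prob_all_true(0.95, 15))  # ~0.463 -> maps to No, hence the preface
print(prob_all_true(0.99, 15))  # ~0.860 -> more verification per claim helps
print(prob_all_true(0.95, 30))  # ~0.215 -> more claims, less overall certainty
```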

The paradox is now resolved, and all it took was considering the concept of belief certainty a little more deeply. Instead of it being dichotomous (0% or 100%), there are infinitely many degrees to it (e.g. 25%, 50%, 75%, etc.). Like most concepts, the answer lies on a spectrum.

Note that earlier we made the assumption that beliefs held with more than 50% certainty map to Yes/True, and the rest map to No/False (in terms of belief, or of acting on those beliefs). Of course, in real life, it is more complex than that. Since you are often using these values to decide on what actions to perform, the actual probabilities need to be weighed against the risk/reward associated with the events, as well as the cost of acting on them. The chance of you dying in a car crash is low, but that doesn't mean you consider it a 0% chance and decide to not wear a seatbelt and drive blindfolded. The chance of an author's book being entirely correct might be over 50%, but he might include the preface anyway to preserve his reputation on the slim chance there is an error.
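As a rough sketch of that probability-versus-stakes tradeoff (every number here is invented purely for illustration):

```python
def expected_cost_of_skipping(prob_event: float, cost_if_event: float) -> float:
    """Expected cost of skipping a precaution: probability of the bad
    event times how costly it would be if it happened."""
    return prob_event * cost_if_event

# Hypothetical numbers: a 0.1% chance of a serious crash on a given trip,
# with a cost of 1,000,000 (in whatever units), versus a seatbelt that
# "costs" 1 unit of effort to buckle.
print(expected_cost_of_skipping(0.001, 1_000_000))  # 1000.0 >> 1, so buckle up
```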