From The Signal and the Noise: Why So Many Predictions Fail-but Some Don’t by Nate Silver:
"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge: the serenity to accept the things we cannot predict, the courage to predict the things we can, and the wisdom to know the difference."
A clever formulation of the Serenity Prayer. There is a classic Mark Twain aphorism, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” Ain’t that the truth:
"One of the pervasive risks that we face in the information age, as I wrote in the introduction, is that even if the amount of knowledge in the world is increasing, the gap between what we know and what we think we know may be widening."
[…]
“As the statistician George E. P. Box wrote, “All models are wrong, but some models are useful.” What he meant by that is that all models are simplifications of the universe, as they must necessarily be. As another mathematician said, “The best model of a cat is a cat.”…The key is in remembering that a model is a tool to help us understand the complexities of the universe, and never a substitute for the universe itself.”
The model is a tool, not a reality. If only we were better at remembering that most metrics and dashboards are proxies for things we care about and not the actual things we care about.
We are always filtering. The more inbound, the more we rely on crude and deforming filters.
"Meanwhile, exposure to so many new ideas was producing mass confusion. The amount of information was increasing much more rapidly than our understanding of what to do with it, or our ability to differentiate the useful information from the mistruths. Paradoxically, the result of having so much more shared knowledge was increasing isolation along national and religious lines. The instinctual shortcut that we take when we have “too much information” is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest."
We are also, as Daniel Kahneman argued, always trying to construct causal explanations of the world. We understand life as “the steady flow of hindsight in the valley of the normal.”
“The most calamitous failures of prediction usually have a lot in common. We focus on those signals that tell a story about the world as we would like it to be, not how it really is. We ignore the risks that are hardest to measure, even when they pose the greatest threats to our well-being. We make approximations and assumptions about the world that are much cruder than we realize. We abhor uncertainty, even when it is an irreducible part of the problem we are trying to solve.”
We see this failure in Goodhart's law and in Taleb's black swans. Conditional probability, when used appropriately, remains one of our best tools:
“Bayes’s theorem deals with epistemological uncertainty—the limits of our knowledge…What isn’t acceptable under Bayes’s theorem is to pretend that you don’t have any prior beliefs. You should work to reduce your biases, but to say you have none is a sign that you have many. To state your beliefs up front—to say “Here’s where I’m coming from”—is a way to operate in good faith and to recognize that you perceive reality through a subjective filter.”
Bayesian reasoning is a huge part of making good decisions as a doctor.
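The diagnostic case makes Bayes's theorem concrete: start with a prior (disease prevalence), update on evidence (a test result), and get a posterior. A minimal sketch, with hypothetical numbers chosen only for illustration:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test), via Bayes's theorem."""
    true_pos = sensitivity * prior                     # P(+ | disease) * P(disease)
    false_pos = false_positive_rate * (1 - prior)      # P(+ | healthy) * P(healthy)
    return true_pos / (true_pos + false_pos)

# Hypothetical values: 1% prevalence, 90% sensitivity, 9% false-positive rate.
p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09)
print(round(p, 3))  # ~0.092
```

Even a fairly accurate test leaves the posterior near 9% here, because the prior is so low. That is the Bayesian point of the quote above: you always have a prior, and pretending otherwise just hides it.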