Definition
Bayes' theorem is a mathematical formula that describes how to update the probability of a hypothesis when new evidence becomes available. It connects prior knowledge with observed data to produce a revised probability.
How It Works
You start with an initial belief (the prior probability), observe new evidence, and then calculate an updated belief (the posterior probability). In symbols: P(H|E) = P(E|H) x P(H) / P(E), where H is the hypothesis and E is the evidence.
Worked example: a disease affects 1% of the population. A test for the disease is 90% accurate in both directions: it correctly flags 90% of sick people (sensitivity) and correctly clears 90% of healthy people (specificity).
You test positive. What is the probability you actually have the disease?
Using Bayes' theorem: P(sick|positive) = P(positive|sick) x P(sick) / P(positive) = (0.90 x 0.01) / ((0.90 x 0.01) + (0.10 x 0.99)) = 0.009 / 0.108 = about 8.3%. The denominator sums the true positives (sick people correctly flagged) and the false positives (healthy people incorrectly flagged).
Despite the "90% accurate" test, there is only an 8.3% chance you are actually sick. The low base rate (1%) means most positives are false positives.
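The arithmetic above can be checked with a few lines of code. This is a minimal sketch; the function and variable names are illustrative, not from any library.

```python
def posterior(prior, sensitivity, specificity):
    """Probability of having the disease given a positive test (Bayes' theorem)."""
    true_pos = sensitivity * prior                # P(positive | sick) * P(sick)
    false_pos = (1 - specificity) * (1 - prior)   # P(positive | healthy) * P(healthy)
    return true_pos / (true_pos + false_pos)

# 1% prevalence, 90% sensitivity, 90% specificity
p = posterior(prior=0.01, sensitivity=0.90, specificity=0.90)
print(f"{p:.1%}")  # → 8.3%
```

Note that the denominator is just the total probability of testing positive, sick or not, which matches the hand calculation of 0.009 / 0.108.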
Why It Matters
Bayes' theorem is the foundation of Bayesian statistics, an entire branch of statistical thinking. It powers email spam filters, medical diagnostic tools, recommendation systems, and machine learning algorithms.
The theorem also reveals a critical insight: the accuracy of a test is not enough to interpret results. You must also consider how common the condition is (the base rate). Ignoring base rates is one of the most common reasoning errors people make.
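The base-rate effect can be seen numerically by sweeping the prevalence while holding the test's accuracy fixed at 90%. This is a sketch under those assumptions; the helper function is hypothetical.

```python
def posterior(prior, sensitivity=0.90, specificity=0.90):
    # P(sick | positive) via Bayes' theorem, for a fixed 90%-accurate test
    tp = sensitivity * prior
    fp = (1 - specificity) * (1 - prior)
    return tp / (tp + fp)

for prevalence in (0.50, 0.10, 0.01, 0.001):
    print(f"base rate {prevalence:>6.1%} -> P(sick | positive) = {posterior(prevalence):.1%}")
```

With the same test, a positive result means a 90% chance of disease at 50% prevalence, but under 1% at a 0.1% prevalence: the test did not change, only the base rate did.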
Bayes' theorem tells you how to update probabilities with new evidence. Always consider the base rate: a "highly accurate" test can still produce mostly false positives for a rare condition.