What Is Variance in Statistics?

Definition

Variance is a statistical measure that quantifies how spread out a set of values is around its mean. It is calculated by averaging the squared differences between each value and the mean. The larger the variance, the more spread out the data.

How to Calculate Variance

Find the mean, subtract it from each value, square each result, and then average those squared differences. (Dividing by the number of values n gives the population variance; with sample data, dividing by n - 1 is the common convention.)

Example

Daily temperatures over five days (in Celsius): 20, 22, 19, 21, 23

Mean: (20 + 22 + 19 + 21 + 23) / 5 = 21

Squared differences: (20-21)^2 + (22-21)^2 + (19-21)^2 + (21-21)^2 + (23-21)^2 = 1 + 1 + 4 + 0 + 4 = 10

Variance: 10 / 5 = 2
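The calculation above can be sketched in Python (a minimal illustration; the variable names are my own, and the standard library's pvariance is used only as a cross-check):

```python
import statistics

# Daily temperatures in Celsius (the example data above)
temps = [20, 22, 19, 21, 23]

# Step 1: find the mean
mean = sum(temps) / len(temps)  # (20 + 22 + 19 + 21 + 23) / 5 = 21.0

# Steps 2-3: subtract the mean from each value and square the result
squared_diffs = [(t - mean) ** 2 for t in temps]  # [1.0, 1.0, 4.0, 0.0, 4.0]

# Step 4: average the squared differences
variance = sum(squared_diffs) / len(temps)  # 10 / 5 = 2.0

# The standard library's population variance agrees
assert variance == statistics.pvariance(temps)
print(variance)  # 2.0
```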

Why It Matters

Variance is a building block for many statistical methods. It is central to ANOVA (analysis of variance), regression, and portfolio theory in finance. Understanding variance helps you quantify risk, assess consistency, and compare the reliability of different processes.

In practice, you will often see standard deviation used for reporting because it is easier to interpret. But behind the scenes, variance is doing the mathematical heavy lifting. Many formulas in advanced statistics work with variance directly because squared values have convenient mathematical properties.
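One practical wrinkle worth knowing: Python's statistics module exposes both conventions, dividing by n for the population variance and by n - 1 for the sample variance. A quick sketch, reusing the example data:

```python
import statistics

temps = [20, 22, 19, 21, 23]

# Population variance: summed squared deviations divided by n
print(statistics.pvariance(temps))  # 2

# Sample variance: divided by n - 1, the usual choice when the
# data are a sample drawn from a larger population
print(statistics.variance(temps))   # 2.5
```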

Key Takeaway

Variance measures data spread using squared units. For everyday interpretation, take its square root to get the standard deviation.
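To illustrate the takeaway: taking the square root of the example's variance returns the spread to the original units (degrees Celsius rather than squared degrees). A quick check, assuming the same data as above:

```python
import math
import statistics

temps = [20, 22, 19, 21, 23]  # the example data above

variance = statistics.pvariance(temps)  # 2, in squared degrees
std_dev = math.sqrt(variance)           # back in degrees Celsius

# pstdev computes the standard deviation directly
assert math.isclose(std_dev, statistics.pstdev(temps))
print(round(std_dev, 3))  # 1.414
```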
