Law of Large Numbers: 4 Examples of the Law of Probability
Written by MasterClass
Last updated: Feb 10, 2022 • 4 min read
The law of large numbers suggests that even seemingly random processes adhere to predictable long-run behavior. This law of averages asserts that the larger your sample size, the more closely your results will hew to the initially projected mean. Learn more about this fixture of probability and statistics.
What Is the Law of Large Numbers?
The law of large numbers (LLN) is a theorem of probability theory stating that the average of your results will approach the expected value (or predicted average) if you run the same trial a large number of times. There is both a weak law of large numbers and a strong law of large numbers. The weak law states that the sample average converges to the expected value in probability: large deviations become increasingly unlikely as trials accumulate. The strong law makes a nearly certain claim: the sample average converges to the expected value with probability one.
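For readers who want the precise statements, the two forms can be written compactly (a standard textbook formulation, not taken from this article). For independent, identically distributed random variables with mean $\mu$ and sample mean $\bar{X}_n$:

```latex
% Weak law: convergence in probability
\lim_{n \to \infty} \Pr\!\left( \left| \bar{X}_n - \mu \right| > \varepsilon \right) = 0
\quad \text{for every } \varepsilon > 0

% Strong law: almost-sure convergence
\Pr\!\left( \lim_{n \to \infty} \bar{X}_n = \mu \right) = 1
```

The strong law implies the weak law; the difference lies in how the convergence is guaranteed.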
In rare cases, the law does not apply at all. The Cauchy distribution (a continuous probability distribution), for instance, has no defined expected value, so the averages of Cauchy-distributed samples never settle down, no matter how many trials you run.
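A quick simulation makes this exception vivid. The sketch below (using only Python's standard library; the function names are illustrative) computes running averages of standard Cauchy draws, generated with the inverse-CDF method. Unlike averages of well-behaved variables, these running means keep getting knocked off course by occasional extreme draws.

```python
import math
import random

def cauchy_sample(rng):
    """Draw a standard Cauchy variate via the inverse-CDF method."""
    return math.tan(math.pi * (rng.random() - 0.5))

def running_means(num_trials, seed=0):
    """Return the running average after each of num_trials Cauchy draws."""
    rng = random.Random(seed)
    total = 0.0
    means = []
    for i in range(1, num_trials + 1):
        total += cauchy_sample(rng)
        means.append(total / i)
    return means

# Plotting these means shows them lurching around indefinitely
# rather than converging, because the Cauchy mean is undefined.
means = running_means(100_000)
```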
A Brief History of the Law of Large Numbers
French theoretician Siméon Denis Poisson gave the law of large numbers its name, but it was Jacob Bernoulli who discovered it. Bernoulli was a seventeenth-century Swiss mathematician and the author of the seminal text Ars Conjectandi. Prior to receiving its more general name, the probabilistic precept went by the label “Bernoulli’s theorem.”
Other mathematicians deserve attribution for further contributions to this law. The Russian math wizards Andrey Kolmogorov and Andrey Markov expanded the initial theorem greatly. Pafnuty Chebyshev’s inequality supplies the key bound used to prove the weak law, quantifying how far a sample mean is likely to stray from the true mean. Other related discoveries in the field since, like Bayes’ theorem and the central limit theorem, helped build an even stronger methodology for precise predictions.
How to Use the Law of Large Numbers
Consider the case of a random number generator, and assume you program it to output a whole number from one to ten, each equally likely, every time you run it. Your expected value would be all these numbers added up and then divided by ten ((1+2+3+4+5+6+7+8+9+10) / 10 = 5.5). By this metric, you can predict that the average of your results will approach 5.5 once you start running and averaging trials from your number generator.
If you only run your generator a small number of times, the average of your results will likely deviate from 5.5 (in other words, it’s likely to be noticeably higher or lower than that expected number). But as the trials increase, the running average becomes more and more likely to sit near that 5.5 expected value, so long as your variables remain independent and identically distributed (i.i.d.).
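That convergence is easy to see in a short simulation. The sketch below (using only Python's standard library; the function name is illustrative) averages draws of a whole number from one to ten:

```python
import random

def average_of_rolls(num_trials, seed=0):
    """Average of num_trials draws of a whole number from 1 to 10."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 10) for _ in range(num_trials)) / num_trials

# With only a handful of trials, the average can land well away
# from 5.5; with many trials, it hugs the expected value.
small_run = average_of_rolls(10)
large_run = average_of_rolls(100_000)
```

Running this with different seeds shows the same pattern: short runs scatter widely, long runs cluster tightly around 5.5.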
4 Examples of the Law of Large Numbers
You’ll find examples of the law of large numbers in action throughout the worlds of gambling, finance, and statistical analysis. Consider these four applicable scenarios to better understand how the probability theory works in the real world.
- 1. Business growth rates: For many outside observers, the fluctuations of the stock market seem like truly random events. In reality, once you average over enough companies and enough time, the overall behavior becomes easier to predict. Even the most successful companies often see a regression toward the mean (a growth rate that soars high only to come back to average growth over time). A large sample of companies, such as the Dow Jones Industrial Average or the S&P 500, can help third parties predict the average performance of entire swaths of the market.
- 2. Casino games: Casino owners rely on the law of large numbers to ensure “the house always wins,” as the saying goes. Though individual gamblers might win big at times, over many bets the average payout converges toward the house edge, so the casino reliably takes in more money than it pays out. Players aware of this law and other statistical principles, like the gambler’s fallacy and Bayesian inference (or Bayesian analysis), cannot beat that math in the long run, but they can avoid chasing losses and convince themselves to walk away while still ahead.
- 3. Coin tosses: Every flip gives you a one-in-two chance of calling heads or tails correctly. So long as the person flipping uses a fair coin, repeated tosses obey the law of large numbers: the proportion of heads to tails steadies toward one-to-one. Keep in mind this might not feel like the case in the short term, and in the world of probability, “short term” might extend well beyond one hundred coin tosses.
- 4. Population growth: Suppose a government census taker hopes to estimate the average number of children per family in the country. Based on past analysis, they believe families have two children on average. After a random sampling of three families, the census taker sees one has four children, one has a single child, and the last has nine, an average of nearly five. While this might seem like proof the initial prediction was bunk, the law of large numbers says otherwise: the more families the census taker surveys, the closer the observed average will come to the true two-child figure.
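The coin-toss scenario above can be sketched in a few lines of code (again using only Python's standard library, with illustrative names), tracking the proportion of heads across runs of different lengths:

```python
import random

def heads_proportion(num_flips, seed=0):
    """Fraction of heads in num_flips fair-coin tosses."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

# Short runs can stray well away from one half,
# but long runs settle toward 0.5, just as the law predicts.
short_run = heads_proportion(10)
long_run = heads_proportion(100_000)
```

Trying several seeds drives home the “short term” caveat: ten-flip runs routinely land at 0.3 or 0.7, while hundred-thousand-flip runs rarely stray far from 0.5.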
Learn More
Get the MasterClass Annual Membership for exclusive access to video lessons taught by science luminaries, including Terence Tao, Bill Nye, Neil deGrasse Tyson, Chris Hadfield, Jane Goodall, and more.