Law of large numbers


Figure: Law of Large Numbers example graph.

The Law of Large Numbers (LLN) is a fundamental principle in probability theory and statistics that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and it tends to become closer as more trials are performed. The LLN is crucial to statistics, actuarial science, and gambling, among other fields, where it underpins many theoretical foundations.

Overview

The Law of Large Numbers is divided into two types: the Weak Law of Large Numbers (WLLN) and the Strong Law of Large Numbers (SLLN). Both versions of the law provide conditions under which sample averages converge to the expected value as the sample size increases, but they differ in the mode of convergence and the assumptions required for the convergence to hold.

Weak Law of Large Numbers

The Weak Law of Large Numbers, also known as Khintchine's law after the Russian mathematician Aleksandr Khintchine, states that for any given positive number ε, the probability that the absolute difference between the sample average and the expected value exceeds ε converges to zero as the sample size increases to infinity. Mathematically, if X₁, X₂, ..., Xₙ are i.i.d. (independent and identically distributed) random variables with expected value μ and finite variance (Khintchine's version in fact requires only that the expected value be finite), then for any ε > 0,

\[ P\left(\left|\frac{1}{n}\sum_{i=1}^{n}X_i - \mu\right| > \epsilon\right) \rightarrow 0 \text{ as } n \rightarrow \infty. \]
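The convergence in probability stated above can be checked by simulation. The following minimal Python sketch is an illustration, not part of the original article; the Uniform(0, 1) distribution (so μ = 0.5), ε = 0.05, the sample sizes, and the trial count are all arbitrary choices. It estimates the probability that the sample average deviates from μ by more than ε and shows that this probability shrinks as n grows.

import random

def deviation_probability(n, epsilon=0.05, trials=2000, mu=0.5):
    # Estimate P(|sample mean of n Uniform(0,1) draws - mu| > epsilon) by Monte Carlo.
    exceed = 0
    for _ in range(trials):
        sample_mean = sum(random.random() for _ in range(n)) / n
        if abs(sample_mean - mu) > epsilon:
            exceed += 1
    return exceed / trials

for n in (10, 100, 1000, 10000):
    print(n, deviation_probability(n))

As the Weak Law predicts, the printed estimates decrease toward zero as the sample size increases.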

Strong Law of Large Numbers

The Strong Law of Large Numbers states that the sample averages almost surely converge to the expected value; that is, the probability that the sequence of sample averages converges to the expected value equals one. Formally, if X₁, X₂, ... is a sequence of i.i.d. random variables with finite expected value μ, then

\[ P\left(\lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^{n}X_i = \mu\right) = 1. \]
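Almost-sure convergence concerns a single infinite sequence of trials rather than the deviation probability at each fixed n. A minimal Python sketch (again an illustration, not part of the article; fair coin flips with μ = 0.5 and arbitrary checkpoint sizes are assumed) tracks the running average along one simulated path.

import random

def running_averages(n_max=100000, checkpoints=(10, 100, 1000, 10000, 100000)):
    # Follow one path of fair coin flips (values 0 or 1) and record the running average.
    total = 0
    results = []
    for i in range(1, n_max + 1):
        total += random.randint(0, 1)
        if i in checkpoints:
            results.append((i, total / i))
    return results

for n, avg in running_averages():
    print(n, avg)

Along this single path the running average settles near 0.5, consistent with convergence to μ holding with probability one.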

Applications

The Law of Large Numbers has wide-ranging applications. In insurance, it helps in estimating the number of claims and the necessary reserves. In finance, it is used to model the returns of large portfolios. In polling and survey methodology, it underpins the accuracy of sample estimates. In quality control and manufacturing, it justifies the use of sample averages to estimate population parameters.

History

The concept of the Law of Large Numbers was first introduced by Jacob Bernoulli in the late 17th century, in his work "Ars Conjectandi". Bernoulli demonstrated a special case of the law, showing that the probability of the difference between the relative frequency of an event in a series of independent trials and the true probability of the event becomes arbitrarily small as the number of trials increases. It was later generalized and formalized by other mathematicians, including Siméon Denis Poisson, Andrey Kolmogorov, and Aleksandr Khintchine.
