Laws of Large Numbers

Author

Oren Bochman

Published

Friday, January 2, 2026

Many important mathematical results exist in a number of increasingly general versions, proved over time. Some are easier to prove than others; the more general versions are often harder to prove, but not always, since greater abstraction can sometimes make a proof easier. It is challenging for newcomers to approach these results and to understand the relationships between the various versions and, perhaps more significantly, their proofs, which may have been carried out with different techniques many years apart. One big help is to understand the historical context in which these results were proved: the motivations of later mathematicians were often shaped by the work of their predecessors, as well as by the discovery of shortcomings in earlier results as greater rigour became the norm in mathematics.

Perhaps the first version of this result is due to the Italian physician and mathematician Girolamo Cardano (1501-1576), who also published the first solution of the cubic. In his Liber de Ludo Aleae (Book on Games of Chance), written around 1564, he stated a version of the law of large numbers in the context of gambling: he observed that as the number of trials increases, the relative frequency of an event approaches its theoretical probability. However, Cardano did not provide a formal proof of this observation, and after he got into trouble with the Inquisition his non-medical works were banned and only published posthumously, in 1663. This was an age in which mathematics lacked much rigour and was grounded in words and images rather than in the algebra and equations we are used to today. I would characterize it as an age in which science was primarily about organizing old results and less about new discoveries and proofs. Negative numbers were not even accepted as valid mathematical objects at the time.

Next up was the Swiss mathematician Jakob Bernoulli (1655-1705). His version of the law of large numbers was published posthumously in his Ars Conjectandi in 1713. It is worth mentioning that this is the version taught in most introductory probability courses; it is sometimes called the weak law of large numbers and is easier to prove than the other laws of large numbers. It is called weak because it deals with convergence in probability rather than almost sure convergence. Another point is that it makes the strong assumption that the trials are independent and identically distributed (i.i.d.), one which may not hold in many practical situations and which was eventually discovered to be unnecessary.

In it he proved the following theorem, now called the Bernoulli law of large numbers, which assumes that the trials are independent and identically distributed (i.i.d.).

Let $X_1, X_2, \ldots, X_n$ be a sequence of i.i.d. random variables with common mean $\mu$ and variance $\sigma^2 < \infty$. Then for any $\varepsilon > 0$,

$$P\left(\left|\frac{1}{n} \sum_{i=1}^n X_i - \mu \right| \geq \varepsilon\right) \to 0 \quad \text{as } n \to \infty.$$

This means that the sample average converges in probability to the expected value as the number of trials increases. So far so good.
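The weak law is easy to see empirically. The following sketch (my own illustration, not from the original sources) repeats an experiment of $n$ fair-coin flips many times and measures how often the sample mean deviates from $\mu = 0.5$ by at least $\varepsilon = 0.05$; that fraction should shrink toward zero as $n$ grows.

```python
import random

random.seed(42)

def sample_mean(n):
    # average of n fair-coin flips (Bernoulli trials with mu = 0.5)
    return sum(random.randint(0, 1) for _ in range(n)) / n

def deviation_fraction(n, eps=0.05, reps=500):
    # fraction of repeated experiments whose sample mean
    # deviates from mu = 0.5 by at least eps
    return sum(abs(sample_mean(n) - 0.5) >= eps for _ in range(reps)) / reps

for n in (10, 100, 1000):
    # the printed fraction shrinks as n grows
    print(n, deviation_fraction(n))
```

The choice of seed, $\varepsilon$, and repetition count here are arbitrary; any reasonable values show the same qualitative behaviour.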

Andrey Markov


Algebra

600 = 2x + y
350 = x + y

Subtracting the second equation from the first: 600 - 350 = (2x + y) - (x + y) = x, so x = 250 and y = 600 - 2(250) = 100.


p + d = 11
p + c = 13
d + c = 3

Adding the first two equations: 2p + d + c = 24, so 2p = 24 - 3 = 21 and p = 10.5. Then d = 11 - p = 0.5, c = 13 - p = 2.5, and p + d + c = 10.5 + 0.5 + 2.5 = 13.5.
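As a quick sanity check of the arithmetic in the two systems above (a sketch, not part of the original working):

```python
# System 1: 600 = 2x + y, 350 = x + y  ->  x = 250, y = 100
x, y = 250, 100
assert 2 * x + y == 600
assert x + y == 350

# System 2: p + d = 11, p + c = 13, d + c = 3
# -> p = 10.5, d = 0.5, c = 2.5, and p + d + c = 13.5
p, d, c = 10.5, 0.5, 2.5
assert p + d == 11 and p + c == 13 and d + c == 3
assert p + d + c == 13.5

print("both systems check out")
```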


Given a < b and c < d with

2c = a + b
2b = c + d

the four points are ordered a < c < b < d on the number line (c is the midpoint of a and b, and b is the midpoint of c and d):

a —— c —— b —— d,   with d - a = 60.

To find b - c, rewrite the two midpoint conditions as 2c - b = a and 2b - c = d, then subtract the first from the second:

(2b - c) - (2c - b) = d - a
3(b - c) = 60
b - c = 20
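The midpoint conditions 2c = a + b and 2b = c + d, together with d - a = 60, pin down b - c; solving them symbolically gives b - c = 20 regardless of where a sits. A small check with concrete values (my own sketch, using exact rational arithmetic):

```python
from fractions import Fraction

# any pair with d - a = 60 works; take a = 0 for simplicity
a, d = Fraction(0), Fraction(60)

# solving 2c = a + b and 2b = c + d for b and c:
b = (a + 2 * d) / 3
c = (2 * a + d) / 3

# confirm the midpoint conditions hold
assert 2 * c == a + b
assert 2 * b == c + d

print(b - c)  # -> 20
```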

Citation

BibTeX citation:
@online{bochman2026,
  author = {Bochman, Oren},
  title = {Laws of {Large} {Numbers}},
  date = {2026-01-02},
  url = {https://orenbochman.github.io/posts/2026/2026-01-02-laws-of-large-numbers/},
  langid = {en}
}
For attribution, please cite this work as:
Bochman, Oren. 2026. “Laws of Large Numbers.” January 2, 2026. https://orenbochman.github.io/posts/2026/2026-01-02-laws-of-large-numbers/.