Law of large numbers



In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.

The LLN is important because it guarantees stable long-term results for the averages of some random events. For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins. Any winning streak by a player will eventually be overcome by the parameters of the game.
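The casino example can be sketched with a short simulation (my own illustration, not part of the article), assuming the standard single-zero European roulette rules: a one-unit bet on red wins +1 with probability 18/37 and loses −1 otherwise, so the expected value per spin is −1/37 (about −2.7%).

```python
import random

def simulate_red_bets(n_spins, seed=0):
    """Simulate n_spins one-unit bets on red at European roulette.

    A single-zero wheel has 37 pockets, 18 of them red, so each
    bet wins +1 with probability 18/37 and loses -1 otherwise.
    Returns the average result per spin.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n_spins):
        total += 1 if rng.random() < 18 / 37 else -1
    return total / n_spins

# A short session can easily end positive, but over many spins the
# average result per spin tracks the house edge of -1/37:
print(simulate_red_bets(100))        # small sample: can be anything
print(simulate_red_bets(1_000_000))  # large sample: close to -1/37
```

The player's winning streaks are not "cancelled" by the wheel; they are simply diluted, because the per-spin average is dominated by the ever-growing number of spins.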

It is important to remember that the law only applies (as the name indicates) when a large number of observations is considered. There is no principle that
 * a small number of observations will coincide with the expected value (the results of opening a single Pro Kit Box can deviate significantly from its official drop rates), or that
 * a streak of one value will immediately be "balanced" by the others. (This is the "gambler's fallacy": for example, even if a Tech Box grants Early and Initial Tech cards with the same probability, a streak of 20 Early Tech cards does not make an Initial Tech card, let alone a streak of them, any more likely to appear next.)

Forms
There are two different versions of the law of large numbers, called the weak and the strong law of large numbers. Both versions state that, given a sequence of independent and identically distributed random variables $$X_1, X_2, \ldots = (X_n)_{n \in \mathbb{N}} $$ with expected values $$\mathrm E[X_1] = \mathrm E[X_2] = \ldots = \mu$$, the sample average


 * $$\overline{X}_n=\frac1n(X_1+\cdots+X_n) $$

converges to the expected value


 * $$\overline{X}_n \, \to \, \mu \qquad\text{for}\qquad n \to \infty$$.

The two versions differ only in the mode of this convergence: the weak law asserts convergence in probability, while the strong law asserts almost sure convergence.
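As a small numerical sketch of this convergence (my own illustration, using a fair six-sided die, for which $$\mu = (1+2+\cdots+6)/6 = 3.5$$), the sample average approaches 3.5 as the number of rolls grows:

```python
import random

def sample_average(n_rolls, seed=42):
    """Roll a fair six-sided die n_rolls times and return the
    sample average (X_1 + ... + X_n) / n."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n_rolls)) / n_rolls

# The expected value of a single roll is 3.5; larger samples
# yield averages that cluster ever more tightly around it:
for n in (10, 1_000, 100_000):
    print(n, sample_average(n))
```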