Assume \(X_1, X_2, \cdots, X_n\) are i.i.d. (independent and identically distributed) RVs with mean \(\mu\) and variance \(\sigma^2\).
\[ \bar{X}_n \stackrel{\text{def}}{=} \frac{X_1+X_2+\cdots+X_n}{n}=\frac{1}{n}\sum_{i=1}^n X_i \]
Loosely speaking, the law of large numbers (LLN) states that with high probability, the sample mean \((\bar{X}_n)\) of many i.i.d. random variables is very close to the true mean \((\mu)\).
Note that \(\bar{X}_n\) is not a number, but a random variable.
\[ \small{ \begin{aligned} \bar{X}_n&=\frac{X_1+X_2+\cdots+X_n}{n} \\ \\ \text{E}[\bar{X}_n]&=\frac{\text{E}[X_1]+\text{E}[X_2]+\cdots+\text{E}[X_n]}{n}=\frac{n\mu}{n}=\mu \\ \\ \text{var}(\bar{X}_n)&=\frac{\text{var}(X_1)+\text{var}(X_2)+\cdots+\text{var}(X_n)}{n^2}=\frac{n\sigma^2}{n^2}=\frac{\sigma^2}{n} \\ \end{aligned} } \]
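As a quick sanity check (my addition, not part of the original notes), both identities can be verified by simulation. The exponential distribution, seed, and parameters below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice (not from the notes): exponential RVs with
# mean mu = 2 and variance sigma^2 = 4.
mu, sigma2 = 2.0, 4.0
n, trials = 50, 200_000

# Each row is one realization of (X_1, ..., X_n); each row mean is
# one draw of X_bar_n.
xbar = rng.exponential(scale=mu, size=(trials, n)).mean(axis=1)

print(f"E[X_bar_n]  : theory {mu:.4f}, simulated {xbar.mean():.4f}")
print(f"var(X_bar_n): theory {sigma2 / n:.4f}, simulated {xbar.var():.4f}")
```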
Note that the variance calculation above relies on the independence of the \(X_i\). Applying the Chebyshev inequality to \(\bar{X}_n\), we obtain
\[ \text{P}\big(|\bar{X}_n - \mu| \geq \epsilon \big) \leq \frac{\sigma^2}{n\epsilon^2}, \;\; \text{for all $\epsilon > 0$.} \]
For any fixed \(\epsilon\), the right-hand side of the inequality approaches zero as \(n\) increases.
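To see how tight or loose the bound is in practice, here is a small sketch (my illustration; the Uniform(0, 1) distribution and \(\epsilon = 0.1\) are arbitrary choices) comparing the simulated probability with the Chebyshev upper bound \(\sigma^2/(n\epsilon^2)\) for growing \(n\).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choice: Uniform(0, 1) RVs, so mu = 0.5 and sigma^2 = 1/12.
mu, sigma2 = 0.5, 1 / 12
eps, trials = 0.1, 20_000

for n in (10, 100, 1000):
    # Simulate `trials` independent copies of X_bar_n.
    xbar = rng.uniform(size=(trials, n)).mean(axis=1)
    empirical = np.mean(np.abs(xbar - mu) >= eps)
    bound = sigma2 / (n * eps**2)
    print(f"n={n:4d}  simulated P(|X_bar_n - mu| >= {eps}): {empirical:.4f}"
          f"  Chebyshev bound: {bound:.4f}")
```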
Example: keep tossing a fair coin, marking a blue dot for each head and a red dot for each tail.
Let \(X_1, X_2, \cdots, X_n\) be i.i.d. Bernoulli RVs with \(p=0.5\).
\[ \mu = \text{E}[X_i] = 0.5, \;\;\;\;\;\;\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i \]
The proportion of heads (\(\bar{X}_n\)) gets closer to 0.5 as \(n\) increases.
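A short simulation of this experiment (my sketch; the seed and checkpoints are arbitrary) tracks the running proportion of heads:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fair coin: i.i.d. Bernoulli(0.5) tosses, 1 = heads, 0 = tails.
n = 100_000
tosses = rng.integers(0, 2, size=n)

# Running proportion of heads: X_bar_1, X_bar_2, ..., X_bar_n.
running_prop = np.cumsum(tosses) / np.arange(1, n + 1)

for k in (10, 100, 1000, 10_000, 100_000):
    print(f"after {k:6d} tosses: proportion of heads = {running_prop[k - 1]:.4f}")
```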
The weak law of large numbers (WLLN) makes this precise. Let \(X_1, X_2, \cdots, X_n\) be i.i.d. RVs with mean \(\mu\). Then
\[ \lim_{n\rightarrow +\infty}\text{P}\big(|\bar{X}_n - \mu| < \epsilon\big)=1, \;\; \text{for every $\epsilon >0$}. \]
We say the sequence \(\bar{X}_n\) converges to \(\mu\) in probability.
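As a numerical illustration of convergence in probability (my addition, reusing the Bernoulli(0.5) setup from above; \(\epsilon = 0.05\) is an arbitrary tolerance), the estimated probability \(\text{P}(|\bar{X}_n - \mu| < \epsilon)\) tends to 1 as \(n\) grows:

```python
import numpy as np

rng = np.random.default_rng(3)

mu, eps, trials = 0.5, 0.05, 50_000  # fair coin; eps is illustrative

for n in (10, 100, 1000, 10_000):
    # The number of heads in n fair tosses is Binomial(n, 0.5),
    # so X_bar_n = heads / n.
    xbar = rng.binomial(n, mu, size=trials) / n
    prob = np.mean(np.abs(xbar - mu) < eps)
    print(f"n={n:6d}  P(|X_bar_n - mu| < {eps}) ~ {prob:.4f}")
```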


Example: keep tossing a fair six-sided die. The running average \(\bar{X}_n\) of the outcomes converges in probability to \(\mu = 3.5\) as \(n\) grows.
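A matching sketch for the die example (again my illustration, not from the notes): since each face is equally likely, \(\mu = (1+2+\cdots+6)/6 = 3.5\), and the running average of the rolls settles near that value.

```python
import numpy as np

rng = np.random.default_rng(4)

# Fair six-sided die: i.i.d. outcomes uniform on {1, ..., 6}, so mu = 3.5.
n = 100_000
rolls = rng.integers(1, 7, size=n)

# Running average X_bar_k after each roll.
running_mean = np.cumsum(rolls) / np.arange(1, n + 1)

for k in (10, 100, 1000, 10_000, 100_000):
    print(f"after {k:6d} rolls: running average = {running_mean[k - 1]:.4f}")
```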