Expectation#

Definition: Expectation (Discrete Case)

The expectation of a discrete random variable \(X\) with support \(S = \{s_1, s_2, \dotsc\}\) and probability mass function \(p\) is the value of the series

\[ \sum_{i=1}^{\infty} s_i \cdot p(s_i), \]

provided the series converges absolutely, i.e. \(\sum_{i} |s_i| \cdot p(s_i) < \infty\); otherwise the expectation does not exist.

Note: Finite Support

If the support \(S\) is finite, then the Expectation of \(X\) reduces to the sum

\[ \sum_{i = 1}^{|S|} s_i \cdot p(s_i) \]
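As a quick numerical illustration of the finite-support sum (the fair-die example and variable names here are not from the text, just a minimal sketch):

```python
# Expectation of a fair six-sided die, computed directly
# from the finite-support sum: sum of s * p(s) over the support.
support = [1, 2, 3, 4, 5, 6]
pmf = {s: 1 / 6 for s in support}  # uniform probability mass function

expectation = sum(s * pmf[s] for s in support)
print(expectation)  # ≈ 3.5
```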

Definition: Expectation (Continuous Case)

The expectation of a continuous random variable \(X\) with probability density function \(f\) is the integral of \(x f(x)\) from \(-\infty\) to \(+\infty\):

\[ \int_{-\infty}^{+\infty} x f(x) \mathop{\mathrm{d}x} \]
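The continuous case can be checked numerically. The sketch below (an assumed example, not from the text) approximates \(\mathbb{E}[X]\) for an exponential distribution with rate 2 by integrating \(x f(x)\) with the trapezoidal rule over a truncated range; the exact value is \(1/\text{rate} = 0.5\):

```python
import math

rate = 2.0

def f(x):
    """Density of the Exponential(rate) distribution."""
    return rate * math.exp(-rate * x) if x >= 0 else 0.0

# Trapezoidal rule on [0, 50]; the density's tail beyond 50 is negligible.
a, b, n = 0.0, 50.0, 200_000
h = (b - a) / n
xs = [a + i * h for i in range(n + 1)]
ys = [x * f(x) for x in xs]
approx = h * (ys[0] / 2 + sum(ys[1:-1]) + ys[-1] / 2)
print(approx)  # close to 1 / rate = 0.5
```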

Notation

The Expectation of a random variable \(X\) is usually denoted in one of the following ways:

\[ EX \qquad \mathrm{E}[X] \qquad \mathrm{E}(X) \qquad \mathbb{E}(X) \qquad \langle X \rangle \qquad \bar{X} \qquad \mu_X \]

Note

The Expectation of a random variable may also be called its expected value or mean.

Properties#

Theorem: Range of the Expected Value

The Expectation of a random variable \(X\) always lies between its infimum and supremum:

\[ \mathbb{E}[X] \in [\inf X, \sup X] \]
Proof

TODO
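A small sanity check of the range property (the pmf below is an arbitrary assumed example): since the weights are nonnegative and sum to 1, the weighted average cannot escape the support's range.

```python
# E[X] is a convex combination of the support values,
# so it must lie between min and max of the support.
support = [-2, 0, 5]
probs = [0.5, 0.3, 0.2]

ex = sum(s * p for s, p in zip(support, probs))
assert min(support) <= ex <= max(support)
print(ex)  # ≈ 0.0
```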

Theorem: Linearity of Expectation

Expectation is a linear operation: for all Random Variables \(X\) and \(Y\) and all \(\lambda, \mu \in \mathbb{R}\), we have

\[ \mathbb{E}[\lambda X + \mu Y] = \lambda \, \mathbb{E}[X] + \mu \, \mathbb{E}[Y] \]
Proof

TODO
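Linearity can be verified exactly on a small joint distribution. The sketch below (dice and coefficients are assumed for illustration) computes both sides of the identity over a joint pmf; note linearity holds regardless of independence, even though this example happens to use an independent joint pmf:

```python
import itertools

support = range(1, 7)
lam, mu = 2.0, -3.0

# Joint pmf of two fair dice (independent, for simplicity).
joint = {(x, y): 1 / 36 for x, y in itertools.product(support, support)}

lhs = sum((lam * x + mu * y) * p for (x, y), p in joint.items())
ex = sum(x * p for (x, y), p in joint.items())
ey = sum(y * p for (x, y), p in joint.items())
print(lhs, lam * ex + mu * ey)  # both ≈ -3.5
```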

Theorem: Law of the Unconscious Statistician (LOTUS)

Let \(X\) be a random variable and let \(g: \mathbb{R} \to \mathbb{R}\) be a real function.

If \(X\) is discrete with support \(S = \{x_1, x_2, \dotsc \}\) and probability mass function \(p\), the Expectation of \(g(X)\) is given by the value of the following series:

\[ \mathbb{E}[g(X)] = \sum_{i} g(x_i) \cdot p(x_i) \]

If \(X\) is continuous with probability density function \(f\), then the Expectation of \(g(X)\) is given by the following integral:

\[ \mathbb{E}[g(X)] = \int_{-\infty}^{+\infty} g(x) f(x) \mathop{\mathrm{d}x} \]
Proof

TODO
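LOTUS means \(\mathbb{E}[g(X)]\) can be computed directly over the support of \(X\), without first deriving the distribution of \(g(X)\). A minimal sketch (the die and \(g(x) = x^2\) are assumed examples):

```python
# E[X^2] for a fair die via LOTUS: sum g(x) * p(x) over the support,
# without working out the pmf of X^2 itself.
support = [1, 2, 3, 4, 5, 6]
p = 1 / 6

e_g = sum(x ** 2 * p for x in support)
print(e_g)  # 91/6 ≈ 15.1667
```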

Independent Random Variables#

Definition: Independent Random Variables

Two Random Variables \(X\) and \(Y\) are independent if and only if the events \(\{X \le x\}\) and \(\{Y \le y\}\) are independent for all \(x, y \in \mathbb{R}\).

Properties#

Theorem: Expectation of the Product of Independent Random Variables

The Expectation of the product of two independent Random Variables \(X\) and \(Y\) is equal to the product of their expectations:

\[ \mathbb{E}[X \cdot Y] = \mathbb{E}[X] \cdot \mathbb{E}[Y] \]
Proof

TODO
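For independent variables the joint pmf factorizes, which is what makes the product formula work. A quick check with two independent fair dice (an assumed example, not from the text):

```python
import itertools

support = range(1, 7)

# Independence: the joint pmf is the product of the marginals.
joint = {(x, y): (1 / 6) * (1 / 6)
         for x, y in itertools.product(support, support)}

exy = sum(x * y * p for (x, y), p in joint.items())
ex = sum(x / 6 for x in support)
print(exy, ex * ex)  # both ≈ 12.25
```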