# Expectation
Definition: Expectation (Discrete Case)
The expectation of a discrete random variable \(X\) with support \(S = \{s_1, s_2, \dotsc\}\) and probability mass function \(p\) is the value of the series
\[
\mathbb{E}[X] = \sum_{i=1}^{\infty} s_i \, p(s_i),
\]
if it exists.
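As a sketch of the series case, the following Python snippet approximates this sum for a geometric random variable on \(\{1, 2, 3, \dotsc\}\) by truncating the series; the distribution and the success probability \(q = 0.5\) are chosen here for illustration and are not from the text.

```python
# Approximate E[X] for a geometric random variable on {1, 2, 3, ...}
# with success probability q = 0.5, so p(k) = q * (1 - q)**(k - 1).
# The exact value of the series sum_k k * p(k) is 1 / q = 2.

q = 0.5

def p(k: int) -> float:
    """Probability mass function of the geometric distribution."""
    return q * (1 - q) ** (k - 1)

# Truncate the infinite series where the tail is negligible.
expectation = sum(k * p(k) for k in range(1, 200))

print(expectation)  # close to the exact value 2.0
```

The truncation point matters only insofar as the tail of the series must be small; here the terms decay geometrically, so 200 terms are far more than enough.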
Note: Finite Support
If the support \(S = \{s_1, \dotsc, s_n\}\) is finite, then the Expectation of \(X\) reduces to the sum
\[
\mathbb{E}[X] = \sum_{i=1}^{n} s_i \, p(s_i).
\]
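The finite case can be checked directly. A minimal sketch in Python, using a fair six-sided die as the (illustrative, not from the text) example:

```python
# Expectation of a fair six-sided die: finite support S = {1, ..., 6}
# with uniform probability mass function p(s) = 1/6.
support = [1, 2, 3, 4, 5, 6]
p = {s: 1 / 6 for s in support}

expectation = sum(s * p[s] for s in support)

print(expectation)  # ≈ 3.5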
Definition: Expectation (Continuous Case)
The expectation of a continuous random variable \(X\) with probability density function \(f\) is the integral of \(x f(x)\) from \(-\infty\) to \(+\infty\):
\[
\mathbb{E}[X] = \int_{-\infty}^{+\infty} x \, f(x) \, dx.
\]
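The integral can be approximated numerically. The sketch below uses a midpoint Riemann sum for an exponential random variable with rate 1 (density \(f(x) = e^{-x}\) for \(x \ge 0\)); the distribution, truncation interval, and step count are illustrative choices, not from the text.

```python
import math

# Numerically approximate E[X] = ∫ x f(x) dx for an exponential random
# variable with rate 1, whose density is f(x) = exp(-x) for x >= 0.
# The exact expectation is 1.

def f(x: float) -> float:
    """Density of the exponential distribution with rate 1."""
    return math.exp(-x) if x >= 0 else 0.0

# Midpoint Riemann sum on [0, 50]; the integrand is negligible beyond that.
n = 200_000
a, b = 0.0, 50.0
h = (b - a) / n
expectation = sum(
    (a + (i + 0.5) * h) * f(a + (i + 0.5) * h) * h for i in range(n)
)

print(expectation)  # close to the exact value 1.0
```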
Notation
The Expectation of a random variable \(X\) is usually denoted in one of the following ways:
\[
\mathbb{E}[X], \quad \mathbb{E}(X), \quad \mathbb{E}X, \quad \mu_X.
\]
Note
The Expectation of a random variable may also be called its expected value or mean.
## Properties
Theorem: Range of the Expected Value
The Expectation of a random variable \(X\) always lies between the infimum and the supremum of its support \(S\):
\[
\inf S \;\le\; \mathbb{E}[X] \;\le\; \sup S.
\]
Proof
TODO
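The bound is easy to sanity-check numerically. A minimal sketch on a biased die, where the probability weights are arbitrary illustrative values:

```python
# Sanity check the range bound min(S) <= E[X] <= max(S) on a biased die.
support = [1, 2, 3, 4, 5, 6]
probs = [0.4, 0.2, 0.1, 0.1, 0.1, 0.1]  # an arbitrary biased pmf

expectation = sum(s * q for s, q in zip(support, probs))

print(min(support) <= expectation <= max(support))  # True
```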
Theorem: Linearity of Expectation
Expectation is a linear operation: for all Random Variables \(X\) and \(Y\) and all \(\lambda, \mu \in \mathbb{R}\), we have
\[
\mathbb{E}[\lambda X + \mu Y] = \lambda \, \mathbb{E}[X] + \mu \, \mathbb{E}[Y].
\]
Proof
TODO
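Notably, linearity does not require independence. The sketch below checks the identity on sample means (which satisfy the same algebra) for two deliberately dependent dice; the dependence rule and the coefficients 2 and 3 are illustrative choices.

```python
import random

# Empirically check linearity of expectation:
# E[2X + 3Y] = 2 E[X] + 3 E[Y], even when Y depends on X.
random.seed(0)

n = 200_000
xs = [random.randint(1, 6) for _ in range(n)]
ys = [x if x <= 3 else random.randint(1, 6) for x in xs]  # Y depends on X

mean = lambda v: sum(v) / len(v)
lhs = mean([2 * x + 3 * y for x, y in zip(xs, ys)])
rhs = 2 * mean(xs) + 3 * mean(ys)

print(abs(lhs - rhs) < 1e-9)  # True: the two sides agree exactly
```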
Theorem: Law of the Unconscious Statistician (LOTUS)
Let \(X\) be a random variable and let \(g: \mathbb{R} \to \mathbb{R}\) be a real function.
If \(X\) is discrete with support \(S = \{x_1, x_2, \dotsc \}\) and probability mass function \(p\), the Expectation of \(g(X)\) is given by the value of the following series:
\[
\mathbb{E}[g(X)] = \sum_{i=1}^{\infty} g(x_i) \, p(x_i).
\]
If \(X\) is continuous with probability density function \(f\), then the Expectation of \(g(X)\) is given by the following integral:
\[
\mathbb{E}[g(X)] = \int_{-\infty}^{+\infty} g(x) \, f(x) \, dx.
\]
Proof
TODO
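The point of LOTUS is that \(\mathbb{E}[g(X)]\) can be computed from the distribution of \(X\) without ever deriving the distribution of \(g(X)\). A sketch for a fair die and \(g(x) = x^2\) (an illustrative choice), cross-checked against the distribution of \(Y = X^2\):

```python
# LOTUS for a fair die: compute E[X**2] directly from the pmf of X,
# without deriving the distribution of g(X) = X**2.
support = [1, 2, 3, 4, 5, 6]
p = 1 / 6

e_g = sum(x ** 2 * p for x in support)  # LOTUS: sum of g(x) * p(x)

# Cross-check against the distribution of Y = X**2, which is uniform
# on {1, 4, 9, 16, 25, 36}.
e_y = sum(y * p for y in (x ** 2 for x in support))

print(e_g)  # 91/6 ≈ 15.1667, and e_g == e_y
```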
# Independent Random Variables
Definition: Independent Random Variables
Two Random Variables \(X\) and \(Y\) are independent if and only if the events \(\{X \le x\}\) and \(\{Y \le y\}\) are independent for all \(x, y \in \mathbb{R}\), that is,
\[
P(X \le x, \, Y \le y) = P(X \le x) \, P(Y \le y).
\]
## Properties
Theorem: Expectation of the Product of Independent Random Variables
The Expectation of the product of two independent Random Variables \(X\) and \(Y\) is equal to the product of their expectations:
\[
\mathbb{E}[XY] = \mathbb{E}[X] \, \mathbb{E}[Y].
\]
Proof
TODO
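For two independent fair dice the identity can be verified exactly by enumerating all 36 equally likely outcome pairs; the dice are an illustrative example, not from the text.

```python
# Exact check of E[XY] = E[X] E[Y] for two independent fair dice:
# enumerate the 36 equally likely outcome pairs.
support = [1, 2, 3, 4, 5, 6]

e_x = sum(support) / 6                                   # 3.5
e_y = e_x                                                # same distribution
e_xy = sum(x * y for x in support for y in support) / 36

print(e_xy)  # 12.25, equal to 3.5 * 3.5
```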