
Random Variables#

Definition: Random Variable

Suppose we have an experiment with sample space \(\Omega\).

A random variable is a real-valued function \(X: \Omega \to \mathbb{R}\).

Definition: Discrete and Continuous Random Variables

If the image of \(X\) is countable, then we call \(X\) a discrete random variable. Not every non-discrete random variable is continuous, however; continuous random variables are defined more carefully below via the cumulative distribution function.

Notation

It is typical to denote random variables via capital letters.

A random variable is just a way to assign a value to each outcome in the sample space of an experiment.

Example

Consider the experiment of tossing a coin twice. The sample space is \(\Omega = \{\mathrm{TT}, \mathrm{HT}, \mathrm{TH}, \mathrm{HH}\}\). One random variable we could define is the number of heads \(X\) which appear in the outcome. We would then have

\[ \begin{aligned} X(\mathrm{TT}) = 0 \\ X(\mathrm{TH}) = 1 \\ X(\mathrm{HT}) = 1 \\ X(\mathrm{HH}) = 2 \end{aligned} \]
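The mapping above can be written out directly as a function on the sample space. This is a minimal Python sketch; the outcome strings and the name `X` are just illustrative:

```python
# Sample space of two coin tosses, written as strings of H and T.
omega = ["TT", "TH", "HT", "HH"]

# The random variable X assigns to each outcome its number of heads.
def X(outcome):
    return outcome.count("H")

# The value of X at each outcome in the sample space.
values = {w: X(w) for w in omega}
# values == {"TT": 0, "TH": 1, "HT": 1, "HH": 2}
```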

Cumulative Distribution Functions#

Definition: Cumulative Distribution Function (CDF)

The cumulative distribution function of a random variable \(X: \Omega \to \mathbb{R}\) is the function \(F_X: \mathbb{R} \to [0;1]\) which to each \(x \in \mathbb{R}\) assigns the probability that \(X\) takes a value less than \(x\) when the experiment is carried out.

\[ F_X(x) \overset{\text{def}}{=} P(X \lt x) \]

Theorem: Probability in an Interval

The probability that the value of a continuous random variable \(X\) falls in the interval \([a; b)\) is the difference of its [[Random Variables#Cumulative Distribution Functions|cumulative distribution function]] evaluated at \(b\) and at \(a\):

\[ P(a \le X \lt b) = F(b) - F(a) \]
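As a concrete instance, the sketch below assumes an exponentially distributed \(X\) with CDF \(F(x) = 1 - e^{-x}\) for \(x \ge 0\); this particular distribution is an illustrative assumption, not something fixed by the text above. The identity makes interval probabilities add up consistently when an interval is split:

```python
import math

# CDF of the exponential distribution with rate 1 (illustrative choice).
def F(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def prob_in_interval(a, b):
    # P(a <= X < b) = F(b) - F(a), by the theorem above.
    return F(b) - F(a)

# Splitting [0; 2) at 1 must not change the total probability.
total = prob_in_interval(0, 2)
parts = prob_in_interval(0, 1) + prob_in_interval(1, 2)
```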
Proof

TODO

Theorem: Monotonicity of the CDF

The cumulative distribution function of every continuous random variable is non-decreasing.

Proof

TODO

Theorem: Limits of the CDF

The [limits](../../Analysis/Real%20Analysis/Real%20Functions/Limits%20%28Real%20Functions%29.md) of the cumulative distribution function \(F_X\) of each continuous random variable \(X\) for \(x \to -\infty\) and \(x \to +\infty\) are \(0\) and \(1\), respectively.

\[ \begin{aligned} \lim_{x \to -\infty} F_X(x) = 0 \\ \lim_{x \to +\infty} F_X(x) = 1 \end{aligned} \]
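Both the monotonicity and the limit behaviour can be checked numerically. The sketch below uses the logistic CDF \(F(x) = 1 / (1 + e^{-x})\) purely as an assumed example distribution:

```python
import math

# Logistic CDF: strictly increasing, with limits 0 and 1 at the infinities.
def F(x):
    return 1.0 / (1.0 + math.exp(-x))

xs = [-50, -10, -1, 0, 1, 10, 50]
ys = [F(x) for x in xs]

# Monotonicity: the values are non-decreasing along increasing x.
is_monotone = all(y1 <= y2 for y1, y2 in zip(ys, ys[1:]))
```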
Proof

TODO

Discrete Random Variables#

Definition: Discrete Random Variables

A random variable is discrete if and only if its image is countable.

Essentially, a discrete random variable can take on either finitely many values or countably infinitely many values, which can be enumerated as a sequence \(x_1, x_2, \dotsc\).

Example

Consider again the experiment of tossing a coin twice, with sample space \(\Omega = \{\mathrm{TT}, \mathrm{HT}, \mathrm{TH}, \mathrm{HH}\}\), and the random variable \(X\) counting the number of heads in the outcome. This is a discrete random variable, since it can take on only three possible values, namely \(0\), \(1\) and \(2\).

Probability Mass Functions#

Definition: Probability Mass Function

The probability mass function of a discrete random variable \(X: \Omega \to \mathbb{R}\) is the function \(p_X: \mathbb{R} \to [0;1]\) which to each possible value \(x \in X(\Omega)\) of \(X\) assigns the probability that \(X\) is equal to \(x\) when the experiment is carried out.

\[ p_X(x) \overset{\text{def}}{=} P(X = x) \]
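For the two-coin example above, the probability mass function can be tabulated directly, assuming a fair coin so that each of the four outcomes has probability \(1/4\):

```python
from fractions import Fraction

# p_X for the number of heads in two fair coin tosses.
p_X = {
    0: Fraction(1, 4),  # only TT
    1: Fraction(1, 2),  # TH or HT
    2: Fraction(1, 4),  # only HH
}

# The probabilities over the image of X must sum to 1.
total = sum(p_X.values())
```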

Definition: Support of a Discrete Random Variable

The support of a discrete random variable \(X: \Omega \to \mathbb{R}\) is the set of all values \(x \in \mathbb{R}\) which \(X\) takes on with nonzero probability.

\[ \{ x \in \mathbb{R} \mid P(X = x) \gt 0\} \]

Definition: Mode

A mode of a discrete random variable \(X\) is a real number \(m \in \mathbb{R}\) at which the probability mass function of \(X\) attains its maximum. A mode need not be unique.
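Given a probability mass function stored as a dictionary, the support and a mode follow directly from the definitions. This is a minimal sketch; the pmf values are the fair-coin example, assumed here for illustration:

```python
# pmf of the number of heads in two fair coin tosses (illustrative).
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

# Support: all values taken with nonzero probability.
support = {x for x, p in pmf.items() if p > 0}

# A mode: a value at which the pmf attains its maximum.
mode = max(pmf, key=pmf.get)
```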

Continuous Random Variables#

Definition: Continuous Random Variable

A random variable is continuous if and only if it is not discrete and its cumulative distribution function is differentiable everywhere, except possibly at finitely many points where it is continuous but not differentiable.

Probability Density Function#

Definition: Probability Density Function

The probability density function of a continuous random variable is the derivative of its cumulative distribution function.
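This relationship can be illustrated numerically: a central finite difference of the CDF approximates the density. The exponential distribution below, whose CDF is \(1 - e^{-x}\) with density \(e^{-x}\) for \(x > 0\), is an assumed example:

```python
import math

def F(x):
    # Exponential CDF with rate 1 (assumed example distribution).
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def f_numeric(x, h=1e-6):
    # Central difference approximation of the derivative F'(x).
    return (F(x + h) - F(x - h)) / (2 * h)

# For this distribution the exact density at x > 0 is e^{-x}.
approx = f_numeric(1.0)
exact = math.exp(-1.0)
```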

Properties#

Theorem: Non-negativity of Probability Density

The probability density function \(f\) of a continuous random variable is always non-negative.

\[ f(x) \ge 0 \]
Proof

TODO

Theorem: Probability in Interval

The probability that a continuous random variable \(X\) falls within the interval \([a;b]\) is given by the definite integral of its probability density function \(f\) over \([a;b]\):

\[ P(a \le X \le b) = \int_{[a;b]} f \]
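Numerically, a simple midpoint-rule integral of the density should match the corresponding CDF difference. The exponential density used below is an assumed example, and `integrate` is a hypothetical helper written for this sketch:

```python
import math

def f(x):
    # Density of the exponential distribution with rate 1 (assumed example).
    return math.exp(-x) if x >= 0 else 0.0

def integrate(func, a, b, n=10_000):
    # Midpoint rule: sum func at midpoints of n equal subintervals.
    h = (b - a) / n
    return h * sum(func(a + (i + 0.5) * h) for i in range(n))

p_integral = integrate(f, 1.0, 2.0)
p_cdf = (1 - math.exp(-2.0)) - (1 - math.exp(-1.0))  # F(2) - F(1)
```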
Proof

TODO

Theorem: Normalization of Probability Density

The integral of the probability density function \(f\) of a continuous random variable from \(-\infty\) to \(+\infty\) is \(1\):

\[ \int_{-\infty}^{+\infty} f = 1 \]
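The normalization can also be checked numerically for an assumed example density. The sketch truncates the upper limit of integration at \(40\), where the exponential tail \(e^{-40} \approx 4 \cdot 10^{-18}\) is negligible:

```python
import math

def f(x):
    # Exponential density with rate 1 (assumed example distribution).
    return math.exp(-x) if x >= 0 else 0.0

def integrate(func, a, b, n=200_000):
    # Midpoint rule over n equal subintervals.
    h = (b - a) / n
    return h * sum(func(a + (i + 0.5) * h) for i in range(n))

# Integrating over [0; 40] approximates the full integral over the line.
total = integrate(f, 0.0, 40.0)
```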
Proof

TODO

Bibliography#