## Probability
At its core, the probability of an [[Experiments|event]] is just a [[The Real Numbers|real number]] between \(0\) and \(1\), inclusive, which measures the likelihood of that event occurring.
Definition: Probability Space
A probability space \((\Omega, P)\) is a [[Experiments|sample space]] \(\Omega\) equipped with a [[../Analysis/Real Analysis/Real-Valued Functions|real-valued]] probability function \(P: \mathcal{P}(\Omega) \to [0, 1]\) defined on the [[../Set Theory/Sets|subsets]] of \(\Omega\) with the following properties:
- \(P(\varnothing) = 0\) and \(P(\Omega) = 1\);
- For every [[../Set Theory/Cardinality|countable]] [[../Set Theory/Collections|collection]] \(\mathcal{E} = \{E_1, E_2, \dotsc \}\) of [[Experiments|mutually exclusive]] [[Experiments|events]], we have
  \[P\left(\bigcup_{i} E_i\right) = \sum_{i} P(E_i).\]
Notation
Some authors denote the probability space by \((\Omega, \mathcal{P}(\Omega), P)\), making the domain of \(P\) explicit.
Definition: (Absolute) Probability
Given an [[Experiments|event]] \(E \in \mathcal{P}(\Omega)\), we call \(P(E)\) the (absolute) probability of \(E\).
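To make the definition concrete, here is a minimal Python sketch, assuming a fair six-sided die with the uniform measure \(P(E) = |E| / |\Omega|\); the sample space `omega` and the helper `prob` are illustrative choices, not part of the definition.

```python
from fractions import Fraction

# Assumed example: a fair six-sided die with the uniform measure.
omega = frozenset({1, 2, 3, 4, 5, 6})

def prob(event: frozenset) -> Fraction:
    """P(E) = |E| / |Omega| for the uniform measure on a finite sample space."""
    return Fraction(len(event), len(omega))

# The defining properties of a probability function:
assert prob(frozenset()) == 0   # P(empty set) = 0
assert prob(omega) == 1         # P(Omega) = 1

# Additivity over mutually exclusive events:
evens, odds = frozenset({2, 4, 6}), frozenset({1, 3, 5})
assert prob(evens | odds) == prob(evens) + prob(odds)
```

Using exact rational arithmetic via `Fraction` keeps these checks free of floating-point noise.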
### Properties
Theorem: Probability of Unions
If \(A\) and \(B\) are arbitrary [[Experiments|events]] in a [[Probability Spaces|probability space]], then
\[P(A \cup B) = P(A) + P(B) - P(A \cap B).\]
Proof
The events \(A\) and \(B \setminus A\) are mutually exclusive with union \(A \cup B\), so \(P(A \cup B) = P(A) + P(B \setminus A)\). Likewise, \(A \cap B\) and \(B \setminus A\) are mutually exclusive with union \(B\), so \(P(B \setminus A) = P(B) - P(A \cap B)\). Substituting the second equation into the first yields the claim.
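As a quick numeric sanity check, a sketch of the identity on the same assumed fair-die space (the events `A` and `B` are arbitrary illustrative picks):

```python
from fractions import Fraction

omega = frozenset({1, 2, 3, 4, 5, 6})

def prob(event):
    return Fraction(len(event), len(omega))

A = frozenset({1, 2, 3})   # "at most three"
B = frozenset({2, 4, 6})   # "even"

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
```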
## Conditional Probability
Definition: Conditional Probability
Let \(A\) and \(B\) be two [[Experiments|events]] in a [[Probability Spaces|probability space]].
The probability of \(A\) given \(B\) is defined as
\[P(A \mid B) = \frac{P(A \cap B)}{P(B)},\]
provided that \(P(B) > 0\).
Note: Prior and Posterior Probabilities
In the context of conditional probabilities, the number \(P(A)\) is often called the prior probability and \(P(A\mid B)\) the posterior probability of \(A\).
Conditional probability is a measure of the likelihood that \(A\) will occur if we know that \(B\) has occurred.
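A small sketch of the definition in action, again assuming the uniform fair-die space; the helper `cond_prob` and the chosen events are illustrative:

```python
from fractions import Fraction

omega = frozenset({1, 2, 3, 4, 5, 6})

def prob(event):
    return Fraction(len(event), len(omega))

def cond_prob(a, b):
    """P(A | B) = P(A ∩ B) / P(B), assuming P(B) > 0."""
    return prob(a & b) / prob(b)

A = frozenset({2, 4, 6})   # "even"
B = frozenset({4, 5, 6})   # "greater than three"

# Knowing B occurred renormalises the measure within B:
# two of the three outcomes in B are even.
assert cond_prob(A, B) == Fraction(2, 3)
```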
Definition: Independent Events
Let \(A\) and \(B\) be two [[Experiments|events]] in a [[Probability Spaces|probability space]].
We say that \(A\) is independent of \(B\) if the [[Probability Spaces|conditional probability]] of \(A\) given \(B\) is the same as the [[Probability Spaces|absolute probability]] of \(A\), i.e. if \(P(A \mid B) = P(A)\).
Theorem: Mutual Independence
If \(A\) is [[Probability Spaces|independent]] of \(B\), then \(B\) is also [[Probability Spaces|independent]] of \(A\).
Proof
This follows directly from [[Probability Spaces#Properties|Bayes' rule]]: if \(P(A \mid B) = P(A)\), then
\[P(B \mid A) = \frac{P(A \mid B)\,P(B)}{P(A)} = \frac{P(A)\,P(B)}{P(A)} = P(B).\]
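For intuition, a sketch assuming two fair coin flips recorded as ordered pairs (the encoding is an illustrative choice); independence then holds in both directions, as the theorem asserts:

```python
from fractions import Fraction
from itertools import product

# Assumed example: two fair coin flips as ordered pairs like ('H', 'T').
omega = frozenset(product("HT", repeat=2))

def prob(event):
    return Fraction(len(event), len(omega))

def cond_prob(a, b):
    return prob(a & b) / prob(b)

A = frozenset(w for w in omega if w[0] == "H")   # first flip is heads
B = frozenset(w for w in omega if w[1] == "H")   # second flip is heads

# A is independent of B, and B of A:
assert cond_prob(A, B) == prob(A)
assert cond_prob(B, A) == prob(B)
```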
### Properties
Theorem: Bayes' Rule
If \(A\) and \(B\) are two [[Experiments|events]] in a [[Probability Spaces|probability space]], then their [[Probability Spaces#Conditional Probability|conditional probabilities]] are related as follows:
\[P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.\]
Note
Bayes' rule essentially allows us to switch the events around.
Proof
By definition,
\[P(A \mid B) = \frac{P(A \cap B)}{P(B)},\]
and so
\[P(A \cap B) = P(A \mid B)\,P(B).\]
Similarly,
\[P(B \mid A) = \frac{P(A \cap B)}{P(A)},\]
and so
\[P(A \cap B) = P(B \mid A)\,P(A).\]
By equating the two expressions for \(P(A \cap B)\) and dividing by \(P(B)\), we obtain the result from the theorem.
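A numeric sanity check of the rule on the assumed fair-die space (events chosen purely for illustration):

```python
from fractions import Fraction

omega = frozenset({1, 2, 3, 4, 5, 6})

def prob(event):
    return Fraction(len(event), len(omega))

def cond_prob(a, b):
    return prob(a & b) / prob(b)

A = frozenset({2, 4, 6})   # "even"
B = frozenset({1, 2, 3})   # "at most three"

# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)
assert cond_prob(A, B) == cond_prob(B, A) * prob(A) / prob(B)
```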
Theorem: Law of Total Probability
Let \((\Omega, P)\) be a [[Probability Spaces|probability space]], let \(\{B_1, \dotsc, B_n\}\) be a [[../Set Theory/Collections|collection]] of [[Experiments|events]] and let \(A\) be some other [[Experiments|event]].
If \(\{B_1, \dotsc, B_n\}\) are [[Experiments|mutually exclusive]] and their [[../Set Theory/Collections|union]] is \(\Omega\), then the [[Probability Spaces|probability]] of \(A\) is the sum of its [[Probability Spaces|conditional probabilities]] given each \(B_i\), weighted by the [[Probability Spaces|probability]] of \(B_i\):
\[P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i).\]
Proof
Since \(B_1, \dotsc, B_n\) are mutually exclusive with union \(\Omega\), the events \(A \cap B_1, \dotsc, A \cap B_n\) are mutually exclusive with union \(A\). Additivity then gives \(P(A) = \sum_{i=1}^{n} P(A \cap B_i)\), and by the definition of conditional probability each summand equals \(P(A \mid B_i)\,P(B_i)\).
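Finally, a sketch checking the law on the assumed fair-die space, with the illustrative partition \(B_1 = \{1, 2\}\), \(B_2 = \{3, 4\}\), \(B_3 = \{5, 6\}\):

```python
from fractions import Fraction

omega = frozenset({1, 2, 3, 4, 5, 6})

def prob(event):
    return Fraction(len(event), len(omega))

def cond_prob(a, b):
    return prob(a & b) / prob(b)

# Assumed partition: mutually exclusive events whose union is Omega.
partition = [frozenset({1, 2}), frozenset({3, 4}), frozenset({5, 6})]
A = frozenset({2, 4, 6})   # "even"

# P(A) = sum over i of P(A | B_i) * P(B_i)
assert prob(A) == sum(cond_prob(A, b) * prob(b) for b in partition)
```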