Matrix Invertibility
Definition: Identity Matrix
The \(n\times n\)-identity matrix \(I_n \in F^{n \times n}\) over a field \(F\) is the square matrix whose diagonal entries are the multiplicative identity of \(F\) and whose other entries are the additive identity of \(F\):
\[I_n = \begin{bmatrix} 1_F & 0_F & \cdots & 0_F \\ 0_F & 1_F & \cdots & 0_F \\ \vdots & \vdots & \ddots & \vdots \\ 0_F & 0_F & \cdots & 1_F \end{bmatrix}\]
Theorem: Multiplication with the Identity Matrix
The product of any \(m\times n\)-matrix \(A\) with the identity matrix \(I_m\) on the left or the identity matrix \(I_n\) on the right is \(A\) itself:
\[I_m A = A I_n = A\]
Proof
TODO
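As a quick numerical sanity check of the identity law \(I_m A = A I_n = A\), here is a minimal sketch using NumPy (an assumption on my part; the text itself is language-agnostic) with a hypothetical \(2\times 3\) real matrix:

```python
import numpy as np

# A hypothetical 2x3 matrix over the reals (m = 2, n = 3).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

I_m = np.eye(2)  # left identity I_2
I_n = np.eye(3)  # right identity I_3

# Multiplying by the identity on either side leaves A unchanged.
assert np.array_equal(I_m @ A, A)
assert np.array_equal(A @ I_n, A)
```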
Definition: Invertible Matrix
A square matrix \(A \in F^{n \times n}\) is invertible if there exists a square matrix \(A^{-1} \in F^{n \times n}\) such that the matrix products \(AA^{-1}\) and \(A^{-1}A\) are equal to the identity matrix \(I_n\).
The matrices \(A\) and \(A^{-1}\) are called inverses of each other.
Theorem: The Invertible Matrix Theorem
The following statements are equivalent for every square matrix \(A \in F^{n \times n}\):
- \(A\) is invertible.
- The transpose of \(A\) is invertible.
- The determinant of \(A\) is not zero, i.e. \(\det(A) \ne 0_F\).
- The reduced row echelon form of \(A\) is the identity matrix \(I_n\).
- The system of linear equations \(A \vec{x} = \vec{b}\) has a single solution for each \(\vec{b} \in F^n\).
- The column space of \(A\) is the vector space \(F^n\).
- The row space of \(A\) is the vector space \(F^{1 \times n}\).
- The rank of \(A\) is \(n\).
Proof
TODO
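Several of the equivalent conditions can be spot-checked numerically for a concrete matrix. The sketch below (NumPy assumed, not part of the text) uses the \(3\times 3\) matrix from the worked example later in this section:

```python
import numpy as np

A = np.array([[6.0, 8.0, 3.0],
              [4.0, 7.0, 3.0],
              [1.0, 2.0, 1.0]])
n = A.shape[0]

det = np.linalg.det(A)            # nonzero determinant
rank = np.linalg.matrix_rank(A)   # full rank

# A x = b has a (unique) solution for an arbitrary right-hand side b.
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(A, b)

assert abs(det) > 1e-9            # det(A) != 0
assert rank == n                  # rank(A) = n
assert np.allclose(A @ x, b)      # the solution really solves the system
assert abs(np.linalg.det(A.T)) > 1e-9  # the transpose is invertible too
```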
Theorem: Antidistributivity of Matrix Inversion
Matrix inversion is antidistributive over matrix products: if \(A, B \in F^{n \times n}\) and their matrix product \(AB\) are invertible, then
\[(AB)^{-1} = B^{-1} A^{-1}\]
Proof
TODO
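The identity \((AB)^{-1} = B^{-1}A^{-1}\) can be verified numerically for sample matrices; the two \(2\times 2\) matrices below are hypothetical examples of my own choosing, and NumPy is an assumed tool:

```python
import numpy as np

# Two arbitrary invertible 2x2 matrices (hypothetical examples).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])

lhs = np.linalg.inv(A @ B)                  # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # B^{-1} A^{-1}

assert np.allclose(lhs, rhs)                # the order reverses under inversion
```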
Algorithm: Matrix Inversion
To find the inverse of an invertible matrix \(A \in F^{n \times n}\):
- Form the \(n\times 2n\) augmented matrix \((A\mid I_n)\) by appending the identity matrix \(I_n\) to the right of \(A\).
- Perform Gauss-Jordan elimination on \((A \mid I_n)\). If \(A\) is indeed invertible, the final result will be \((I_n \mid A^{-1})\).
Example
Let's find the inverse of \(A = \begin{bmatrix}6 & 8 & 3 \\ 4 & 7 & 3 \\ 1 & 2 & 1\end{bmatrix}\). Form the augmented matrix
\[(A \mid I_3) = \left(\begin{array}{ccc|ccc} 6 & 8 & 3 & 1 & 0 & 0 \\ 4 & 7 & 3 & 0 & 1 & 0 \\ 1 & 2 & 1 & 0 & 0 & 1 \end{array}\right)\]
Performing Gauss-Jordan elimination yields
\[(I_3 \mid A^{-1}) = \left(\begin{array}{ccc|ccc} 1 & 0 & 0 & 1 & -2 & 3 \\ 0 & 1 & 0 & -1 & 3 & -6 \\ 0 & 0 & 1 & 1 & -4 & 10 \end{array}\right)\]
so \(A^{-1} = \begin{bmatrix} 1 & -2 & 3 \\ -1 & 3 & -6 \\ 1 & -4 & 10 \end{bmatrix}\).
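The algorithm above can be sketched as a small Python function; `invert_gauss_jordan` is a name of my own, and the partial-pivoting detail is a standard numerical safeguard not stated in the text:

```python
import numpy as np

def invert_gauss_jordan(A):
    """Invert A via Gauss-Jordan elimination on the augmented matrix (A | I_n)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # build (A | I_n)
    for col in range(n):
        # Partial pivoting: move the largest entry in this column to the diagonal.
        pivot = col + int(np.argmax(np.abs(M[col:, col])))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]         # swap rows
        M[col] /= M[col, col]                     # scale the pivot row to 1
        for row in range(n):                      # clear the rest of the column
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                               # right half is now A^{-1}

A = np.array([[6, 8, 3],
              [4, 7, 3],
              [1, 2, 1]])
A_inv = invert_gauss_jordan(A)
```

Running this on the example matrix reproduces the inverse found by hand above.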
Theorem: Inverting \(2\times2\)-Matrices
A \(2\times 2\)-matrix \(A = \begin{bmatrix}a & b \\ c & d\end{bmatrix}\) is invertible if and only if
\[ad - bc \ne 0_F\]
If \(A\) is invertible, then
\[A^{-1} = \frac{1}{ad - bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}\]
Proof
TODO
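A quick numerical check of the \(2\times 2\) inverse formula \(A^{-1} = \frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}\); the entries below are arbitrary example values, and NumPy is an assumed tool:

```python
import numpy as np

# Arbitrary example entries with ad - bc = 3*1 - 1*2 = 1 (nonzero, so invertible).
a, b, c, d = 3.0, 1.0, 2.0, 1.0
A = np.array([[a, b],
              [c, d]])

det = a * d - b * c
A_inv = (1.0 / det) * np.array([[d, -b],
                                [-c, a]])

assert np.allclose(A @ A_inv, np.eye(2))          # the formula yields an inverse
assert np.allclose(A_inv, np.linalg.inv(A))       # agrees with a library inverse
```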