Determinants
Definition: Determinant
The determinant of a square matrix \(M \in F^{n \times n}\) is the determinant of the linear transformation \(f: F^n \to F^n\) defined as \(f(\mathbf{v}) = M\mathbf{v}\).
Notation
The determinant of \(M\) is written \(\det M\) or \(|M|\).
Example
Let \(M = \begin{bmatrix} 1 & 2 \\ 3 & 4\end{bmatrix}\) be a real matrix. We use the determinant form \(\delta: F^2 \times F^2 \to F\) defined on the standard basis as \(\delta (\mathbf{e}_1, \mathbf{e}_2) = 1\).
By the definition of a determinant, we have:
\[\det M \cdot \delta(\mathbf{e}_1, \mathbf{e}_2) = \delta(M\mathbf{e}_1, M\mathbf{e}_2)\]
Since \(\delta (\mathbf{e}_1, \mathbf{e}_2) = 1\) by definition, we have:
\[\det M = \delta(M\mathbf{e}_1, M\mathbf{e}_2) = \delta(\mathbf{e}_1 + 3\mathbf{e}_2,\; 2\mathbf{e}_1 + 4\mathbf{e}_2)\]
We now use the fact that \(\delta\) is multilinear:
\[\det M = 2\,\delta(\mathbf{e}_1, \mathbf{e}_1) + 4\,\delta(\mathbf{e}_1, \mathbf{e}_2) + 6\,\delta(\mathbf{e}_2, \mathbf{e}_1) + 12\,\delta(\mathbf{e}_2, \mathbf{e}_2)\]
Now we use the fact that \(\delta\) is alternating:
\[\det M = 4\,\delta(\mathbf{e}_1, \mathbf{e}_2) - 6\,\delta(\mathbf{e}_1, \mathbf{e}_2) = 4 - 6 = -2\]
Theorem: Determinant of Matrix Product
The determinant of the product of two square matrices \(A, B \in F^{n \times n}\) is the product of the determinants of \(A\) and \(B\):
\[\det(AB) = \det A \cdot \det B\]
Proof
TODO
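As a quick numerical sanity check (not a substitute for the proof), the identity can be verified for \(2 \times 2\) matrices using the formula \(\det A = ad - bc\) derived later in this chapter; the helper names `det2` and `matmul2` are ours:

```python
def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc.
    (a, b), (c, d) = m
    return a * d - b * c

def matmul2(x, y):
    # Product of two 2x2 matrices.
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(det2(matmul2(A, B)) == det2(A) * det2(B))  # True
```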
Theorem: Determinant of the Transpose
The determinant of a square matrix is equal to the determinant of its transpose:
\[\det A = \det A^{\mathsf{T}}\]
Proof
TODO
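Again as a small numerical illustration for the \(2 \times 2\) case (the helper names are ours):

```python
def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc.
    (a, b), (c, d) = m
    return a * d - b * c

def transpose(m):
    # Transpose: rows become columns.
    return [list(row) for row in zip(*m)]

A = [[1, 2], [3, 4]]
print(det2(A) == det2(transpose(A)))  # True
```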
Theorem: Determinant of the Inverse
If \(A \in F^{n \times n}\) is invertible, then the determinant of \(A^{-1}\) is the reciprocal of the determinant of \(A\):
\[\det(A^{-1}) = (\det A)^{-1} = \frac{1}{\det A}\]
Proof
TODO
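An exact numerical check for the \(2 \times 2\) case, using `fractions.Fraction` to avoid floating-point error and the adjugate formula for the inverse (the helper names are ours):

```python
from fractions import Fraction

def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc.
    (a, b), (c, d) = m
    return a * d - b * c

def inverse2(m):
    # Inverse of an invertible 2x2 matrix via the adjugate formula.
    (a, b), (c, d) = m
    det = Fraction(det2(m))
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

A = [[1, 2], [3, 4]]
print(det2(inverse2(A)) == Fraction(1, det2(A)))  # True
```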
Theorem: Determinant of Scalar Multiplication
The determinant has the following property for all square matrices \(A \in F^{n \times n}\) and all \(\alpha \in F\):
\[\det(\alpha A) = \alpha^n \det A\]
Proof
TODO
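A quick check of the exponent \(n\) for the \(2 \times 2\) case, where scaling the matrix by \(\alpha\) scales the determinant by \(\alpha^2\) (the helper names are ours):

```python
def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc.
    (a, b), (c, d) = m
    return a * d - b * c

def scale(alpha, m):
    # Multiply every entry of the matrix by alpha.
    return [[alpha * x for x in row] for row in m]

A = [[1, 2], [3, 4]]
alpha = 5
print(det2(scale(alpha, A)) == alpha ** 2 * det2(A))  # True
```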
Theorem: Determinant via Columns
Let \(A \in F^{n \times n}\) be a square matrix and let \(\delta: (F^n)^n \to F\) be the determinant form defined on the standard basis as \(\delta(\mathbf{e}_1, \dotsc, \mathbf{e}_n) = 1\).
If \(\mathbf{a}_1, \dotsc, \mathbf{a}_n\) are the columns of \(A\), then the determinant of \(A\) is equal to \(\delta(\mathbf{a}_1, \dotsc, \mathbf{a}_n)\):
\[\det A = \delta(\mathbf{a}_1, \dotsc, \mathbf{a}_n)\]
Tip: Row and Column Operations
From this and the fact that \(\det A = \det A^{\mathsf{T}}\) we can immediately derive the effects of row and column operations on the determinant of \(A\):
- Swapping any two columns or any two rows switches the sign of the determinant.
- Multiplying a row or column by some \(\lambda \in F\) multiplies the determinant of \(A\) by \(\lambda\).
- Adding a multiple of one row / column to another row / column leaves the determinant unchanged.
Proof
TODO
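All three row operations can be illustrated numerically on a \(2 \times 2\) matrix (a sanity check, not a proof; the helper name `det2` is ours):

```python
def det2(m):
    # Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc.
    (a, b), (c, d) = m
    return a * d - b * c

A = [[1, 2], [3, 4]]
d = det2(A)

# Swapping the two rows flips the sign of the determinant.
swapped = [A[1], A[0]]
print(det2(swapped) == -d)  # True

# Multiplying one row by lam multiplies the determinant by lam.
lam = 7
scaled = [[lam * x for x in A[0]], A[1]]
print(det2(scaled) == lam * d)  # True

# Adding a multiple of one row to another leaves the determinant unchanged.
added = [A[0], [A[1][j] + 3 * A[0][j] for j in range(2)]]
print(det2(added) == d)  # True
```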
Theorem: Eigenvalues and Determinant
Let \(A \in F^{n \times n}\) be a square matrix.
If \(A\) has \(l\) distinct eigenvalues \(\lambda_1, \dotsc, \lambda_l\) with algebraic multiplicities \(m_1, \dotsc, m_l\) whose sum is equal to \(n\), then the determinant of \(A\) is given as follows:
\[\det A = \prod_{i=1}^{l} \lambda_i^{m_i}\]
Proof
TODO
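The theorem is easy to illustrate on an upper-triangular matrix, whose eigenvalues are exactly its diagonal entries (an example, not a proof; the helper name `det3` is ours and uses cofactor expansion along the first row):

```python
import math

# For an upper-triangular matrix, the eigenvalues are the diagonal
# entries, and the determinant should equal their product.
A = [[2, 5, 1],
     [0, 3, 4],
     [0, 0, 7]]

eigenvalues = [A[i][i] for i in range(3)]

def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(det3(A) == math.prod(eigenvalues))  # True
```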
Theorem: Determinant of a \(2\times 2\)-Matrix
The determinant of every \(2\times 2\)-matrix \(A = \begin{bmatrix}a & b \\ c & d\end{bmatrix}\) is given by
\[\det A = ad - bc\]
Proof
We use the determinant form \(\delta: F^2 \times F^2 \to F\) defined on the standard basis as \(\delta (\mathbf{e}_1, \mathbf{e}_2) = 1\).
By the definition of a determinant, we have:
\[\det A \cdot \delta(\mathbf{e}_1, \mathbf{e}_2) = \delta(A\mathbf{e}_1, A\mathbf{e}_2)\]
Since \(\delta (\mathbf{e}_1, \mathbf{e}_2) = 1\) by definition, we have:
\[\det A = \delta(A\mathbf{e}_1, A\mathbf{e}_2) = \delta(a\mathbf{e}_1 + c\mathbf{e}_2,\; b\mathbf{e}_1 + d\mathbf{e}_2)\]
We now use the fact that \(\delta\) is multilinear:
\[\det A = ab\,\delta(\mathbf{e}_1, \mathbf{e}_1) + ad\,\delta(\mathbf{e}_1, \mathbf{e}_2) + cb\,\delta(\mathbf{e}_2, \mathbf{e}_1) + cd\,\delta(\mathbf{e}_2, \mathbf{e}_2)\]
Now we use the fact that \(\delta\) is alternating (specifically that \(\delta(\mathbf{e}_i, \mathbf{e}_i)=0\) and \(\delta(\mathbf{e}_2, \mathbf{e}_1) = -\delta(\mathbf{e}_1, \mathbf{e}_2)\)):
\[\det A = ad\,\delta(\mathbf{e}_1, \mathbf{e}_2) - cb\,\delta(\mathbf{e}_1, \mathbf{e}_2) = ad - bc\]
Theorem: Determinant of a \(3\times 3\)-Matrix
The determinant of every \(3 \times 3\)-matrix \(A = \begin{bmatrix}a & b & c \\ d & e & f \\ g & h & i\end{bmatrix}\) is given by
\[\det A = aei + bfg + cdh - afh - bdi - ceg\]
Proof
TODO
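The six-term formula (the rule of Sarrus) can be cross-checked against cofactor expansion along the first row; the two expressions are algebraically identical, which the following sketch illustrates on one example (the helper names are ours):

```python
def det3_sarrus(m):
    # Rule of Sarrus: aei + bfg + cdh - afh - bdi - ceg.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i + b*f*g + c*d*h - a*f*h - b*d*i - c*e*g

def det3_cofactor(m):
    # Cofactor expansion along the first row, for comparison.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

A = [[2, -1, 3], [0, 4, 5], [1, 6, -2]]
print(det3_sarrus(A) == det3_cofactor(A))  # True
```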
Theorem: Leibniz Formula for Determinants
Let \(A = (a_{i,j}) \in F^{n \times n}\) be a square matrix and let \(S_n\) be the symmetric group on \(\{1, \dotsc, n\}\).
The determinant of \(A\) is given by
\[\det A = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} a_{i, \sigma(i)}\]
where \(\operatorname{sgn}(\sigma)\) is the sign of \(\sigma\).
Proof
TODO
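The formula translates directly into code: sum over all permutations of \(\{1, \dotsc, n\}\) (here 0-indexed), weighting each product by the sign of the permutation. This is a sketch for illustration; at \(n!\) terms it is far too slow for large matrices (the helper names are ours):

```python
from itertools import permutations

def sign(sigma):
    # Sign of a permutation: (-1) raised to the number of inversions.
    inversions = sum(1 for i in range(len(sigma))
                       for j in range(i + 1, len(sigma))
                       if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

def det_leibniz(a):
    # Leibniz formula: sum over sigma of sgn(sigma) * prod_i a[i][sigma(i)].
    n = len(a)
    total = 0
    for sigma in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= a[i][sigma[i]]
        total += sign(sigma) * prod
    return total

print(det_leibniz([[1, 2], [3, 4]]))  # -2
```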
Theorem: Laplace Expansion (Cofactor Expansion)
Let \(A = (a_{ij}) \in F^{n \times n}\) be a square matrix.
Notation
We use \(A_{ij}\) to denote the matrix \(A_{ij} \in F^{(n-1)\times (n-1)}\) obtained from \(A\) by deleting its \(i\)-th row and its \(j\)-th column.
The determinant of \(A\) is given by the following, for any fixed row index \(i\) (and, by \(\det A = \det A^{\mathsf{T}}\), analogously along any fixed column):
\[\det A = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} \det A_{ij}\]
Proof
TODO
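The expansion lends itself to a short recursive sketch: expand along the first row, recursing on the \((n-1) \times (n-1)\) minors until the base case \(n = 1\). With 0-indexed rows and columns, the sign factor \((-1)^{i+j}\) for \(i = 0\) becomes \((-1)^{j}\) (the helper names are ours):

```python
def minor(a, i, j):
    # The matrix A_ij: remove row i and column j.
    return [[a[r][c] for c in range(len(a)) if c != j]
            for r in range(len(a)) if r != i]

def det_laplace(a):
    # Cofactor expansion along the first row (i = 0, 0-indexed).
    n = len(a)
    if n == 1:
        return a[0][0]
    return sum((-1) ** j * a[0][j] * det_laplace(minor(a, 0, j))
               for j in range(n))

A = [[2, -1, 3], [0, 4, 5], [1, 6, -2]]
print(det_laplace(A))  # -93
```

Like the Leibniz formula, this recursion takes on the order of \(n!\) operations, so it is useful for hand computation and small matrices rather than numerical work.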