Orthogonal Matrices
Definition: Orthogonal Matrix
An orthogonal matrix is a real matrix \(A \in \mathbb{R}^{n\times n}\) which is the matrix representation of some orthogonal transformation \(T: \mathbb{R}^n \to \mathbb{R}^n\) with respect to the standard basis of \(\mathbb{R}^n\):
\[ T(v) = Av \quad \text{for all } v \in \mathbb{R}^n. \]
Equivalently, \(A^\mathsf{T} A = A A^\mathsf{T} = I_n\).
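As a quick numerical sanity check (a sketch assuming NumPy is available; the rotation matrix is an illustrative choice, not from the text), a plane rotation satisfies the defining condition \(A^\mathsf{T} A = I\):

```python
import numpy as np

# A rotation of the plane by theta is a standard example of an orthogonal matrix.
theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Orthogonality is equivalent to A^T A = I.
print(np.allclose(A.T @ A, np.eye(2)))  # True
```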
Theorem: Length Preservation \(\iff\) Orthogonal Matrices
A real square matrix \(A \in \mathbb{R}^{n \times n}\) is orthogonal if and only if it preserves the length of every column vector \(v \in \mathbb{R}^n\):
\[ \|Av\| = \|v\| \quad \text{for all } v \in \mathbb{R}^n. \]
Proof
TODO
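The length-preservation property can be checked numerically (a sketch assuming NumPy; the specific angle and random vector are illustrative choices):

```python
import numpy as np

# An illustrative orthogonal matrix: a rotation by 0.7 radians.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A random test vector; ||Av|| should equal ||v|| up to floating-point error.
rng = np.random.default_rng(0)
v = rng.standard_normal(2)

print(np.isclose(np.linalg.norm(A @ v), np.linalg.norm(v)))  # True
```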
Theorem: Inverses of Orthogonal Matrices
If \(A \in \mathbb{R}^{n \times n}\) is an orthogonal matrix, then it is invertible and its inverse is its transpose:
\[ A^{-1} = A^\mathsf{T}. \]
Proof
TODO
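This identity is easy to verify numerically (a sketch assuming NumPy; the rotation matrix is an illustrative choice). It is also why orthogonal matrices are cheap to invert: transposition replaces a full matrix inversion.

```python
import numpy as np

# An illustrative orthogonal matrix: a rotation by 1.2 radians.
theta = 1.2
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# For an orthogonal matrix, the computed inverse equals the transpose.
print(np.allclose(np.linalg.inv(A), A.T))  # True
```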
Theorem: Determinants of Orthogonal Matrices
The determinant of an orthogonal matrix is either \(+1\) or \(-1\).
Proof
TODO
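Both signs of the determinant occur: rotations have determinant \(+1\), reflections have determinant \(-1\). A numerical sketch (assuming NumPy; the two matrices are illustrative choices):

```python
import numpy as np

# A rotation (orientation-preserving): det = +1.
theta = 0.4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A reflection across the x-axis (orientation-reversing): det = -1.
F = np.array([[1.0,  0.0],
              [0.0, -1.0]])

print(round(np.linalg.det(R)), round(np.linalg.det(F)))  # 1 -1
```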
Theorem: Orthonormal Bases from an Orthogonal Matrix
If \(A\in \mathbb{R}^{n \times n}\) is an orthogonal matrix, then:
- the columns of \(A\) form an orthonormal basis of the real vector space \(\mathbb{R}^n\);
- the columns of its transpose \(A^\mathsf{T}\) (that is, the rows of \(A\)) form an orthonormal basis of the real vector space \(\mathbb{R}^n\).
Proof
TODO
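Both claims can be checked at once, since columns orthonormal means \(A^\mathsf{T} A = I\) and rows orthonormal means \(A A^\mathsf{T} = I\). A sketch assuming NumPy, using a QR factorization of a random matrix to produce an orthogonal \(Q\) (full rank holds almost surely for this random draw):

```python
import numpy as np

# QR factorization of a random 4x4 matrix yields an orthogonal factor Q.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Columns of Q orthonormal: Q^T Q = I.  Columns of Q^T orthonormal: Q Q^T = I.
print(np.allclose(Q.T @ Q, np.eye(4)), np.allclose(Q @ Q.T, np.eye(4)))  # True True
```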