Orthogonal Matrices

Definition: Orthogonal Matrix

An orthogonal matrix is a real matrix \(A \in \mathbb{R}^{n\times n}\) which is the matrix representation of some orthogonal transformation \(T: \mathbb{R}^n \to \mathbb{R}^n\) with respect to the standard basis of \(\mathbb{R}^n\):

\[A = {}_{E_n}[T]_{E_n}\]
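For concreteness, rotation of the plane by an angle \(\theta\) is an orthogonal transformation, so its matrix with respect to the standard basis is an orthogonal matrix. A minimal numerical sketch (NumPy is an assumption here, not part of these notes):

```python
import numpy as np

theta = np.pi / 3
# Standard-basis matrix of the rotation of R^2 by theta.
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A is orthogonal: its transpose times itself is the identity.
assert np.allclose(A.T @ A, np.eye(2))
```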

Theorem: Length Preservation \(\iff\) Orthogonal Matrices

A real square matrix \(A \in \mathbb{R}^{n \times n}\) is orthogonal if and only if multiplication by \(A\) preserves the length of every column vector \(v \in \mathbb{R}^n\):

\[||Av|| = ||v||\]
Proof

TODO
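The proof is still marked TODO, but the claim can be sanity-checked numerically (a check, not a proof). The sketch below, assuming NumPy, obtains an orthogonal matrix \(Q\) from the QR decomposition of a random matrix and compares \(||Qv||\) with \(||v||\):

```python
import numpy as np

rng = np.random.default_rng(0)
# The Q factor of a QR decomposition of a square matrix is orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

v = rng.standard_normal(4)
# Length is preserved: ||Qv|| == ||v|| (up to floating-point error).
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
```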

Theorem: Inverses of Orthogonal Matrices

If \(A \in \mathbb{R}^{n \times n}\) is an orthogonal matrix, then it is invertible and its inverse is its transpose:

\[A^{-1} = A^{\mathsf{T}}\]
Proof

TODO
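Pending the proof, the identity \(A^{-1} = A^{\mathsf{T}}\) can be verified numerically for a sample orthogonal matrix (again assuming NumPy; the QR trick below is just one way to produce an orthogonal matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# The transpose acts as a two-sided inverse: Q^T Q = I = Q Q^T.
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(Q @ Q.T, np.eye(3))
# Consequently the explicitly computed inverse equals the transpose.
assert np.allclose(np.linalg.inv(Q), Q.T)
```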

Theorem: Determinants of Orthogonal Matrices

The determinant of an orthogonal matrix is either \(+1\) or \(-1\).

Proof

TODO
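A quick numerical illustration of the determinant claim (assuming NumPy): for several random orthogonal matrices, the determinant always has absolute value \(1\).

```python
import numpy as np

rng = np.random.default_rng(2)
for _ in range(5):
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    d = np.linalg.det(Q)
    # det(Q) is +1 or -1, so its absolute value is 1.
    assert np.isclose(abs(d), 1.0)
```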

Theorem: Orthonormal Bases from an Orthogonal Matrix

If \(A \in \mathbb{R}^{n \times n}\) is an orthogonal matrix, then its columns form an orthonormal basis of \(\mathbb{R}^n\), and so do its rows.

Proof

TODO
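The columns of an orthogonal matrix are pairwise orthogonal unit vectors, which is exactly the statement that the Gram matrix \(A^{\mathsf{T}}A\) of the columns is the identity. A hedged NumPy sketch of this check:

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Gram matrix of the columns: entry (i, j) is the dot product of
# column i with column j. Orthonormal columns give the identity.
G = Q.T @ Q
assert np.allclose(G, np.eye(3))
# Q Q^T = I plays the same role for the rows.
assert np.allclose(Q @ Q.T, np.eye(3))
```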