Matrix Exponentiation

Theorem: Convergence of the Matrix Exponential

Let \(F\) be the field of real numbers \(\mathbb{R}\) or the field of complex numbers \(\mathbb{C}\).

The matrix power series

\[\sum_{k = 0}^{\infty} \frac{1}{k!}\boldsymbol{A}^k = I_n + \boldsymbol{A} + \frac{1}{2}\boldsymbol{A}^2 + \frac{1}{3!}\boldsymbol{A}^3 + \cdots\]

is convergent for all square matrices \(\boldsymbol{A} \in F^{n \times n}\).

Definition: Matrix Exponential Function

The matrix exponential function is the matrix function defined by the power series above:

\[\exp: F^{n \times n} \to F^{n \times n} \qquad \exp(\boldsymbol{A}) = \sum_{k = 0}^{\infty} \frac{1}{k!}\boldsymbol{A}^k\]

Notation

\[\mathrm{e}^{\boldsymbol{A}} \qquad \exp (\boldsymbol{A})\]
Proof

TODO
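The convergence of the series can be checked numerically: a truncated partial sum approaches the reference value computed by SciPy's `scipy.linalg.expm`. The helper `expm_series` and the cutoff of 30 terms below are illustrative choices, not part of the definition.

```python
import numpy as np
from scipy.linalg import expm

def expm_series(A, terms=30):
    """Truncated power series sum_{k=0}^{terms-1} A^k / k!."""
    n = A.shape[0]
    result = np.zeros((n, n))
    term = np.eye(n)  # k = 0 term: A^0 / 0! = I_n
    for k in range(terms):
        result += term
        term = term @ A / (k + 1)  # next term: A^{k+1} / (k+1)!
    return result

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
# The partial sums converge to the matrix exponential
assert np.allclose(expm_series(A), expm(A))
```

Accumulating each term from the previous one (rather than recomputing \(\boldsymbol{A}^k\) and \(k!\) from scratch) keeps the loop cheap and avoids overflow in the factorial.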

Theorem: Matrix Exponential of Diagonal Matrices

The matrix exponential of a real diagonal matrix \(\boldsymbol{A} = \operatorname{diag}(a_1, \dotsc, a_n) \in \mathbb{R}^{n \times n}\) is the real diagonal matrix whose entries are the real exponentials of the diagonal entries of \(\boldsymbol{A}\):

\[\mathrm{e}^{\operatorname{diag}(a_1, \dotsc, a_n)} = \operatorname{diag}(\mathrm{e}^{a_1}, \dotsc, \mathrm{e}^{a_n})\]
\[\boldsymbol{A} = \begin{bmatrix}a_1 & 0 & \cdots & 0 \\ 0 & a_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & a_n\end{bmatrix} \implies \mathrm{e}^{\boldsymbol{A}} = \begin{bmatrix}\mathrm{e}^{a_1} & 0 & \cdots & 0 \\ 0 & \mathrm{e}^{a_2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & \mathrm{e}^{a_n}\end{bmatrix}\]
Proof

TODO
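A quick numerical sanity check of this theorem, using `scipy.linalg.expm` as the reference implementation (the diagonal entries \(1, 2, 3\) are an arbitrary example):

```python
import numpy as np
from scipy.linalg import expm

D = np.diag([1.0, 2.0, 3.0])
# e^{diag(a_1, ..., a_n)} = diag(e^{a_1}, ..., e^{a_n})
expected = np.diag(np.exp([1.0, 2.0, 3.0]))
assert np.allclose(expm(D), expected)
```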

Theorem: Matrix Exponential of Nilpotent Matrices

If a real square matrix \(\boldsymbol{A} \in \mathbb{R}^{n \times n}\) is nilpotent with \(\boldsymbol{A}^r = \boldsymbol{0}\), then the power series terminates and the matrix exponential reduces to the finite sum:

\[\mathrm{e}^{\boldsymbol{A}} = \sum_{k = 0}^{r-1} \frac{1}{k!} \boldsymbol{A}^k\]
Example

Consider the following \(\boldsymbol{A} \in \mathbb{R}^{2 \times 2}\):

\[\boldsymbol{A} = \begin{bmatrix} 0 & 1 \\ 0 & 0\end{bmatrix}\]

It is nilpotent, since \(\boldsymbol{A}^k = \boldsymbol{0}\) for all \(k \ge 2\). Its matrix exponential is thus the following:

\[\begin{aligned}\mathrm{e}^{\boldsymbol{A}} & = \sum_{k = 0}^1 \frac{1}{k!}\boldsymbol{A}^k \\ & = \frac{1}{0!}\boldsymbol{A}^0 + \frac{1}{1!} \boldsymbol{A}^1 \\ & = \boldsymbol{I}_2 + \boldsymbol{A} \\ & = \begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix} + \begin{bmatrix}0 & 1 \\ 0 & 0\end{bmatrix} \\ & = \begin{bmatrix}1 & 1 \\ 0 & 1 \end{bmatrix}\end{aligned}\]
Proof

TODO
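The worked example above can be verified numerically; `scipy.linalg.expm` agrees with the finite sum \(\boldsymbol{I}_2 + \boldsymbol{A}\):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [0.0, 0.0]])
# A is nilpotent: A^2 = 0, so the series stops after k = 1
assert np.allclose(A @ A, np.zeros((2, 2)))
# e^A = I_2 + A
assert np.allclose(expm(A), np.eye(2) + A)
```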

Theorem: Matrix Exponential of Diagonalizable Matrices

If a square matrix \(\boldsymbol{A} \in F^{n \times n}\) is diagonalizable with \(\boldsymbol{A} = \boldsymbol{P} \boldsymbol{D} \boldsymbol{P}^{-1}\), where \(\boldsymbol{D} = \operatorname{diag}(\lambda_1, \dotsc, \lambda_n)\), then its matrix exponential is the following:

\[\mathrm{e}^{\boldsymbol{A}} = \boldsymbol{P} \mathrm{e}^{\boldsymbol{D}}\boldsymbol{P}^{-1}\]
Example

Consider the following real square matrix:

\[\boldsymbol{A} = \begin{bmatrix} 1 & 2 \\ 2 & 1\end{bmatrix}\]

It has the eigenvalues \(\lambda_1 = 3\) and \(\lambda_2 = -1\) with the following corresponding eigenvectors:

\[\boldsymbol{v}_1 = \begin{bmatrix}1 \\ 1\end{bmatrix} \qquad \boldsymbol{v}_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}\]

It is thus diagonalizable:

\[\boldsymbol{P} = \begin{bmatrix} 1 & 1 \\ 1 & -1\end{bmatrix} \qquad \boldsymbol{D} = \begin{bmatrix}3 & 0 \\ 0 & -1\end{bmatrix} \qquad \boldsymbol{P}^{-1} = \begin{bmatrix}\frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & -\frac{1}{2}\end{bmatrix}\]

Its matrix exponential is thus the following:

\[\begin{aligned}\mathrm{e}^{\boldsymbol{A}} & = \boldsymbol{P} \mathrm{e}^{\boldsymbol{D}}\boldsymbol{P}^{-1} \\ & = \begin{bmatrix} 1 & 1 \\ 1 & -1\end{bmatrix} \begin{bmatrix}\mathrm{e}^3 & 0 \\ 0 & \mathrm{e}^{-1}\end{bmatrix} \begin{bmatrix}\frac{1}{2} & \frac{1}{2} \\ \frac{1}{2} & -\frac{1}{2} \end{bmatrix} \\ & = \frac{1}{2}\begin{bmatrix} \mathrm{e}^3 + \mathrm{e}^{-1} & \mathrm{e}^3 - \mathrm{e}^{-1} \\ \mathrm{e}^3 - \mathrm{e}^{-1} & \mathrm{e}^3 + \mathrm{e}^{-1}\end{bmatrix}\end{aligned}\]
Proof

TODO
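The diagonalization example above can likewise be checked numerically against `scipy.linalg.expm`:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0], [2.0, 1.0]])
P = np.array([[1.0, 1.0], [1.0, -1.0]])  # columns: eigenvectors v1, v2
D = np.diag([3.0, -1.0])                 # eigenvalues lambda_1 = 3, lambda_2 = -1
eD = np.diag(np.exp([3.0, -1.0]))        # exponential of the diagonal factor

# A = P D P^{-1}, hence e^A = P e^D P^{-1}
assert np.allclose(P @ D @ np.linalg.inv(P), A)
assert np.allclose(P @ eD @ np.linalg.inv(P), expm(A))
```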

Theorem: Exponential of Sum

Let \(\boldsymbol{A}, \boldsymbol{B} \in \mathbb{R}^{n \times n}\) be real square matrices.

If \(\boldsymbol{A}\) and \(\boldsymbol{B}\) commute, i.e. \(\boldsymbol{A}\boldsymbol{B} = \boldsymbol{B} \boldsymbol{A}\), then the exponential of their sum is the product of their exponentials:

\[\mathrm{e}^{\boldsymbol{A} + \boldsymbol{B}} = \mathrm{e}^{\boldsymbol{A}}\mathrm{e}^{\boldsymbol{B}}\]
Proof

TODO
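A numerical illustration, using `scipy.linalg.expm`: the identity holds for a commuting pair (here \(\boldsymbol{B} = 2\boldsymbol{A}\), a deliberately simple choice) and generally fails without commutativity:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0], [2.0, 1.0]])
B = 2 * A  # trivially commutes with A
assert np.allclose(A @ B, B @ A)
assert np.allclose(expm(A + B), expm(A) @ expm(B))

# For non-commuting matrices the identity generally fails:
C = np.array([[0.0, 1.0], [0.0, 0.0]])
E = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(C @ E, E @ C)
assert not np.allclose(expm(C + E), expm(C) @ expm(E))
```

The counterexample is worth keeping in mind: for non-commuting matrices, the scalar rule \(\mathrm{e}^{a+b} = \mathrm{e}^a \mathrm{e}^b\) does not carry over.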

Theorem: Invertibility of the Matrix Exponential

The matrix exponential of any square matrix \(\boldsymbol{A} \in F^{n \times n}\) is invertible, and its inverse is the exponential of \(-\boldsymbol{A}\):

\[(\mathrm{e}^{\boldsymbol{A}})^{-1} = \mathrm{e}^{-\boldsymbol{A}}\]
Proof

TODO
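This follows from the previous theorem, since \(\boldsymbol{A}\) and \(-\boldsymbol{A}\) commute: \(\mathrm{e}^{\boldsymbol{A}}\mathrm{e}^{-\boldsymbol{A}} = \mathrm{e}^{\boldsymbol{A} - \boldsymbol{A}} = \mathrm{e}^{\boldsymbol{0}} = \boldsymbol{I}_n\). A numerical check with `scipy.linalg.expm` (the matrix is an arbitrary example):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
# (e^A)^{-1} = e^{-A}, hence e^A e^{-A} = I_n
assert np.allclose(np.linalg.inv(expm(A)), expm(-A))
assert np.allclose(expm(A) @ expm(-A), np.eye(2))
```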