
Hessian Matrix#

Theorem: Hessian Matrix via Partial Derivatives

Let \(f: \mathcal{D} \subseteq \mathbb{R}^n \to \mathbb{R}\) be a real scalar field.

If \(f\) is twice totally differentiable at \(\boldsymbol{p} \in \operatorname{int} \mathcal{D}\), then the Hessian matrix of \(f\) at \(\boldsymbol{p}\) is given by its second-order partial derivatives as follows:

\[H_f(\boldsymbol{p}) = \begin{bmatrix}\partial_1 \partial_1 f(\boldsymbol{p}) & \cdots & \partial_1\partial_n f(\boldsymbol{p}) \\ \vdots & \ddots & \vdots \\ \partial_n \partial_1 f(\boldsymbol{p}) & \cdots & \partial_n\partial_n f(\boldsymbol{p})\end{bmatrix}\]
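As a sanity check, the entries \(\partial_i \partial_j f(\boldsymbol{p})\) can be approximated with central finite differences. A minimal sketch, assuming NumPy is available; the test function \(f(\boldsymbol{x}) = x_1^2 x_2\) and the step size are illustrative choices, not part of the theorem:

```python
import numpy as np

def hessian_fd(f, p, h=1e-4):
    """Approximate the Hessian of f at p entrywise via the central
    finite-difference stencil
    H[i, j] ≈ (f(p+h e_i+h e_j) - f(p+h e_i-h e_j)
               - f(p-h e_i+h e_j) + f(p-h e_i-h e_j)) / (4 h^2)."""
    n = p.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(p + ei + ej) - f(p + ei - ej)
                       - f(p - ei + ej) + f(p - ei - ej)) / (4 * h * h)
    return H

# Illustrative check: f(x) = x1^2 * x2 has Hessian [[2 x2, 2 x1], [2 x1, 0]].
f = lambda x: x[0] ** 2 * x[1]
p = np.array([1.0, 2.0])
print(hessian_fd(f, p))  # close to [[4., 2.], [2., 0.]]
```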
Example: \(f(\boldsymbol{x}) = \boldsymbol{a}^{\mathsf{T}}\boldsymbol{x}\)

Consider the real scalar field \(f: \mathbb{R}^n \to \mathbb{R}\) defined as

\[f(\boldsymbol{x}) = \boldsymbol{a}^{\mathsf{T}}\boldsymbol{x}\]

for some fixed real vector \(\boldsymbol{a} = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^{\mathsf{T}}\in \mathbb{R}^n\).

Its Hessian matrix is zero everywhere:

\[H_f(\boldsymbol{p}) = \boldsymbol{0}\]
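The computation behind this is short: writing \(f(\boldsymbol{x}) = \sum_{k} a_k x_k\), each first partial derivative is a constant, so every second partial derivative vanishes:

\[\partial_i f(\boldsymbol{x}) = a_i, \qquad \partial_j \partial_i f(\boldsymbol{x}) = 0\]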
Example: \(f(\boldsymbol{x}) = \boldsymbol{x}^{\mathsf{T}} \boldsymbol{A} \boldsymbol{x}\)

Consider the real scalar field \(f: \mathbb{R}^n \to \mathbb{R}\) defined as

\[f(\boldsymbol{x}) = \boldsymbol{x}^{\mathsf{T}} \boldsymbol{A} \boldsymbol{x}\]

for some fixed real matrix \(\boldsymbol{A} \in \mathbb{R}^{n \times n}\), not necessarily symmetric.

Its Hessian matrix is the following:

\[H_f(\boldsymbol{p}) = \boldsymbol{A} + \boldsymbol{A}^{\mathsf{T}}\]
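To see this, expand \(f(\boldsymbol{x}) = \sum_{k,l} A_{kl}\, x_k x_l\) and differentiate twice:

\[\partial_j f(\boldsymbol{x}) = \sum_{l} A_{jl}\, x_l + \sum_{k} A_{kj}\, x_k, \qquad \partial_i \partial_j f(\boldsymbol{x}) = A_{ij} + A_{ji} = \left(\boldsymbol{A} + \boldsymbol{A}^{\mathsf{T}}\right)_{ij}\]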
Example: \(f(\boldsymbol{x}) = ||\boldsymbol{x}||\)

Consider the real scalar field \(f: \mathbb{R}^n \to \mathbb{R}\) defined as follows:

\[f(\boldsymbol{x}) = ||\boldsymbol{x}||\]

It is totally differentiable on \(\mathbb{R}^n \setminus \{\boldsymbol{0}\}\) with the following partial derivatives:

\[\partial_{i}f(\boldsymbol{x}) = \frac{x_i}{||\boldsymbol{x}||}\]

We thus have:

\[\partial_{i}\partial_{i}f(\boldsymbol{x}) = \frac{||\boldsymbol{x}|| - x_i \frac{x_i}{||\boldsymbol{x}||}}{||\boldsymbol{x}||^2} = \frac{1}{||\boldsymbol{x}||} - \frac{x_i^2}{||\boldsymbol{x}||^3}\]
\[\partial_{i}\partial_{j} f(\boldsymbol{x}) = -\frac{x_i x_j}{||\boldsymbol{x}||^3} \qquad i \ne j\]

For its Hessian matrix, we have:

\[H_f(\boldsymbol{x}) = \frac{1}{||\boldsymbol{x}||}I_n - \frac{1}{||\boldsymbol{x}||^3}\boldsymbol{x}\boldsymbol{x}^{\mathsf{T}}\]
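The closed form can be checked numerically against a finite-difference Hessian. A minimal sketch, assuming NumPy is available; the test point (chosen away from the origin, where \(f\) is not differentiable) is an illustrative choice:

```python
import numpy as np

def hessian_fd(f, p, h=1e-4):
    """Central-finite-difference approximation of the Hessian of f at p."""
    n = p.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(p + ei + ej) - f(p + ei - ej)
                       - f(p - ei + ej) + f(p - ei - ej)) / (4 * h * h)
    return H

p = np.array([1.0, -2.0, 2.0])        # ||p|| = 3, away from the origin
r = np.linalg.norm(p)
H_closed = np.eye(3) / r - np.outer(p, p) / r ** 3
H_num = hessian_fd(np.linalg.norm, p)
print(np.allclose(H_num, H_closed, atol=1e-5))  # True
```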
Proof

TODO

Theorem: Hessian Matrix via Gradient

Let \(f: \mathcal{D} \subseteq \mathbb{R}^n \to \mathbb{R}\) be a real scalar field.

If \(f\) is twice totally differentiable at \(\boldsymbol{p} \in \operatorname{int} \mathcal{D}\), then the columns of \(f\)'s Hessian matrix at \(\boldsymbol{p}\) are the gradients of \(f\)'s partial derivatives:

\[H_f(\boldsymbol{p}) = \begin{bmatrix}\vert & \vert & \vert \\ \nabla(\partial_1 f)(\boldsymbol{p}) & \cdots & \nabla(\partial_n f)(\boldsymbol{p}) \\ \vert & \vert & \vert \end{bmatrix}\]
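This column view can be illustrated numerically: taking the gradient of each partial derivative by finite differences and stacking the results as columns reproduces the Hessian. A sketch assuming NumPy; the quadratic test function \(f(\boldsymbol{x}) = \boldsymbol{x}^{\mathsf{T}}\boldsymbol{A}\boldsymbol{x}\) (whose Hessian \(\boldsymbol{A} + \boldsymbol{A}^{\mathsf{T}}\) is known from the example above) is an illustrative choice:

```python
import numpy as np

h = 1e-4

def grad_fd(f, p):
    """Central-difference gradient of a scalar function f at p."""
    n = p.size
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

# Illustrative quadratic: f(x) = x^T A x, so H_f = A + A^T.
A = np.array([[1.0, 2.0], [0.0, 3.0]])
f = lambda x: x @ A @ x
p = np.array([0.5, -1.0])

# Column j of the Hessian = gradient of the j-th partial derivative,
# where the partial derivative itself is taken by finite differences.
cols = []
for j in range(2):
    dj = lambda x, j=j: grad_fd(f, x)[j]   # x ↦ ∂_j f(x)
    cols.append(grad_fd(dj, p))
H = np.column_stack(cols)
print(np.allclose(H, A + A.T, atol=1e-3))  # True
```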
Proof

TODO

Theorem: Symmetry of the Hessian Matrix

Let \(f: \mathcal{D} \subseteq \mathbb{R}^n \to \mathbb{R}\) be a real scalar field.

If \(f\) is twice totally differentiable at \(\boldsymbol{p} \in \operatorname{int} \mathcal{D}\), then its Hessian matrix there is symmetric.
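In coordinates, this is the statement that the mixed second partials agree (Schwarz's theorem), so the matrix equals its transpose:

\[\big(H_f(\boldsymbol{p})\big)_{ij} = \partial_i \partial_j f(\boldsymbol{p}) = \partial_j \partial_i f(\boldsymbol{p}) = \big(H_f(\boldsymbol{p})\big)_{ji}\]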

Proof

TODO