# Jacobian Matrix
Definition: Jacobian Matrix
Let \(f: \mathcal{D} \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be a real vector function.
If \(f\) is totally differentiable at \(\boldsymbol{p} \in \mathcal{D}\), then its Jacobian matrix at \(\boldsymbol{p}\) is the matrix representation of its total derivative there with respect to the standard bases of \(\mathbb{R}^m\) and \(\mathbb{R}^n\).
Notation: The Jacobian matrix of \(f\) at \(\boldsymbol{p}\) is commonly denoted by \(J_f(\boldsymbol{p})\), \(Df(\boldsymbol{p})\), or \(\frac{\partial f}{\partial \boldsymbol{x}}(\boldsymbol{p})\).
Definition: Regularity
Let \(f: \mathcal{D} \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be a real vector function and let \(\boldsymbol{p}\) be an interior point of \(\mathcal{D}\).
We say that \(f\) is regular at \(\boldsymbol{p}\) if it is totally differentiable at \(\boldsymbol{p}\) and its Jacobian matrix there is invertible (which requires \(m = n\)).
Theorem: Jacobian via Partial Derivatives
Let \(f: \mathcal{D} \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be a real vector function.
If \(f\) is totally differentiable at \(\boldsymbol{p} \in \mathcal{D}\), then its Jacobian matrix \(J_f(\boldsymbol{p})\) is given by the partial derivatives of \(f\)'s component functions \(f_1, \dotsc, f_n\) as follows:

$$
J_f(\boldsymbol{p}) = \begin{bmatrix}
\dfrac{\partial f_1}{\partial x_1}(\boldsymbol{p}) & \cdots & \dfrac{\partial f_1}{\partial x_m}(\boldsymbol{p}) \\
\vdots & \ddots & \vdots \\
\dfrac{\partial f_n}{\partial x_1}(\boldsymbol{p}) & \cdots & \dfrac{\partial f_n}{\partial x_m}(\boldsymbol{p})
\end{bmatrix}
$$
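The partial-derivative formula can be sanity-checked numerically. The sketch below (NumPy; the sample function \(f(x, y) = (x^2 y,\; 5x + \sin y)\) is an illustrative choice, not from the text) compares a central-difference approximation of the Jacobian with the matrix of partial derivatives:

```python
import numpy as np

def numerical_jacobian(f, p, h=1e-6):
    """Approximate J_f(p) column by column with central differences."""
    p = np.asarray(p, dtype=float)
    cols = [(f(p + h * e) - f(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

# Illustrative f: R^2 -> R^2, f(x, y) = (x^2 y, 5x + sin y)
f = lambda v: np.array([v[0]**2 * v[1], 5 * v[0] + np.sin(v[1])])
p = np.array([1.5, -0.7])

# Analytic Jacobian from the partial derivatives of f's components
J_analytic = np.array([[2 * p[0] * p[1], p[0]**2],
                       [5.0,             np.cos(p[1])]])

assert np.allclose(numerical_jacobian(f, p), J_analytic, atol=1e-6)
```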
Example
Consider the real vector function \(f: \mathbb{R}^2 \to \mathbb{R}^3\) defined as follows:
It is totally differentiable on \(\mathbb{R}^2\) with the following Jacobian matrix:
Example: \(f(\boldsymbol{x}) = \boldsymbol{A} \boldsymbol{x}\)
Consider the real vector function \(f: \mathbb{R}^m \to \mathbb{R}^n\) defined as

$$
f(\boldsymbol{x}) = \boldsymbol{A} \boldsymbol{x}
$$

for some fixed real matrix \(\boldsymbol{A} \in \mathbb{R}^{n\times m}\). It is totally differentiable on \(\mathbb{R}^m\) with the following Jacobian matrix:

$$
J_f(\boldsymbol{p}) = \boldsymbol{A} \qquad \text{for all } \boldsymbol{p} \in \mathbb{R}^m
$$
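A quick numerical check of this example (NumPy sketch; the concrete matrix is generated at random for illustration): the finite-difference Jacobian of \(\boldsymbol{x} \mapsto \boldsymbol{A}\boldsymbol{x}\) recovers \(\boldsymbol{A}\) at any point.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))      # fixed A in R^{3x2} (illustrative)
f = lambda x: A @ x              # f(x) = Ax

def numerical_jacobian(f, p, h=1e-6):
    cols = [(f(p + h * e) - f(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

p = rng.normal(size=2)           # arbitrary point
J = numerical_jacobian(f, p)
assert np.allclose(J, A)         # the Jacobian of a linear map is A itself, everywhere
```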
Example
Let \(f: \mathbb{R}^n \to \mathbb{R}\) be a real scalar field which is twice totally differentiable.
The Jacobian matrix of \(f\)'s gradient is \(f\)'s Hessian matrix:

$$
J_{\nabla f}(\boldsymbol{p}) = H_f(\boldsymbol{p})
$$
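This identity can be illustrated numerically. In the sketch below (the scalar field \(f(x, y) = x^2 y + y^3\) is an illustrative choice, not from the text), the Jacobian of the analytic gradient reproduces the Hessian, which is symmetric as expected:

```python
import numpy as np

# Gradient of the illustrative scalar field f(x, y) = x^2 y + y^3
grad_f = lambda v: np.array([2 * v[0] * v[1], v[0]**2 + 3 * v[1]**2])

def numerical_jacobian(g, p, h=1e-6):
    cols = [(g(p + h * e) - g(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

p = np.array([0.8, -1.2])
H = numerical_jacobian(grad_f, p)        # Jacobian of the gradient = Hessian

H_exact = np.array([[2 * p[1], 2 * p[0]],
                    [2 * p[0], 6 * p[1]]])
assert np.allclose(H, H_exact, atol=1e-6)
assert np.allclose(H, H.T, atol=1e-6)    # Hessian is symmetric (Schwarz's theorem)
```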
Example: Polar Coordinate Transformation
Consider the coordinate transformation \(f: [0,+\infty)\times [0, 2\uppi) \to \mathbb{R}^2\) from polar coordinates:

$$
f(r, \varphi) = \begin{bmatrix} r \cos \varphi \\ r \sin \varphi \end{bmatrix}
$$

It is totally differentiable on \((0, +\infty) \times (0, 2 \uppi)\) with the following Jacobian matrix:

$$
J_f(r, \varphi) = \begin{bmatrix}
\cos \varphi & -r \sin \varphi \\
\sin \varphi & r \cos \varphi
\end{bmatrix}
$$
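A numerical sanity check of the polar transformation (NumPy sketch, point chosen arbitrarily): the finite-difference Jacobian matches the closed form, and its determinant is \(r\), so the transformation is regular for \(r > 0\).

```python
import numpy as np

# Polar transformation f(r, phi) = (r cos phi, r sin phi)
f = lambda v: np.array([v[0] * np.cos(v[1]), v[0] * np.sin(v[1])])

def J_polar(r, phi):
    return np.array([[np.cos(phi), -r * np.sin(phi)],
                     [np.sin(phi),  r * np.cos(phi)]])

def numerical_jacobian(f, q, h=1e-6):
    cols = [(f(q + h * e) - f(q - h * e)) / (2 * h) for e in np.eye(q.size)]
    return np.column_stack(cols)

r, phi = 2.0, 0.6
q = np.array([r, phi])
assert np.allclose(numerical_jacobian(f, q), J_polar(r, phi), atol=1e-6)
assert np.isclose(np.linalg.det(J_polar(r, phi)), r)   # det = r: regular for r > 0
```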
Example: Cylindrical Coordinate Transformation
Consider the coordinate transformation \(f: [0,+\infty)\times [0, 2\uppi) \times \mathbb{R} \to \mathbb{R}^3\) of cylindrical coordinates:

$$
f(r, \varphi, z) = \begin{bmatrix} r \cos \varphi \\ r \sin \varphi \\ z \end{bmatrix}
$$

It is totally differentiable on \((0, +\infty) \times (0, 2 \uppi) \times \mathbb{R}\) with the following Jacobian matrix:

$$
J_f(r, \varphi, z) = \begin{bmatrix}
\cos \varphi & -r \sin \varphi & 0 \\
\sin \varphi & r \cos \varphi & 0 \\
0 & 0 & 1
\end{bmatrix}
$$
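The same finite-difference check works for the cylindrical transformation (NumPy sketch, arbitrary point); its Jacobian determinant is again \(r\), since the \(z\)-direction contributes a factor of \(1\).

```python
import numpy as np

# Cylindrical transformation f(r, phi, z) = (r cos phi, r sin phi, z)
f = lambda v: np.array([v[0] * np.cos(v[1]), v[0] * np.sin(v[1]), v[2]])

def J_cyl(r, phi):
    return np.array([[np.cos(phi), -r * np.sin(phi), 0.0],
                     [np.sin(phi),  r * np.cos(phi), 0.0],
                     [0.0,          0.0,             1.0]])

def numerical_jacobian(f, q, h=1e-6):
    cols = [(f(q + h * e) - f(q - h * e)) / (2 * h) for e in np.eye(q.size)]
    return np.column_stack(cols)

r, phi, z = 2.0, 0.9, -1.0
q = np.array([r, phi, z])
assert np.allclose(numerical_jacobian(f, q), J_cyl(r, phi), atol=1e-6)
assert np.isclose(np.linalg.det(J_cyl(r, phi)), r)   # same area factor r as in the plane
```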
Example: Spherical Coordinate Transformation
Consider the coordinate transformation \(f: [0,+\infty)\times [0, \uppi] \times [0, 2\uppi) \to \mathbb{R}^3\) of spherical coordinates:

$$
f(r, \theta, \varphi) = \begin{bmatrix} r \sin \theta \cos \varphi \\ r \sin \theta \sin \varphi \\ r \cos \theta \end{bmatrix}
$$

It is totally differentiable on \((0,+\infty)\times (0, \uppi) \times (0, 2\uppi)\) with the following Jacobian matrix:

$$
J_f(r, \theta, \varphi) = \begin{bmatrix}
\sin \theta \cos \varphi & r \cos \theta \cos \varphi & -r \sin \theta \sin \varphi \\
\sin \theta \sin \varphi & r \cos \theta \sin \varphi & r \sin \theta \cos \varphi \\
\cos \theta & -r \sin \theta & 0
\end{bmatrix}
$$
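A numerical check for the spherical transformation (NumPy sketch, arbitrary interior point): the finite-difference Jacobian matches the closed form, and its determinant is the familiar volume element \(r^2 \sin \theta\).

```python
import numpy as np

# Spherical transformation f(r, t, p) = (r sin t cos p, r sin t sin p, r cos t)
f = lambda v: np.array([v[0] * np.sin(v[1]) * np.cos(v[2]),
                        v[0] * np.sin(v[1]) * np.sin(v[2]),
                        v[0] * np.cos(v[1])])

def J_sph(r, t, p):
    return np.array([[np.sin(t) * np.cos(p), r * np.cos(t) * np.cos(p), -r * np.sin(t) * np.sin(p)],
                     [np.sin(t) * np.sin(p), r * np.cos(t) * np.sin(p),  r * np.sin(t) * np.cos(p)],
                     [np.cos(t),            -r * np.sin(t),              0.0]])

def numerical_jacobian(f, q, h=1e-6):
    cols = [(f(q + h * e) - f(q - h * e)) / (2 * h) for e in np.eye(q.size)]
    return np.column_stack(cols)

r, t, p = 1.5, 0.8, 2.1
q = np.array([r, t, p])
assert np.allclose(numerical_jacobian(f, q), J_sph(r, t, p), atol=1e-6)
assert np.isclose(np.linalg.det(J_sph(r, t, p)), r**2 * np.sin(t))  # volume element
```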
Proof
TODO
Theorem: Jacobian via Gradients
Let \(f: \mathcal{D} \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be a real vector function.
If \(f\) is totally differentiable at \(\boldsymbol{p} \in \mathcal{D}\), then the rows of its Jacobian matrix \(J_f(\boldsymbol{p})\) are the gradients of \(f\)'s component functions \(f_1, \dotsc, f_n\):

$$
J_f(\boldsymbol{p}) = \begin{bmatrix}
\nabla f_1(\boldsymbol{p})^\mathsf{T} \\
\vdots \\
\nabla f_n(\boldsymbol{p})^\mathsf{T}
\end{bmatrix}
$$
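The row-wise view can be checked numerically as well. The sketch below (NumPy; the component functions are illustrative, not from the text) compares each row of a finite-difference Jacobian with the finite-difference gradient of the corresponding component:

```python
import numpy as np

# Component functions of an illustrative f: R^2 -> R^2
f1 = lambda v: v[0]**2 * v[1]
f2 = lambda v: np.sin(v[0]) + v[1]
f  = lambda v: np.array([f1(v), f2(v)])

def numerical_gradient(phi, p, h=1e-6):
    return np.array([(phi(p + h * e) - phi(p - h * e)) / (2 * h) for e in np.eye(p.size)])

def numerical_jacobian(f, p, h=1e-6):
    cols = [(f(p + h * e) - f(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

p = np.array([1.2, 0.5])
J = numerical_jacobian(f, p)
assert np.allclose(J[0], numerical_gradient(f1, p), atol=1e-6)  # row 1 = grad f1
assert np.allclose(J[1], numerical_gradient(f2, p), atol=1e-6)  # row 2 = grad f2
```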
Proof
TODO
Theorem: Jacobian of Linear Combination
Let \(f:\mathcal{D}_f \subseteq \mathbb{R}^m \to \mathbb{R}^n\) and \(g: \mathcal{D}_g \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be real vector functions.
If \(f\) and \(g\) are totally differentiable at \(\boldsymbol{p} \in \mathcal{D}_f \cap \mathcal{D}_g\), then the Jacobian matrix of \(\lambda f + \mu g\) is given by the Jacobian matrices of \(f\) and \(g\) as

$$
J_{\lambda f + \mu g}(\boldsymbol{p}) = \lambda J_f(\boldsymbol{p}) + \mu J_g(\boldsymbol{p})
$$
for all \(\lambda, \mu \in \mathbb{R}\).
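Linearity of the Jacobian can be verified numerically. In the sketch below (NumPy; \(f\), \(g\), \(\lambda\), \(\mu\) are illustrative choices, not from the text), the finite-difference Jacobian of \(\lambda f + \mu g\) agrees with the same linear combination of the individual Jacobians:

```python
import numpy as np

def numerical_jacobian(f, p, h=1e-6):
    cols = [(f(p + h * e) - f(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

# Illustrative f, g: R^2 -> R^2
f = lambda v: np.array([np.sin(v[0]), v[0] * v[1]])
g = lambda v: np.array([v[1]**2, np.exp(v[0])])
lam, mu = 2.0, -3.0
comb = lambda v: lam * f(v) + mu * g(v)        # the linear combination

p = np.array([0.4, 1.1])
lhs = numerical_jacobian(comb, p)
rhs = lam * numerical_jacobian(f, p) + mu * numerical_jacobian(g, p)
assert np.allclose(lhs, rhs, atol=1e-6)
```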
Proof
TODO
Theorem: Chain Rule for Jacobian Matrices
Let \(g: \mathcal{D}_g \subseteq \mathbb{R}^m \to \mathbb{R}^n\) and \(f: \mathcal{D}_f \subseteq \mathbb{R}^n \to \mathbb{R}^p\) be real vector functions.
If \(g\) is totally differentiable at \(\boldsymbol{p} \in \mathcal{D}_g\) and \(f\) is totally differentiable at \(g(\boldsymbol{p}) \in \mathcal{D}_f\), then the Jacobian matrix of the composition \(f \circ g\) is given by the matrix product of the Jacobian matrices of \(f\) and \(g\) as follows:

$$
J_{f \circ g}(\boldsymbol{p}) = J_f(g(\boldsymbol{p})) \, J_g(\boldsymbol{p})
$$
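The chain rule can be checked numerically too. The sketch below (NumPy; the inner \(g: \mathbb{R}^2 \to \mathbb{R}^3\) and outer \(f: \mathbb{R}^3 \to \mathbb{R}\) are illustrative choices) compares the finite-difference Jacobian of the composition with the product of the individual Jacobians, evaluated at the matching points:

```python
import numpy as np

def numerical_jacobian(f, p, h=1e-6):
    cols = [(f(p + h * e) - f(p - h * e)) / (2 * h) for e in np.eye(p.size)]
    return np.column_stack(cols)

g = lambda v: np.array([v[0] * v[1], np.sin(v[0]), v[1]**2])   # R^2 -> R^3
f = lambda u: np.array([u[0] + u[1] * u[2]])                   # R^3 -> R

p = np.array([0.7, -0.3])
J_comp  = numerical_jacobian(lambda v: f(g(v)), p)             # Jacobian of f o g
J_chain = numerical_jacobian(f, g(p)) @ numerical_jacobian(g, p)  # J_f(g(p)) J_g(p)
assert np.allclose(J_comp, J_chain, atol=1e-5)
```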
Proof
TODO
Theorem: Mean Value Inequality
Let \(f: \mathcal{D} \subseteq \mathbb{R}^m \to \mathbb{R}^n\) be a real vector function and let \(\boldsymbol{a}, \boldsymbol{b} \in \mathcal{D}\) such that \(L = \{\boldsymbol{a} + t(\boldsymbol{b} - \boldsymbol{a}) \mid t \in [0,1]\} \subseteq \mathcal{D}\).
If \(f\) is continuous on \(L\) and totally differentiable on \(\operatorname{int} L\), then

$$
\| f(\boldsymbol{b}) - f(\boldsymbol{a}) \| \le \sup_{\boldsymbol{x} \in \operatorname{int} L} \| J_f(\boldsymbol{x}) \| \cdot \| \boldsymbol{b} - \boldsymbol{a} \|
$$
where the matrix norm can be induced by any vector norms on \(\mathbb{R}^m\) and \(\mathbb{R}^n\).
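The inequality can be illustrated numerically. The sketch below (NumPy; \(f\), the segment endpoints, and the Euclidean/spectral norm pair are illustrative choices, and the supremum is only approximated by sampling along the segment, which underestimates the true bound) confirms the inequality for one concrete case:

```python
import numpy as np

# Illustrative f: R^2 -> R^2 with its analytic Jacobian
f = lambda v: np.array([np.sin(v[0]) + v[1], v[0] * v[1]])
J = lambda v: np.array([[np.cos(v[0]), 1.0],
                        [v[1],         v[0]]])

a = np.array([0.0, 0.0])
b = np.array([1.0, 2.0])

# Sampled sup of the spectral norm of J_f along the segment L
ts = np.linspace(0.0, 1.0, 1001)
sup_norm = max(np.linalg.norm(J(a + t * (b - a)), 2) for t in ts)

lhs = np.linalg.norm(f(b) - f(a), 2)
rhs = sup_norm * np.linalg.norm(b - a, 2)
assert lhs <= rhs   # mean value inequality holds for this case
```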
Proof
TODO