
Linear ODEs#

Definition: Linear Ordinary Differential Equations

An ordinary differential equation

\[ F\left(x, y, y', y'', \dotsc, y^{(n)}\right) = 0 \]

is linear if there exist functions \(a_0, a_1, \dotsc, a_n, b\) such that the ODE can be expressed as

\[a_0(x) y + a_1(x)y' + \cdots + a_n(x) y^{(n)} = b(x)\]

Theorem: Existence and Uniqueness

Let \(I \subseteq \mathbb{R}\) be some open interval and let

\[a_0(x) y + a_1(x)y' + \cdots + a_{n-1}(x) y^{(n-1)} + y^{(n)} = b(x)\]

be a linear ODE with initial conditions \(y(x_0) = y_0, y'(x_0) = y_1, \dotsc, y^{(n-1)}(x_0) = y_{n-1}\) at some point \(x_0 \in I\).

If \(a_0, a_1, \dotsc, a_{n-1}\) and \(b\) are continuous on \(I\), then there exists one and only one function \(\phi: I \to \mathbb{R}\) which satisfies the initial conditions and the linear ODE for every \(x \in I\).

Proof

TODO
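The theorem can be illustrated symbolically. A minimal sketch, assuming SymPy is available: for \(y' + y = 0\) with \(y(0) = 1\), the coefficient and right-hand side are continuous on all of \(\mathbb{R}\), so there is exactly one solution on \(\mathbb{R}\).

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# y' + y = 0 with y(0) = 1: the coefficient (1) and right-hand side (0)
# are continuous on all of R, so the theorem guarantees a unique solution.
sol = sp.dsolve(sp.Eq(y(x).diff(x) + y(x), 0), y(x), ics={y(0): 1})
print(sol)  # Eq(y(x), exp(-x))
```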

First-Order Linear ODEs#

A first-order linear ODE can be expressed as

\[Q(x)y' + P(x)y = G(x)\]

for some functions \(P\), \(Q\) and \(G\). We often need to transform this into a more standard form by dividing by \(Q(x)\) and imposing the condition that \(Q(x) \ne 0\):

\[y' + p(x)y = g(x),\]

where \(p = P / Q\) and \(g = G / Q\).

Theorem: Variation of Parameters

Consider a first-order linear ODE which can be expressed as follows:

\[y' + p(x)y = q(x)\]

A real function \(\phi\) is a solution on some open interval \(S \subseteq \mathbb{R}\) if and only if it can be expressed as

\[\phi(x) = c(x) \mathrm{e}^{-P(x)},\]

where \(P(x)\) is any antiderivative of \(p(x)\) on \(S\) and \(c(x)\) is an antiderivative of \(q(x)\mathrm{e}^{P(x)}\) on \(S\).

Example: \(y'(t) + \frac{y(t)}{t} = \cos t\)

Consider the following first-order linear ODE:

\[y'(t) + \frac{y(t)}{t} = \cos t\]

It has the following form:

\[y' + p(t)y(t) = q(t) \qquad p(t) = \frac{1}{t} \qquad q(t) = \cos t\]

The antiderivatives of \(p\) for \(t \in (0, \infty)\) are the following:

\[P(t) = \int p(t) \, \mathrm{d}t = \int \frac{1}{t} \, \mathrm{d}t = \ln t + C_1, \qquad C_1 \in \mathbb{R}\]

For \(c(t)\), we get:

\[c(t) = \int q(t)\mathrm{e}^{P(t)} \, \mathrm{d}t = \int t \mathrm{e}^{C_1} \cos t \,\mathrm{d}t = \mathrm{e}^{C_1}(t \sin t + \cos t + C_2), \qquad C_2 \in \mathbb{R}\]

Therefore:

\[y(t) = c(t)\mathrm{e}^{-P(t)} = \mathrm{e}^{C_1}(t \sin t + \cos t + C_2) \mathrm{e}^{-\ln t -C_1} = \frac{t \sin t + \cos t + C_2}{t}, \qquad C_2 \in \mathbb{R}\]

Proof

TODO
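The worked example above can be checked symbolically; a minimal sketch, assuming SymPy is available, verifies that \(\phi(t) = (t \sin t + \cos t + C_2)/t\) satisfies the ODE on \((0, \infty)\):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
C2 = sp.symbols('C_2', real=True)

# Candidate solution from the variation-of-parameters example
phi = (t*sp.sin(t) + sp.cos(t) + C2) / t

# Substitute into y' + y/t - cos(t); the residual should vanish
residual = sp.simplify(phi.diff(t) + phi/t - sp.cos(t))
print(residual)  # 0
```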

Algorithm: Solving First-Order Linear ODEs

We are given the following first-order linear ODE:

\[ y' + p(x)y = g(x) \]

To solve this, we use the method of integrating factors. The goal is to find some function \(\mu(x)\) such that we can use the product rule to express the left-hand side as the derivative of \((\mu(x)y)\). We can then use antidifferentiation to find the solutions.

  1. Multiply both sides by a yet unknown function \(\mu(x)\):
\[ \mu(x)y' + \mu(x)p(x)y = \mu(x)g(x) \]
  2. In order for \(\mu(x)y' + \mu(x)p(x)y\) to be the derivative of \((\mu(x)y)\), we need \(\mu(x)p(x)\) to be equal to \(\mu'(x)\) because the product rule gives us \((\mu(x)y)' = \mu(x)y' + \mu'(x)y\):
\[ \mu(x)y' + \mu(x)p(x)y = (\mu(x)y)' \iff \mu'(x) = \mu(x)p(x) \]
  3. Divide both sides of \(\mu'(x) = \mu(x)p(x)\) by \(\mu(x)\), imposing the condition \(\mu(x) \ne 0\):
\[ \frac{1}{\mu(x)}\mu'(x) = p(x) \]
  4. By also imposing the condition that \(\mu(x) \gt 0\) and then antidifferentiating both sides, we can transform this further:
\[ \ln (\mu(x)) = \int p(x) \mathop{\mathrm{d}x} + C \]
  5. We take the real exponential function of both sides:
\[ \mu(x) = \mathrm{e}^{\int p(x) \mathop{\mathrm{d}x} + C} \]

For each choice of \(C \in \mathbb{R}\), the expression \(\int p(x) \mathop{\mathrm{d}x} + C\) is some antiderivative \(P(x)\) of \(p(x)\). Thus, if we can find an antiderivative \(P(x)\) of \(p(x)\), then we can find a \(\mu(x) = \mathrm{e}^{P(x)}\) which satisfies the condition \((\mu(x)y)' = \mu(x)y' + \mu'(x)y\). We can easily verify this as well by applying the chain rule and the rules for the derivative of the real exponential function:

\[ (\mu(x)y)' = (\mathrm{e}^{P(x)}y)' = \mathrm{e}^{P(x)}y' + (\mathrm{e}^{P(x)})'y = \mathrm{e}^{P(x)}y' + \mathrm{e}^{P(x)}P'(x)y = \mathrm{e}^{P(x)}y' + \mathrm{e}^{P(x)}p(x)y = \mu(x)y' + \mu'(x)y \]

Moreover, since the real exponential function is always positive, we have a \(\mu(x)\) which satisfies the previously imposed condition that \(\mu(x) \gt 0\).

  6. We have shown how to find an appropriate \(\mu(x)\), so we can proceed with the original equation:
\[ (\mu(x)y)' = \mu(x)g(x) \]
  7. We antidifferentiate both sides:
\[ \mu(x)y = \int \mu(x)g(x) \mathop{\mathrm{d}x} + C \]
  8. To obtain the solutions, we divide both sides by \(\mu(x)\):
\[ y = \frac{1}{\mu(x)} \left(\int \mu(x)g(x) \mathop{\mathrm{d}x} + C\right) \]

For each choice of \(C \in \mathbb{R}\), the expression \(\int \mu(x)g(x) \mathop{\mathrm{d}x} + C\) is just some antiderivative \(\mathcal{F}\) of \(\mu(x)g(x)\). Therefore, the solutions of the original equation on some open subset \(U \subseteq \mathbb{R}\) are the functions \(y: U \to \mathbb{R}\) which for all \(x \in U\) can be expressed as

\[ y(x) = \frac{1}{\mu(x)} \mathcal{F}(x) = \frac{1}{\mathrm{e}^{P(x)}} \mathcal{F}(x) = \mathrm{e}^{-P(x)}\mathcal{F}(x), \]

where \(P(x)\) is any antiderivative of \(p(x)\) and \(\mathcal{F}(x)\) is any antiderivative of \(\mu(x)g(x) = \mathrm{e}^{P(x)}g(x)\).

Summary

Given some open subset \(U \subseteq \mathbb{R}\), the solutions of the ODE on \(U\) are the functions \(y: U \to \mathbb{R}\) which on \(U\) can be expressed as

\[ y(x) = \mathrm{e}^{-P(x)}\mathcal{F}(x) \]

for some antiderivative \(P\) of \(p\) and some antiderivative \(\mathcal{F}\) of \(\mathrm{e}^{P(x)}g(x)\).
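The closed-form solution above can be turned into a small symbolic routine. A sketch, assuming SymPy is available (the helper name `solve_first_order_linear` is ours, not a SymPy API):

```python
import sympy as sp

x, C = sp.symbols('x C')

def solve_first_order_linear(p, g):
    """General solution of y' + p(x)*y = g(x), on an interval where
    everything is defined, via y = exp(-P) * (integral(exp(P)*g) + C)."""
    P = sp.integrate(p, x)           # one antiderivative of p
    mu = sp.exp(P)                   # integrating factor mu = e^P
    F = sp.integrate(mu * g, x) + C  # antiderivative of mu*g, plus a constant
    return sp.expand(F / mu)

# y' - 2y = 4 - x, as in the example that follows
print(solve_first_order_linear(-2, 4 - x))
```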

Example: \(y' - 2y = 4 - x\)

Here we have \(p(x) = -2\) and \(g(x) = 4 - x\). We want to rewrite this as

\[ (\mu(x)y)' = \mu(x)(4 - x) \]

The antiderivatives of \(p\) are given by

\[ \int p(x) \mathop{\mathrm{d}x} = \int -2 \mathop{\mathrm{d}x} = -2x + C \]

We choose \(C = 0\) to make everything simple and so

\[ \mu(x) = \mathrm{e}^{-2x} \]

The equation thus becomes

\[ (\mathrm{e}^{-2x} y)' = \mathrm{e}^{-2x}(4-x) \]

We antidifferentiate both sides:

\[ \mathrm{e}^{-2x} y = \int \mathrm{e}^{-2x}(4-x) \mathop{\mathrm{d}x} \]

Use integration by parts on the right-hand side:

\[ \begin{aligned} \int \mathrm{e}^{-2x}(4-x) \mathop{\mathrm{d}x} &= (4-x)\left(-\frac{1}{2}\mathrm{e}^{-2x}\right) - \int \frac{1}{2}\mathrm{e}^{-2x} \mathop{\mathrm{d}x} \\ &= -\frac{1}{2}(4-x)\mathrm{e}^{-2x} - \frac{1}{2}\int \mathrm{e}^{-2x} \mathop{\mathrm{d}x} \\ &= -\frac{1}{2}(4-x)\mathrm{e}^{-2x} - \frac{1}{2}\left(-\frac{1}{2}\mathrm{e}^{-2x} + C\right) \\ &= -\frac{1}{2}(4-x)\mathrm{e}^{-2x} + \frac{1}{4}\mathrm{e}^{-2x} + C \\ &= -2\mathrm{e}^{-2x} + \frac{1}{2}x\mathrm{e}^{-2x} + \frac{1}{4}\mathrm{e}^{-2x} + C \\ &= \left(-2 + \frac{1}{4}\right)\mathrm{e}^{-2x} + \frac{1}{2}x\mathrm{e}^{-2x} + C \\ &= \left(-\frac{8}{4} + \frac{1}{4}\right)\mathrm{e}^{-2x} + \frac{1}{2}x\mathrm{e}^{-2x} + C \\ &= -\frac{7}{4}\mathrm{e}^{-2x} + \frac{1}{2}x\mathrm{e}^{-2x} + C \\ &= \mathrm{e}^{-2x} \left(\frac{1}{2}x - \frac{7}{4}\right) + C \end{aligned} \]

Therefore, we have

\[ \mathrm{e}^{-2x} y = \mathrm{e}^{-2x} \left(\frac{1}{2}x - \frac{7}{4}\right) + C \]

and so the solutions are

\[ y(x) = C\mathrm{e}^{2x} + \frac{1}{2}x - \frac{7}{4} \]
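The result can be verified by substituting back into the ODE; a minimal check, assuming SymPy is available:

```python
import sympy as sp

x, C = sp.symbols('x C')

# Candidate solution y = C*e^(2x) + x/2 - 7/4
y = C*sp.exp(2*x) + x/2 - sp.Rational(7, 4)

# Substitute into y' - 2y - (4 - x); the residual should vanish
residual = sp.simplify(y.diff(x) - 2*y - (4 - x))
print(residual)  # 0
```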
Example: \(xy' + 2y = 4x^2\)

TODO

Linear ODEs with Constant Coefficients#

Definition: Linear ODE with Constant Coefficients

A linear ODE with constant coefficients is a linear ODE which can be written in the form

\[a_n y^{(n)} + a_{n-1} y^{(n-1)} + \cdots + a_1 y' + a_0 y = b,\]

where \(b, a_0, \dotsc, a_n \in \mathbb{R}\) are real numbers.

Definition: Characteristic Polynomial

The characteristic polynomial of a homogeneous linear ODE with constant coefficients

\[a_n y^{(n)} + a_{n-1} y^{(n-1)} + \cdots + a_1 y' + a_0 y = 0,\]

is the following complex polynomial:

\[P(\lambda) = a_n \lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_1 \lambda + a_0\]

Theorem: Solutions from Characteristic Polynomials

Consider the following homogeneous linear ODE with constant coefficients:

\[y^{(n)} + a_{n-1} y^{(n-1)} + \cdots + a_1 y' + a_0 y = 0\]

The roots of its characteristic polynomial generate a basis for the solution space as follows.

Every real root \(\lambda\) of multiplicity \(k\) contributes the \(k\) functions

\[\mathrm{e}^{\lambda t}, t \mathrm{e}^{\lambda t}, \dotsc, t^{k-1}\mathrm{e}^{\lambda t}\]

Every pair of complex conjugate roots \(a \pm b\mathrm{i}\) (with \(b \ne 0\)) of multiplicity \(k\) contributes the \(2k\) functions

\[\mathrm{e}^{at}\cos(bt), \mathrm{e}^{at}\sin(bt), t\mathrm{e}^{at} \cos(bt), t\mathrm{e}^{at}\sin(bt), \dotsc, t^{k-1} \mathrm{e}^{at} \cos(bt), t^{k-1} \mathrm{e}^{at} \sin(bt)\]

Taken together over all roots, these \(n\) functions form a basis of the solution space.

Example

Consider the following homogeneous linear ODE with constant coefficients:

\[y''' - 6y'' + 11 y' - 6y = 0\]

It has the characteristic polynomial

\[P(\lambda) = \lambda^3 -6\lambda^2 + 11\lambda - 6\]

whose roots are the following:

\[\lambda_1 = 1 \qquad \lambda_2 = 2 \qquad \lambda_3 = 3\]

We get the following basis:

\[\{ \mathrm{e}^{t}, \mathrm{e}^{2t}, \mathrm{e}^{3t}\}\]

The general solution is thus the following:

\[y(t) = c_1 \mathrm{e}^t + c_2 \mathrm{e}^{2t} + c_3 \mathrm{e}^{3t} \qquad c_1, c_2, c_3 \in \mathbb{R}\]
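The roots and the resulting general solution can be checked symbolically; a sketch, assuming SymPy is available:

```python
import sympy as sp

lam, t = sp.symbols('lambda t')

# Characteristic polynomial of y''' - 6y'' + 11y' - 6y = 0
P = lam**3 - 6*lam**2 + 11*lam - 6
print(sp.roots(P))  # roots 1, 2, 3, each with multiplicity 1

# Each simple real root contributes one exponential to the basis
y = sp.exp(t) + sp.exp(2*t) + sp.exp(3*t)   # c1 = c2 = c3 = 1
residual = sp.simplify(y.diff(t, 3) - 6*y.diff(t, 2) + 11*y.diff(t) - 6*y)
print(residual)  # 0
```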
Example

Consider the following homogeneous linear ODE with constant coefficients:

\[y^{(4)} - 4y''' + 5y'' - 4y' + 4y = 0\]

It has the characteristic polynomial

\[P(\lambda) = \lambda^4 - 4\lambda^3 + 5\lambda^2 - 4\lambda + 4\]

whose roots are the following:

\[\lambda_1 = 2 \qquad \lambda_2 = 2 \qquad \lambda_3 = \mathrm{i} \qquad \lambda_4 = -\mathrm{i}\]

We get the following basis:

\[\{ \mathrm{e}^{2t}, t\mathrm{e}^{2t}, \cos(t), \sin(t) \}\]

The general solution is thus the following:

\[y(t) = c_1 \mathrm{e}^{2t} + c_2 t\mathrm{e}^{2t} + c_3 \cos(t) + c_4 \sin(t) \qquad c_1, c_2, c_3, c_4 \in \mathbb{R}\]
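Again the multiplicities and the basis functions can be checked; a sketch, assuming SymPy is available:

```python
import sympy as sp

lam, t = sp.symbols('lambda t')

# Characteristic polynomial of y'''' - 4y''' + 5y'' - 4y' + 4y = 0;
# it factors as (lambda - 2)^2 * (lambda^2 + 1)
P = lam**4 - 4*lam**3 + 5*lam**2 - 4*lam + 4
print(sp.roots(P))  # root 2 with multiplicity 2, simple roots i and -i

# The double root 2 contributes t*e^(2t) as well as e^(2t)
y = t*sp.exp(2*t)
residual = sp.simplify(
    y.diff(t, 4) - 4*y.diff(t, 3) + 5*y.diff(t, 2) - 4*y.diff(t) + 4*y
)
print(residual)  # 0
```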
Proof

TODO

Theorem: Reduction to Linear System

Consider the following homogeneous linear ODE with constant coefficients:

\[y^{(n)} + a_{n-1} y^{(n-1)} + \cdots + a_1 y' + a_0 y = 0\]

A function \(\phi\) is a solution on some interval \(I \subseteq \mathbb{R}\) if and only if the vector-valued function \(\boldsymbol{\psi}\) defined below is a solution on \(I\) of the linear system

\[\boldsymbol{y}' = \boldsymbol{A}\boldsymbol{y},\]

where:

\[\boldsymbol{\psi} = \begin{bmatrix} \phi \\ \phi' \\ \vdots \\ \phi^{(n-1)}\end{bmatrix} \qquad \boldsymbol{A} = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ -a_0 & -a_1 & -a_2 & \cdots & -a_{n-1} \end{bmatrix}\]

Proof

TODO
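The reduction can be illustrated on the earlier third-order example \(y''' - 6y'' + 11y' - 6y = 0\), where \(a_0 = -6\), \(a_1 = 11\), \(a_2 = -6\); a sketch, assuming SymPy is available:

```python
import sympy as sp

# Companion matrix A for y''' - 6y'' + 11y' - 6y = 0,
# i.e. a0 = -6, a1 = 11, a2 = -6; the last row is [-a0, -a1, -a2]
A = sp.Matrix([
    [0,   1, 0],
    [0,   0, 1],
    [6, -11, 6],
])

# The eigenvalues of A are exactly the roots of the
# characteristic polynomial lambda^3 - 6*lambda^2 + 11*lambda - 6
print(A.eigenvals())  # eigenvalues 1, 2, 3, each with multiplicity 1
```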