Differential equation

A differential equation is a functional equation involving functions and their derivatives.

The order of a differential equation is the largest order of any derivative that appears in the equation.

Differential equations are often presented along with an initial condition; that is, a given value of the function or dependent variable at some value of its independent variable.

Examples

$f(x) = f'(x)$ has solutions $Ce^x$ for all real constants $C$. With the initial condition $f(0) = 1$, $f(x) = e^x$ becomes the unique solution.
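This solution can be spot-checked numerically. The sketch below (plain Python; the constant $C = 3$, the helper names, and the test points are arbitrary illustrative choices) verifies that $Ce^x$ satisfies $f(x) = f'(x)$ by comparing the function against a finite-difference approximation of its derivative.

```python
import math

def f(x, C=3.0):
    # Candidate solution f(x) = C e^x for an arbitrarily chosen constant C
    return C * math.exp(x)

def derivative(g, x, h=1e-6):
    # Central-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

# f'(x) should agree with f(x) at every point tested
for x in [-1.0, 0.0, 0.5, 2.0]:
    assert abs(derivative(f, x) - f(x)) < 1e-4
```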

$f(x) = -f''(x)$ has solutions $C \cos(x + a)$ for all real constants $C$ and $a$. The solutions with $a = 0$ are $C \cos x$; those with $a = -\frac{\pi}{2}$ are $C \sin x$. The initial condition $f'(0) = 0$ produces the cosine solutions; $f(0) = 0$ produces the sine solutions.
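A similar numerical check works for the second-order equation. The sketch below (with arbitrarily chosen constants $C$ and $a$) verifies that $C \cos(x + a)$ satisfies $f(x) = -f''(x)$ using a finite-difference approximation of the second derivative.

```python
import math

def f(x, C=2.0, a=0.7):
    # Candidate solution f(x) = C cos(x + a) for arbitrarily chosen C and a
    return C * math.cos(x + a)

def second_derivative(g, x, h=1e-4):
    # Central-difference approximation of g''(x)
    return (g(x + h) - 2 * g(x) + g(x - h)) / (h * h)

# f''(x) should agree with -f(x) at every point tested
for x in [-2.0, 0.0, 1.3]:
    assert abs(second_derivative(f, x) + f(x)) < 1e-4
```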

Solutions

Separation of variables is a convenient technique for solving certain types of differential equations. Essentially, the method involves rewriting the equation so that each side is an expression in only one variable and then taking the antiderivative of both sides.

When solving differential equations, it is convenient to denote the unknown function by a single variable name rather than spelling out the function and its argument, for example writing $y$ instead of $f(x)$. Here we also use the Leibniz notation $\frac{dy}{dx}$ for the derivative, because it allows manipulating $dy$ and $dx$ individually.

Worked example

To solve the differential equation \[\frac{dy}{dx} = y(7-y),\] we move all terms containing $x$ and $dx$ to the right and all terms containing $y$ and $dy$ to the left, thus obtaining \[\frac{1}{y(7-y)} \, dy = dx.\] Now $dy$ and $dx$ are factors on opposite sides, so we can antidifferentiate both sides: \[\int \frac{1}{y(7-y)} \, dy = \int dx.\] The right integral is simply $x + c_X$ for some constant $c_X$.

Using partial fractions, basic integration rules, and identities of the logarithm and absolute value functions, the left side becomes \[\int \frac{1}{y(7-y)} \, dy = \int \frac{1}{7} \left( \frac{1}{y} + \frac{1}{7-y} \right) \, dy = \frac{1}{7} \left( \ln \left| \frac{y}{7 - y} \right| \right) + c_Y,\] again for some constant $c_Y$.
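The partial fraction decomposition above can be spot-checked numerically at a few points in $(0, 7)$ (the test points below are arbitrary):

```python
# Spot-check 1/(y(7-y)) = (1/7)(1/y + 1/(7-y)) at a few points in (0, 7)
for y in [0.5, 1.0, 3.0, 6.9]:
    lhs = 1 / (y * (7 - y))
    rhs = (1 / 7) * (1 / y + 1 / (7 - y))
    assert abs(lhs - rhs) < 1e-12
```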

Constants of integration $c_X$ and $c_Y$ can be combined into a single constant $C$ (this generally happens in separation of variables), so we write \[\frac{1}{7} \left( \ln \left| \frac{y}{7 - y} \right| \right) = x + C.\]

When $y$ is in the open interval $(0,7)$, algebraic manipulation leads to the solution \[y = 7 - \frac{7}{e^{7(x + C)} + 1}.\] Given any initial condition, we can solve for the value of the constant $C$.
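The closed-form solution can be checked against the original equation numerically. The sketch below (taking $C = 0$ for concreteness; the helper names are illustrative) verifies that $y = 7 - \frac{7}{e^{7(x+C)}+1}$ satisfies $\frac{dy}{dx} = y(7-y)$.

```python
import math

def y(x, C=0.0):
    # Closed-form solution from the worked example, with C = 0 chosen for the check
    return 7 - 7 / (math.exp(7 * (x + C)) + 1)

def dydx(g, x, h=1e-6):
    # Central-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

# dy/dx should agree with y(7 - y) at every point tested
for x in [-0.5, 0.0, 0.3]:
    assert abs(dydx(y, x) - y(x) * (7 - y(x))) < 1e-3
```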

Approximations

Euler's method uses repeated tangent-line approximations to approximate a value $f(b)$ of the solution to a first-order differential equation given an initial condition $f(a)$.

If $b > a$, Euler's method works by subdividing $[a,b]$ into smaller intervals $[a,c_1], [c_1,c_2], \dots , [c_{n-2}, c_{n-1}], [c_{n-1}, b]$, sometimes called steps. Write $c_0 = a$ and $c_n = b$. At step $i$, the value $f(c_{i-1})$ at the beginning of the step is already known, and $f'(c_{i-1})$ can be computed in terms of $c_{i-1}$ and $f(c_{i-1})$ using the given differential equation; the value $f(c_i)$ at the end of the step is then approximated via a tangent line about $x = c_{i-1}$. Repeating this for each step in turn carries the approximation from $a$ to $b$.

The formula for the tangent-line approximation is \[f(c_i) \approx f(c_{i-1}) + (c_i - c_{i-1})f'(c_{i-1}).\]

The quantity $c_i - c_{i-1}$ is called the step size. Euler's method can be employed when $b < a$ by simply using negative step sizes.
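The procedure above fits in a few lines of code. The sketch below is a minimal Python implementation with equal step sizes; the function name `euler` and the test equation $f'(x) = f(x)$ with $f(0) = 1$ (whose exact solution is $e^x$) are illustrative choices, not part of the article.

```python
import math

def euler(fprime, a, fa, b, n):
    """Approximate f(b) for f'(x) = fprime(x, f(x)) with f(a) = fa,
    using n equal steps of Euler's method."""
    h = (b - a) / n            # step size (negative when b < a)
    x, y = a, fa
    for _ in range(n):
        y += h * fprime(x, y)  # tangent-line approximation for this step
        x += h
    return y

# f'(x) = f(x), f(0) = 1: the exact value f(1) is e
approx = euler(lambda x, y: y, 0.0, 1.0, 1.0, 100000)
assert abs(approx - math.e) < 1e-3
```

Smaller step sizes (larger $n$) give better approximations, at the cost of more computation.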

Constant expressions

Certain expressions involving solutions to differential equations can be proven constant by noting that their derivatives are always $0$. These constant expressions can then be used to prove properties of the solutions.

For example, when $f(x) = -f''(x)$, \begin{align*} \left( f(x)^2 + f'(x)^2 \right)' &= \left( f(x)^2 \right) ' + \left( f'(x)^2 \right) ' \\ &= 2f(x)f'(x) + 2f'(x)f''(x) \\ &= 2f'(x) \left( f(x) + f''(x) \right) \\ &= 0. \end{align*} Since $f(x) = \sin x$ satisfies $f(x) = -f''(x)$, applying this with $\sin' x = \cos x$ allows for reconstructing the familiar identity \[\sin^2 x + \cos^2 x = \sin^2 0 + \cos^2 0 = 0^2 + 1^2 = 1\] for all real $x$.

When $f(x) = f'(x)$, for any real constant $S$, \begin{align*} \left( f(x)f(S-x) \right)' &= f'(x)f(S-x) + f(x)(f(S-x))' \\ &= f'(x)f(S-x) + f(x)(S-x)'f'(S-x) \\ &= f(x)f(S-x) + f(x)(-1)f(S-x) \\ &= 0. \end{align*} Letting $S = a + b$ and evaluating at both $x = 0$ and $x = a$ gives \[f(0)f(a+b) = f(a)f(b),\] which using $e^0 = 1$ becomes the familiar identity \[e^{a+b} = e^ae^b\] for all real $a$ and $b$.
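Both reconstructed identities can be spot-checked numerically. The sketch below (plain Python, with arbitrarily chosen test points) verifies that $f^2 + f'^2$ is constant for $f = \sin$ and that $e^{a+b} = e^a e^b$.

```python
import math

# f = sin satisfies f = -f'', so f^2 + f'^2 should be the same at every point.
for x in [0.0, 1.0, 2.5, -3.7]:
    assert abs(math.sin(x) ** 2 + math.cos(x) ** 2 - 1.0) < 1e-12

# f = exp satisfies f = f', giving f(0)f(a+b) = f(a)f(b), i.e. e^(a+b) = e^a e^b.
for a, b in [(0.3, 1.1), (-2.0, 0.5)]:
    assert abs(math.exp(a + b) - math.exp(a) * math.exp(b)) < 1e-9
```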
