Ordinary differential equations: Solving ODEs by an integrating factor
A non-homogeneous first-order linear differential equation
We consider the non-homogeneous first-order linear differential equation \[y'(t)=a(t)\cdot y(t)+b(t)\] where \(a(t)\neq 0\) and \(b(t)\neq 0\). This means that the ODE
- has order 1, because only the first derivative \(y'(t)\) is present;
- is linear, because \(y\) and \(y'\) appear only to the first power and are not multiplied by each other;
- is non-homogeneous, because \(b(t)\neq 0\).
Let \(A(t)\) be an antiderivative of \(a(t)\); such an antiderivative exists whenever \(a(t)\) is continuous on an interval.
Then, \(e^{-A(t)}\) is an integrating factor.
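To see why this factor works, multiply both sides of the ODE by \(e^{-A(t)}\) and recognize the left-hand side as the derivative of a product (using \(A'(t)=a(t)\)): \[e^{-A(t)}\,y'(t)-e^{-A(t)}\,a(t)\,y(t)=e^{-A(t)}\,b(t)\quad\Longleftrightarrow\quad\left(e^{-A(t)}\,y(t)\right)'=e^{-A(t)}\,b(t).\] Integrating both sides gives \(e^{-A(t)}\,y(t)=F(t)+c\), where \(F(t)\) is an antiderivative of \(e^{-A(t)}\,b(t)\), and solving for \(y(t)\) yields the general solution stated below.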
Let \(a(t)\) and \(b(t)\) be nonzero continuous functions and let \(A(t)\) be an antiderivative of \(a(t)\). Then the general solution of the ODE \[y'(t)=a(t)\cdot y(t)+b(t)\] is \[y(t)=e^{A(t)}\cdot (F(t)+c)\] where \(F(t)\) is an antiderivative of \(e^{-A(t)}\cdot b(t)\) and \(c\) is an integration constant.
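The formula can be checked numerically on a concrete example. The sketch below (the coefficients \(a(t)=2t\), \(b(t)=t\) are our own illustrative choice, not from the text) works out \(A(t)=t^2\), \(F(t)=-\tfrac{1}{2}e^{-t^2}\), hence \(y(t)=c\,e^{t^2}-\tfrac{1}{2}\), and confirms that this \(y\) satisfies the ODE at a few sample points:

```python
import math

# Illustrative example: a(t) = 2t, b(t) = t.
# Then A(t) = t^2, e^{-A(t)} b(t) = t e^{-t^2}, so F(t) = -e^{-t^2}/2,
# and the general solution formula gives
#   y(t) = e^{t^2} (F(t) + c) = c e^{t^2} - 1/2.

def y(t, c=1.0):
    return c * math.exp(t**2) - 0.5

def rhs(t, c=1.0):
    # Right-hand side of the ODE: a(t)*y(t) + b(t)
    return 2 * t * y(t, c) + t

# Verify y'(t) = a(t) y(t) + b(t) via a central finite difference
h = 1e-6
for t in [0.0, 0.5, 1.0]:
    dy = (y(t + h) - y(t - h)) / (2 * h)
    assert abs(dy - rhs(t)) < 1e-4

print("ODE satisfied at the sample points")
```

Any value of the constant \(c\) passes the same check, which reflects that the formula describes a one-parameter family of solutions.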