We present a program that solves, by a numerical time-integration scheme, a system of
first-order linear differential equations of the form
A1y' + A0y = f
where the matrices A0 and A1 are
          |  0  -1 |         |  1   0 |         |       0       |
     A0 = |        |,   A1 = |        |,   f  = |               |
          |  1  -1 |         |  0   1 |         | 3(t-1)exp(-t) |
With the initial conditions y1(0) = 0 and y2(0) = 1, the exact solution is
     y1(t) = t*exp(-t),   y2(t) = (1-t)*exp(-t)
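As a quick check (not part of the original text), substituting this solution into the system, with A1 equal to the identity, recovers both equations and in particular the exp(-t) factor in f:

```latex
% Row 1:  y_1' + 0\,y_1 - y_2 = 0
y_1' = (1-t)e^{-t} = y_2 .
% Row 2:  y_2' + y_1 - y_2 = f_2
y_2' = (t-2)e^{-t}
\;\Longrightarrow\;
y_2' + y_1 - y_2 = \bigl(t-2+t-(1-t)\bigr)e^{-t} = 3(t-1)e^{-t} .
```

At t = 0 this gives f_2(0) = -3, which is the value assigned to the initial right-hand side in the code below.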
   // Instantiate the solver: BDF2 scheme, 2 equations
   ODESolver ode(BDF2,theTimeStep,theFinalTime,2);
   DMatrix<double> A0(2,2), A1(2,2);
   ode.setMatrices(A0,A1);

   // Initial condition y(0) and right-hand side at t = 0
   Vect<double> y(2), f(2);
   y(1) = 0; y(2) = 1;
   f(1) = 0; f(2) = -3;
   ode.setInitial(y);
   ode.setRHS(f);
   ode.setInitialRHS(f);

   // Time stepping: redefine the matrices and the right-hand side
   // at the current time, then advance one step
   TimeLoop {
      A1 = 1;
      A0(1,1) = 0; A0(1,2) = -1;
      A0(2,1) = 1; A0(2,2) = -1;
      f(1) = 0; f(2) = 3*(theTime-1)*exp(-theTime);
      ode.runOneTimeStep();
   }