Solving $\dot{x}(t) + x(t) = \delta(t)$ using the Laplace transform with
$x(0) = 1$, we get:
$sX(s)-1 + X(s) = 1$
$X(s) = \frac{2}{s+1}$
so $x(t) = 2e^{-t}$.
However, evaluating at $t = 0$,
$x(0) = 2 \neq 1$
This disagrees with the initial condition. What went wrong here?
Best Answer
Too long for a comment
There are two important points in this problem.
If you check the solution that you found, you will see that it does not satisfy the equation: $$ \Big(\frac{d}{dt}+1\Big)2e^{-t}=0\neq \delta(t)$$
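A quick way to see what the $\delta$ forces (assuming $x$ stays bounded near $t=0$): integrate the equation over $(-\epsilon,\epsilon)$ and let $\epsilon\to 0^+$,
$$\int_{-\epsilon}^{\epsilon}\big(\dot x(t)+x(t)\big)\,dt=\int_{-\epsilon}^{\epsilon}\delta(t)\,dt=1\quad\Longrightarrow\quad x(0^+)-x(0^-)=1,$$
so any solution must jump by $1$ at $t=0$, which the smooth function $2e^{-t}$ cannot do.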
The general solution of an inhomogeneous equation (i.e. an equation with a non-zero right-hand side) is the general solution of the corresponding homogeneous equation plus a particular solution of the inhomogeneous equation.
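Spelled out for this equation: the homogeneous solutions are $Ce^{-t}$, and a particular solution can be built from the Heaviside step function $h(t)$ (so that $h'(t)=\delta(t)$ in the distributional sense). Taking $x_p(t)=h(t)\,e^{-t}$ gives
$$\Big(\frac{d}{dt}+1\Big)h(t)\,e^{-t}=\delta(t)\,e^{-t}-h(t)\,e^{-t}+h(t)\,e^{-t}=\delta(t),$$
since $\delta(t)\,e^{-t}=\delta(t)$. The general solution is therefore
$$x(t)=\big(C+h(t)\big)\,e^{-t}.$$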
If we impose the boundary condition, for example, $x(0^+)=2$ (so $C=1$), we get the answer $$x(t)=(h(t)+1)\,e^{-t}.$$
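This also squares with the Laplace-transform computation (assuming the usual $0^-$ convention, $\mathcal{L}\{\dot x\}(s)=sX(s)-x(0^-)$): the initial condition fed into the transform is $x(0^-)=1$, the impulse produces the unit jump found above, and
$$x(0^-)=1,\qquad x(0^+)=2,\qquad x(t)=2e^{-t}\ \text{for } t>0,$$
so the transform answer $2e^{-t}$ is correct for $t>0$; it just cannot be evaluated at $t=0$ to read off the initial condition.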