Solving the “Transport” PDE in the sense of distributions with Dirac Delta Source

Tags: distribution-theory, linear-pde, partial-differential-equations, transport-equation

Let $\delta_0$ be the standard Dirac delta distribution. I wish to solve the PDE $$u_t+cu_x=\delta_0$$ in the sense of distributions, with initial condition $u(x,0)=g(x)$ for some continuous $g$. That is, I wish to find $u(x,t)$ such that
$$-\iint_{\mathbb{R}^2} u(x,t)(\phi_t+c\phi_x)\,dA=\phi(0,0)$$
where $\phi$ is any so-called test function.

Can anyone point me in the right direction? I tried taking a Fourier transform, but that didn't seem to help much.

Edit:

To respond to a comment, taking the Fourier transform yields:
$$\mathcal{F}(u)_t+cik\mathcal{F}(u)=1$$
This is equivalent to the ODE $$f'(t)+cikf(t)=1$$
This ODE is solved by
$$\mathcal{F}(u)=f(t)=C e^{-ikct} - \frac{i}{kc}$$
I'm unsure of where to go from here, or if this is correct.
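Whether or not the right-hand side of the transformed equation is ultimately correct, the ODE as written can at least be checked symbolically; a quick sympy sketch (treating $C$, $k$, $c$ as free symbols with $kc\neq 0$):

```python
import sympy as sp

t, k, c, C = sp.symbols('t k c C')

# Candidate solution from above: f(t) = C*exp(-i*k*c*t) - i/(k*c)
f = C * sp.exp(-sp.I * k * c * t) - sp.I / (k * c)

# Residual of the ODE f'(t) + i*c*k*f(t) = 1; should simplify to 0
residual = sp.simplify(sp.diff(f, t) + sp.I * c * k * f - 1)
print(residual)  # 0
```

So the general solution of $f'+ickf=1$ is indeed of this form; the question is whether $1$ is the right forcing term in the first place.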

Best Answer

The r.h.s. of the partially Fourier-transformed equation in OP is incorrect. Indeed, spatial Fourier transformation of the 2D Dirac $\delta_0 =\delta(x)\delta(t)$ gives $\delta(t)$, not $1$. Moreover, the weak form in OP is incorrect too. Integrating by parts, we have \begin{aligned} 0 &= \iint_{\Bbb R\times\Bbb R_+} (u_t + cu_x-\delta_0)\phi\,\text d x\,\text d t \\ &= -\int_{\Bbb R} g\phi|_{t=0}\, \text d x - \iint_{\Bbb R\times\Bbb R_+} u(\phi_t + c\phi_x)\,\text d x\,\text d t - \phi(0,0) \end{aligned} for any test function $\phi$.
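To spell out the integration by parts term by term (assuming $\phi$ has compact support, so all boundary terms at infinity vanish, and $u|_{t=0}=g$):
$$
\iint_{\Bbb R\times\Bbb R_+} u_t\,\phi\,\text dx\,\text dt
= -\int_{\Bbb R} g\,\phi|_{t=0}\,\text dx
- \iint_{\Bbb R\times\Bbb R_+} u\,\phi_t\,\text dx\,\text dt\,,
\qquad
\iint_{\Bbb R\times\Bbb R_+} cu_x\,\phi\,\text dx\,\text dt
= -\iint_{\Bbb R\times\Bbb R_+} cu\,\phi_x\,\text dx\,\text dt\,,
$$
while the source term contributes $\iint_{\Bbb R\times\Bbb R_+}\delta_0\,\phi\,\text dx\,\text dt = \phi(0,0)$, with the convention that the point mass sitting on the boundary $t=0$ is counted in full.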

The present problem amounts to the computation of the Green's function for the non-homogeneous advection equation $u_t+cu_x=f$. Fourier transformation in space and time of the PDE yields $$ \text i(\omega-kc)\, \mathcal{F}_t\mathcal{F}_x u = 1 $$ where $\mathcal{F}_t = \int\text dt\, e^{-\text i\omega t}$ and $\mathcal{F}_x = \int\text dx\, e^{\text ik x}$. Thus, the solution is represented as \begin{aligned} u(x,t) &= \frac{1}{(2\pi)^2}\iint \frac{e^{\text i(\omega t-kx)}}{\text i (\omega-kc)}\text dk\,\text d\omega \\ &= \frac{\text{sgn}(t)}2 \left( \frac{1}{2\pi} \int e^{-\text i k(x-ct)}\text dk \right) \\ &= \tfrac12 \text{sgn}(t)\, \delta(x-ct) \end{aligned} where the residue theorem was used (singularity at $\omega=kc$ -- see this post). Using the superposition principle, the solution to the initial problem may be expressed as $$ u(x,t) = g(x-ct)+\tfrac12 \text{sgn}(t) \, \delta(x-ct) \, . $$ As pointed out in the comments, an alternative consists in using Duhamel's principle, cf. this article.
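As a sanity check on the final formula, the smooth part $g(x-ct)$ can be verified symbolically to solve the homogeneous transport equation $u_t+cu_x=0$; a minimal sympy sketch, with `g` an arbitrary smooth profile:

```python
import sympy as sp

x, t, c = sp.symbols('x t c')
g = sp.Function('g')  # arbitrary smooth initial profile

u = g(x - c * t)  # smooth part of the solution above
residual = sp.simplify(sp.diff(u, t) + c * sp.diff(u, x))
print(residual)  # 0: g(x - ct) solves the homogeneous equation
```

The distributional part $\tfrac12\,\text{sgn}(t)\,\delta(x-ct)$ cannot be checked this way, since it only satisfies the PDE in the weak sense against test functions.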
