[Math] First-order PDE and the Lagrange method


Consider the first-order partial differential equation (which is in fact the inviscid Burgers' equation)

$$u_{t}+u(x,t)\,u_{x}=0$$

with the initial condition

$$u(x,0)=\frac{1}{1+x^2}.$$ Then,

$(a)$ Show that there exists $t>0$ such that the problem has a unique solution in the strip

$$\mathbb{R} \times (0,t),$$

$(b)$ Show that a solution may fail to exist on the strip

$\mathbb{R} \times (0,\infty)$,

$(c)$ Find a $t'$ such that, for every $\epsilon \gt 0$, a solution exists in $\mathbb{R}\times (0,t')$ but not in
$\mathbb{R} \times (0,t'+\epsilon).$

Best Answer

By Lagrange's method (the method of characteristics), the auxiliary equations are

$$\frac{dt}{1}=\frac {dx}{u}=\frac{du}{0}$$

From the first and third ratios, $du=0$, so $u$ is constant along each characteristic: $$u(x,t)=c.$$ From the first and second ratios, $$\frac{dx}{dt}=u=c,$$ and integrating gives $$x=ct+d.$$ Hence $$x-ut=d.$$

So the general solution is

$$u(x,t)=f(x-ut)$$
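The characteristic picture above says each characteristic is a straight line $x(t)=x_0+u_0(x_0)t$ carrying the constant value $u_0(x_0)$. A minimal numerical sketch of this (the function names `u0` and `crossing_time` are mine, not part of the original answer) shows that two such lines can collide in finite time, which is exactly why parts $(b)$ and $(c)$ arise:

```python
import math

def u0(x):
    """Initial datum u(x, 0) = 1 / (1 + x^2)."""
    return 1.0 / (1.0 + x * x)

def crossing_time(x0, x1):
    """Time at which the characteristics from x0 < x1 meet,
    or None if they never intersect (requires u0(x0) > u0(x1))."""
    du = u0(x0) - u0(x1)
    return (x1 - x0) / du if du > 0 else None

# Two concrete characteristics that collide: the one from x0 = 0 travels at
# speed u0(0) = 1 and overtakes the one from x1 = 1 (speed 1/2) at t = 2,
# so no classical solution can exist on all of R x (0, infinity).
print(crossing_time(0.0, 1.0))  # -> 2.0

# Scanning neighbouring pairs approximates the *first* crossing time,
# which occurs for characteristics starting near x = 1/sqrt(3).
times = [crossing_time(a / 1000, a / 1000 + 0.001) for a in range(2000)]
t_prime = min(t for t in times if t is not None)
print(abs(t_prime - 8 * math.sqrt(3) / 9) < 1e-3)  # -> True
```

The scan confirms numerically that the earliest crossing happens near $t = 8\sqrt{3}/9 \approx 1.54$.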

Applying the initial condition,

$$u(x,0)=\frac{1}{1+x^{2}}$$

so $$u(x,t)= \frac{1}{1+(x-ut)^{2}},$$ which defines $u$ implicitly.
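To answer $(a)$ and $(c)$ one can use the standard breakdown-time criterion for $u_t+uu_x=0$ (a sketch, assuming the usual gradient blow-up argument): the implicit formula defines a unique classical solution exactly until characteristics first cross, at

$$t'=-\frac{1}{\min_{x\in\mathbb{R}} u_0'(x)},\qquad u_0(x)=\frac{1}{1+x^{2}}.$$

Here $$u_0'(x)=\frac{-2x}{(1+x^{2})^{2}},$$ which is minimized where $$\frac{d}{dx}\,u_0'(x)=\frac{6x^{2}-2}{(1+x^{2})^{3}}=0,$$ i.e. at $x=1/\sqrt{3}$, where $u_0'(1/\sqrt{3})=-\frac{3\sqrt{3}}{8}$. Hence

$$t'=\frac{8}{3\sqrt{3}}=\frac{8\sqrt{3}}{9}\approx 1.54,$$

so a unique solution exists in $\mathbb{R}\times(0,t')$ (giving $(a)$ for any $t<t'$), but in no wider strip $\mathbb{R}\times(0,t'+\epsilon)$.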
