Uniqueness for the problem $-\Delta u + u^3 = 0$

Tags: analysis, laplacian, partial differential equations

Let $\Omega \subset \mathbb{R}^n$ be open, bounded, and connected, with regular boundary. Assuming that $a(x) \geq 0$ for every $x \in \partial \Omega$, show that there is at most one regular solution to the problem
$$\left\{\begin{array}{rcl}
-\Delta u + u^3 & = & 0, \ \ \mbox{in} \ \Omega\\
\dfrac{du}{d\nu}(x) + a(x)u(x) & = & h(x), \ \ \mbox{on} \ \partial \Omega.
\end{array}\right.$$

Solution Proposal: Suppose $u$ and $v$ are two solutions of the above problem, and consider $$w = u-v.$$
Since $u$ and $v$ are continuous on the compact set $\overline{\Omega}$, the function $w$ attains its maximum, so there is $x_0 \in \overline{\Omega}$ such that
$$w(x_0) = \max_{\overline{\Omega}} w.$$
Now, let's look at two cases:

Case 1: Suppose that $x_0 \in \Omega$. Since $x_0$ is an interior maximum point, we have $\Delta w(x_0) \leq 0$. Hence,
$$u(x_0)^3 - v(x_0)^3 \leq 0 \ \ \Rightarrow \ \ [u(x_0) - v(x_0)][u^2(x_0) + u(x_0)v(x_0) + v^2(x_0)] \leq 0.$$
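To spell out the step from $\Delta w(x_0) \leq 0$ to the displayed inequality: subtracting the equations satisfied by $u$ and $v$ gives, throughout $\Omega$,
$$\Delta w = \Delta u - \Delta v = u^3 - v^3,$$
so $\Delta w(x_0) \leq 0$ reads $u(x_0)^3 - v(x_0)^3 \leq 0$, and the factorization $a^3 - b^3 = (a-b)(a^2 + ab + b^2)$ gives the second form.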

Assertion: $u^2(x_0) + u(x_0)v(x_0) + v^2(x_0) \geq 0$;

Assume otherwise. Then
$$0 \leq [u(x_0) + v(x_0)]^2 = u^2(x_0) + 2u(x_0)v(x_0) + v^2(x_0) < u(x_0)v(x_0),$$
so $u(x_0)v(x_0) > 0$. On the other hand, $u^2(x_0) + u(x_0)v(x_0) + v^2(x_0) < 0$ together with $u^2(x_0), v^2(x_0) \geq 0$ forces $u(x_0)v(x_0) < 0$, which is absurd. This proves the assertion.
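For the record, the \textit{Assertion} also follows directly by completing the square: for any real numbers $a$ and $b$,
$$a^2 + ab + b^2 = \left(a + \tfrac{b}{2}\right)^2 + \tfrac{3}{4}b^2 \geq 0,$$
with equality only when $a = b = 0$; applying this with $a = u(x_0)$ and $b = v(x_0)$ gives the claim at once.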

From the \textit{Assertion}, we have $u(x_0) \leq v(x_0)$ (if the second factor vanishes, then $u(x_0) = v(x_0) = 0$), so $w(x_0) \leq 0$ and, since $x_0$ is a maximum point of $w$, it follows that $w(x) \leq 0$ in $\overline{\Omega}$. Now note that
\begin{eqnarray}\label{pv1}
\nonumber
\int_{\partial \Omega} \dfrac{\partial w}{\partial \nu} dS_x + \int_{\partial \Omega} a(x)w(x)dS_x = 0 & \Rightarrow & \int_{\Omega} \Delta w(x) dx = -\int_{\partial \Omega} a(x)w(x)dS_x\\
& \Rightarrow & \int_{\Omega} \Delta w(x) dx \geq 0.
\end{eqnarray}
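The first identity above comes from the boundary conditions: both $u$ and $v$ satisfy the same Robin condition, so their difference satisfies
$$\dfrac{\partial w}{\partial \nu}(x) + a(x)w(x) = 0 \ \ \mbox{on} \ \partial \Omega.$$
Integrating this over $\partial \Omega$ and using the divergence theorem, $\int_{\partial \Omega} \frac{\partial w}{\partial \nu}\, dS_x = \int_{\Omega} \Delta w(x)\, dx$, gives the displayed chain; the final inequality uses $a \geq 0$ and $w \leq 0$ on $\partial \Omega$.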

Finally, since
\begin{equation}\label{pv2}
\Delta w = [u(x)]^3 - [v(x)]^3 = \underbrace{[u(x) - v(x)]}_{\leq 0}\underbrace{[u^2(x) + u(x)v(x) + v^2(x)]}_{\geq 0} \leq 0
\end{equation}

holds in $\overline{\Omega}$, we conclude that
\begin{equation}\label{pv3}
\int_{\Omega} \Delta w(x) dx \leq 0.
\end{equation}

Combining this with the earlier inequality $\int_{\Omega} \Delta w(x)\, dx \geq 0$, we get $\int_{\Omega} \Delta w(x)\, dx = 0$; since $\Delta w \leq 0$ in $\Omega$ and $\Delta w$ is continuous, we conclude that $\Delta w = 0$ in $\Omega$.

Case 2: Suppose that $x_0 \in \partial \Omega$. [I could not make this case work the way I did Case 1!]

Finally, since in both Case 1 and Case 2 we have $\Delta w = 0$ in $\Omega$, it follows that
$$u^3 - v^3 = (u-v)(u^2 + uv + v^2) = 0 \ \ \mbox{in} \ \ \Omega.$$
So we have

$\bullet$ If $u - v = 0$ in $\Omega$, then $u = v$ in $\Omega$;

$\bullet$ If $u^2 + uv + v^2 = 0$ in $\Omega$, then
$$0 \leq (u + v)^2 = u^2 + 2uv + v^2 = uv,$$
while also $uv = -(u^2 + v^2) \leq 0$. Hence $uv = 0$ and $u^2 + v^2 = 0$, and we conclude that $u = v = 0$ in $\Omega$ (see also the pointwise remark after these bullets).
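Strictly speaking, the vanishing of the product should be read pointwise; a short check: for every $x \in \Omega$, completing the square gives
$$[u(x) - v(x)]\left[\left(u(x) + \tfrac{1}{2}v(x)\right)^2 + \tfrac{3}{4}v(x)^2\right] = 0,$$
and the second factor vanishes only when $u(x) = v(x) = 0$; so in either case $u(x) = v(x)$, i.e. $u = v$ in $\Omega$.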

Doubts: From this perspective, how do I handle Case 2? Also, how do I show that
$$u = v \ \ \mbox{on} \ \ \partial \Omega?$$

Best Answer

If $a(x) > 0$ instead of $a(x) \ge 0$, case 2 can be handled like this:

Since $x_0$ is a maximum of $w$ over $\overline{\Omega}$ lying on the boundary, we have $$\left.\frac{dw}{d\nu}\right|_{x=x_0} \ge 0 \implies -a(x_0)w(x_0) \ge 0 \implies w(x_0) \le 0,$$ where the first implication uses the boundary condition $\frac{dw}{d\nu} + a w = 0$ satisfied by $w = u - v$. From this, it follows that $w(x) \le 0$ over $\overline{\Omega}$. The rest follows by essentially the same argument as in your analysis of case 1.

I have no idea what happens if $a(x_0)$ is allowed to vanish. However, there is an alternate way to prove the uniqueness. In fact, we can generalize it a little bit.

Let $F : \Omega \times \mathbb{R} \to \mathbb{R}$ be any regular enough function such that for any fixed $x$, $F(x,y)$ is strictly increasing in $y$. For the problem at hand, take $F(x,y)$ to be $y^3$.
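As a quick check that $F(x,y) = y^3$ qualifies: for real numbers $y_1 > y_2$,
$$y_1^3 - y_2^3 = (y_1 - y_2)\left[\left(y_1 + \tfrac{y_2}{2}\right)^2 + \tfrac{3}{4}\,y_2^2\right] > 0,$$
since the bracket is nonnegative and vanishes only when $y_1 = y_2 = 0$, which is ruled out by $y_1 > y_2$. Hence $y \mapsto y^3$ is strictly increasing.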

Let $u, v$ be two solutions of the boundary value problem: $$\begin{array}{ll} -\Delta \phi(x) + F(x,\phi(x)) = 0,& x \in \Omega\\ \hat{n}(x)\cdot \nabla\phi(x) + a(x) \phi(x) = h(x),& x \in \partial\Omega \end{array} $$ where $\hat{n}(x)$ is the unit outward normal of $\partial\Omega$ at $x$.

Let $w = u-v$ be their difference. Since $F(x,y)$ is strictly increasing in $y$, for any $x \in \Omega$ we have

$$w(x)\big(F(x,u(x))-F(x,v(x))\big) \ge 0$$ Furthermore, equality is achieved at, and only at, those $x$ where $w(x) = 0$. From this, we find

$$\mathcal{I} \stackrel{def}{=} \int_\Omega \Big(|\nabla w(x)|^2 + w(x)\big(F(x,u(x))-F(x,v(x))\big)\Big)\, dx \ge 0$$

Since $|\nabla w|^2 = \nabla\cdot( w \nabla w) - w \Delta w$, the integrand above can be simplified to $\nabla\cdot(w \nabla w)$.
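This simplification uses that $w$ itself satisfies $\Delta w = F(x,u(x)) - F(x,v(x))$ in $\Omega$ (subtract the two PDEs), so
$$|\nabla w|^2 + w\big(F(x,u)-F(x,v)\big) = \big(\nabla\cdot(w \nabla w) - w\,\Delta w\big) + w\,\Delta w = \nabla\cdot(w \nabla w).$$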

Applying the divergence theorem, we obtain

$$\mathcal{I} = \int_\Omega \nabla\cdot (w \nabla w) dx = \int_{\partial\Omega} \hat{n}\cdot (w \nabla w) dS_x = -\int_{\partial\Omega} aw^2 dS_x \le 0$$
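The last equality uses the difference of the two Robin conditions, $\hat{n}\cdot\nabla w + a w = 0$ on $\partial\Omega$, so that
$$\hat{n}\cdot(w \nabla w) = w\,\big(\hat{n}\cdot\nabla w\big) = -a w^2 \quad \text{on } \partial\Omega,$$
and $a \ge 0$ then gives the final sign.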

This forces $\mathcal{I} = 0$ and hence for all $x \in \Omega$: $$\begin{align} & |\nabla w(x)|^2 + w(x)(F(x,u(x))-F(x,v(x))) = 0 \\ \implies & w(x)(F(x,u(x))-F(x,v(x))) = 0 \\ \implies & w(x) = 0 \end{align}$$
