[Math] Evans’ PDE book exercise Chapter 9 problem 7

partial differential equations

Let $\varepsilon > 0$. Define

$$\beta_\varepsilon(z) = \begin{cases}
0 & \text{if } z \ge 0,\\
\dfrac{z}{\varepsilon} & \text{if } z \le 0,
\end{cases}$$
and suppose ${u_\varepsilon } \in H_0^1(U)$ is the weak solution of
$$
\left\{
\begin{aligned}
-\Delta u_\varepsilon + \beta_\varepsilon(u_\varepsilon)
&= f\quad \text{in }\;U,
\\
{u_\varepsilon } &= 0\quad \text{on }\;\partial U ,
\end{aligned} \right.
$$
where $f\in L^2(U)$.

Prove that as $\varepsilon \to 0$, $u_{\varepsilon} \rightharpoonup u$ weakly in $H_0^1(U)$, where $u$ is the unique nonnegative solution of the variational inequality
$$\int_U Du \cdot D(w - u)\,dx \ge \int_U f(w - u)\,dx$$
for all $w \in H_0^1(U)$ with $w \ge 0$ a.e.

Best Answer

We have that $$\tag{1}\int\nabla u_\epsilon\cdot\nabla\varphi+\int\beta_\epsilon(u_\epsilon)\varphi=\int f\varphi,\ \forall\ \varphi\in H_0^1(U).$$

By taking $\varphi=u_\epsilon$ and noting that $\beta_\epsilon(u_\epsilon)u_\epsilon\geq 0$ pointwise (both factors have the same sign), we conclude $$\tag{2}\int|\nabla u_\epsilon|^2\leq\int fu_\epsilon.$$

which implies, by the Hölder and Poincaré inequalities, that $\|\nabla u_\epsilon\|_2$ is bounded. Therefore, passing to a subsequence, we may assume that $u_\epsilon \rightharpoonup u$ in $H_0^1$ and, by the Rellich–Kondrachov theorem, that $u_\epsilon \to u$ in $L^2$. We combine these convergences with $(2)$ to get $$\tag{3}\int |\nabla u|^2\leq \int fu.$$
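In more detail, the passage from $(2)$ to $(3)$ rests on two standard facts, stated here for completeness: weak lower semicontinuity of the norm along the weakly convergent sequence,
$$\int_U |\nabla u|^2\,dx \;\le\; \liminf_{\epsilon\to 0}\int_U |\nabla u_\epsilon|^2\,dx,$$
and strong $L^2$ convergence of $u_\epsilon$ together with $f\in L^2$, which gives
$$\int_U f u_\epsilon\,dx \;\longrightarrow\; \int_U f u\,dx.$$
Taking $\liminf$ on both sides of $(2)$ then yields $(3)$.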

On the other hand, define $\mathcal{K}=\{w\in H_0^1:\ w\geq 0\ \text{a.e.}\}$. Since $\beta_\epsilon\leq 0$ everywhere, $\int\beta_\epsilon(u_\epsilon)w\leq 0$ for every $w\in\mathcal{K}$; hence, from $(1)$, we conclude that $$\tag{4}\int\nabla u_\epsilon\cdot\nabla\varphi\geq \int f\varphi,\ \forall\ \varphi\in \mathcal{K}.$$

Again using the convergences above, we pass to the limit in $(4)$ to conclude $$\tag{5}\int\nabla u\cdot\nabla\varphi\geq\int f\varphi,\ \forall\ \varphi\in\mathcal{K}.$$

We conclude from $(3)$ and $(5)$ the desired inequality: applying $(5)$ with $\varphi=w$ and subtracting $(3)$ gives $\int\nabla u\cdot\nabla(w-u)\geq\int f(w-u)$ for all $w\in\mathcal{K}$. To conclude that $u\in\mathcal{K}$ and that $u$ is unique, you can argue like this:

I - $u$ is the minimizer of some functional $I:\mathcal{K}\to\mathbb{R}$; what is $I$?

II - In this particular setting, $u$ minimizing $I$ over $\mathcal{K}$ is equivalent to $u$ satisfying the variational inequality.

III - The solution of this optimization problem is unique.
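If you want a hint for step I: the standard choice is the Dirichlet energy functional
$$I(w)=\frac12\int_U|\nabla w|^2\,dx-\int_U fw\,dx,$$
which is strictly convex and coercive on the closed convex set $\mathcal{K}$, so it has a unique minimizer there; the first-order optimality condition for minimizing over $\mathcal{K}$ is exactly the variational inequality above.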

As @RayYang suggested, this book is a good one to understand it better; however, I would like to point out that the main argument here can be found in any good book on convex analysis.

It is worth noting that what we are proving here is that $-I'(u)\in \mathcal{N}_{\mathcal{K}}(u)$, where $\mathcal{N}_{\mathcal{K}}(u)$ is the normal cone to $\mathcal{K}$ at $u$; when $I$ is a convex differentiable function, II holds, i.e. $-I'(u)\in \mathcal{N}_{\mathcal{K}}(u)$ if and only if $u$ minimizes $I$ over $\mathcal{K}$.
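As a purely illustrative aside (not part of the proof), the penalization can be watched numerically. Below is a minimal 1D finite-difference sketch of $-u''+\beta_\epsilon(u)=f$ on $(0,1)$ with zero boundary values; the grid size, the choice of $f$, and the active-set iteration are my own assumptions, chosen only to show the negative part of $u_\epsilon$ vanishing as $\epsilon\to 0$.

```python
import numpy as np

# 1D model problem on (0, 1): -u'' + beta_eps(u) = f, u(0) = u(1) = 0,
# with beta_eps(z) = z/eps for z <= 0 and 0 for z >= 0.
n = 99
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
# f is negative on the left half so that the penalty term activates there
f = np.where(x < 0.5, -50.0, 50.0)

# Standard second-order finite-difference Laplacian with Dirichlet BCs
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def solve_penalized(eps, max_iter=200):
    """Active-set iteration for the piecewise-linear penalized problem."""
    u = np.zeros(n)
    for _ in range(max_iter):
        active = (u < 0).astype(float)  # nodes where beta_eps contributes u/eps
        u_new = np.linalg.solve(A + np.diag(active / eps), f)
        if np.allclose(u_new, u, rtol=0.0, atol=1e-12):
            return u_new
        u = u_new
    return u

u_coarse = solve_penalized(1e-2)  # mild penalty: visible negative part
u_fine = solve_penalized(1e-6)    # strong penalty: u_eps nearly nonnegative
print(u_coarse.min(), u_fine.min())
```

On this example the most negative value of $u_\epsilon$ scales roughly like $\epsilon\,\|f\|_\infty$, consistent with the limit $u\geq 0$ in the variational inequality.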