[Math] Questions about weak derivatives

Tags: functional-analysis, partial-differential-equations, real-analysis, weak-derivatives

There are two definitions of generalized differentiation that seem relevant in the context of PDEs. (That is, we generalize which objects can be differentiated, but we stay in Euclidean space. There are also generalizations that change the underlying space, such as Fréchet or Gâteaux derivatives on Banach spaces.)

One is the distributional derivative: we take any distribution (a continuous linear functional on $C^\infty_{com}$ with the topology of convergence of all derivatives in sup norm) and precompose it with differentiation, times a negative sign: $(\partial_j T)(\varphi) = -T(\partial_j \varphi)$. There is no question of existence here.
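This precomposition picture can be sketched numerically (my own illustration, not from the text; the names `bump`, `T`, `T_prime` are mine). A distribution is modeled as a Python callable acting on test functions, and its derivative is always defined by $(T')(\varphi) = -T(\varphi')$; for the Heaviside step this recovers the Dirac delta:

```python
import numpy as np
from scipy.integrate import quad

def bump(x):
    """Smooth test function supported in (-1, 1)."""
    return np.exp(-1.0 / (1.0 - x**2)) if abs(x) < 1 else 0.0

def bump_prime(x):
    """Analytic derivative of bump (chain rule)."""
    if abs(x) >= 1:
        return 0.0
    return bump(x) * (-2.0 * x / (1.0 - x**2) ** 2)

# The Heaviside step as a distribution: T(phi) = integral of phi over (0, 1),
# since every test function used here is supported inside (-1, 1).
def T(phi):
    return quad(phi, 0.0, 1.0)[0]

# Distributional derivative: precompose with d/dx and flip the sign,
# (T')(phi) = -T(phi').  This is always defined -- no existence question.
def T_prime(phi, phi_prime):
    return -T(phi_prime)

# T' acts like the Dirac delta: (T')(phi) = phi(0).
lhs = T_prime(bump, bump_prime)
rhs = bump(0.0)   # = exp(-1)
```

Since the derivative operation is mere precomposition, it applies to every distribution; existence only becomes an issue when we insist the result be represented by a function.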

The other is called the "weak derivative", and it applies only to locally integrable Borel measurable functions, again on some open subset $\Omega$ of Euclidean space. The weak derivative may or may not exist; when it does, it is by definition a locally integrable function $v$ satisfying the integration-by-parts relation $\int_\Omega u\,\partial_j\varphi\,dx = -\int_\Omega v\,\varphi\,dx$ against all smooth compactly supported test functions $\varphi$.
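As a sanity check on this definition (my own illustration, not from the text): for $u(x)=|x|$ the weak derivative is $v(x)=\operatorname{sign}(x)$, and the integration-by-parts relation can be verified numerically against a bump test function, shifted off-center so that neither side is trivially zero by symmetry:

```python
import numpy as np
from scipy.integrate import quad

def bump(x, c=0.3):
    """Smooth test function supported in (c-1, c+1)."""
    s = x - c
    return np.exp(-1.0 / (1.0 - s**2)) if abs(s) < 1 else 0.0

def bump_prime(x, c=0.3):
    """Analytic derivative of bump (chain rule)."""
    s = x - c
    if abs(s) >= 1:
        return 0.0
    return bump(x, c) * (-2.0 * s / (1.0 - s**2) ** 2)

u = abs          # u(x) = |x|, locally integrable
v = np.sign      # candidate weak derivative sign(x)

# Defining relation of the weak derivative:
#   int u * phi' dx = - int v * phi dx   for every test function phi.
# points=[0.0] tells quad about the kink/jump at the origin.
lhs = quad(lambda x: u(x) * bump_prime(x), -0.7, 1.3, points=[0.0])[0]
rhs = -quad(lambda x: v(x) * bump(x), -0.7, 1.3, points=[0.0])[0]
```

The two quadratures agree to within the integration tolerance, which is exactly what the definition demands for every test function.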

I gather that the first generalization extends even the second one, as long as one identifies a locally integrable function with the distribution it induces by integration. So the first extends the second, which extends ordinary differentiation. If that is correct, my questions are as follows:

  1. Which distributions have antiderivatives? Which locally integrable functions have weak antiderivatives? In the weak case, it seems like this has to do with absolute continuity?

  2. It is easy for me to formalize a statement that weak derivatives depend only on local information. But is there something like this for distributional derivatives? What would it mean to look locally at a distribution?

  3. For weak derivatives only, which may not exist, what are some examples of when an $\alpha$th derivative exists, but there exists a $\beta\le \alpha$ for which the $\beta$th weak derivative does not exist? What if instead $\beta\ge \alpha$, which may be nontrivial since I don't think antiderivatives come for free?

  4. Is there a heuristic, as there was for ordinary derivatives, that lets me tell whether the weak derivative exists, and perhaps even compute it quickly? Right now, all I can see is this: if the function is piecewise smooth and a weak derivative exists, then you know its behavior everywhere except at the joining points, and those don't matter since the weak derivative is only defined up to a.e. equivalence. So if you can find a good reason why this piecewise guess fails, then no other candidate could possibly work either.
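The failure mode of this heuristic can be made concrete (my own illustration): for the Heaviside step, piecewise differentiation suggests the candidate $g\equiv 0$, but integration by parts produces $-\varphi(0)$ on the left-hand side, so the candidate fails, and by the argument above no weak derivative exists:

```python
import numpy as np
from scipy.integrate import quad

def bump(x):
    """Smooth test function supported in (-1, 1)."""
    return np.exp(-1.0 / (1.0 - x**2)) if abs(x) < 1 else 0.0

def bump_prime(x):
    """Analytic derivative of bump (chain rule)."""
    if abs(x) >= 1:
        return 0.0
    return bump(x) * (-2.0 * x / (1.0 - x**2) ** 2)

H = lambda x: 1.0 if x >= 0 else 0.0   # Heaviside step, piecewise smooth
g = lambda x: 0.0                       # piecewise-differentiation guess: H' = 0 a.e.

# If g were the weak derivative we would need lhs == rhs, but:
lhs = quad(lambda x: H(x) * bump_prime(x), -1.0, 1.0, points=[0.0])[0]
rhs = -quad(lambda x: g(x) * bump(x), -1.0, 1.0)[0]
# lhs is approx -bump(0) = -exp(-1), while rhs is 0: the jump contributes
# -phi(0), which no locally integrable candidate can reproduce.
```

The mismatch is exactly $-\varphi(0)$ times the jump height, i.e. the distributional derivative here is the delta, which is not represented by any $L^1_{loc}$ function.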

(I am reading out of Knapp's book on Advanced Analysis)

Best Answer

So:

  1. Yes. On an interval, $u$ is AC iff $\exists w\in L^1_{loc}$ such that $u(x)=u(a)+\int_a^x w(t)\,dt$; in that case $w$ is the weak derivative of $u$.
  2. No. Distributions are by definition functionals, not functions. Some distributions can be identified with a function (namely those of the form $\varphi\mapsto\int u\varphi\,dx$ for some $u\in L^1_{loc}$), but in general it does not make sense to talk about the value of a distribution at a point in space. You can only talk about the pairing of the distribution with a test function.
  3. Not sure what you mean. If the classical derivative of a locally integrable function exists, then it must coincide with the weak derivative. So if $\beta\leq \alpha$, the existence of the $\alpha$th derivative implies the existence of the $\beta$th weak derivative. The other direction is easy: take an antiderivative of $|x|$, such as $F(x)=x|x|/2$, which has a (strong) derivative of order one, a weak derivative of order $2$, but no weak derivatives of order higher than $2$.
  4. I'm not 100% sure that this is true, but I believe that, in order to have a weak derivative (in one dimension), a function must be equal almost everywhere to an AC function.
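The chain in item 3 can be checked numerically (a sketch of my own, not from the answer): with $F(x)=x|x|/2$, the classical first derivative is $|x|$, the weak second derivative is $\operatorname{sign}(x)$, and the order-2 identity $\int F\varphi''\,dx = (-1)^2\int \operatorname{sign}(x)\,\varphi\,dx$ holds against a shifted bump test function. The order-3 identity would require a function representing $2\delta$, which fails for the same reason the Heaviside step in question 4 has no weak derivative.

```python
import numpy as np
import sympy as sp
from scipy.integrate import quad

x = sp.symbols('x')
c = sp.Rational(3, 10)                     # shift so neither side is 0 by symmetry
phi_expr = sp.exp(-1 / (1 - (x - c)**2))   # bump supported in (c-1, c+1)
phi = sp.lambdify(x, phi_expr, 'numpy')
phi2 = sp.lambdify(x, sp.diff(phi_expr, x, 2), 'numpy')  # exact phi''

F = lambda t: t * abs(t) / 2.0             # an antiderivative of |x|

# Order-2 weak-derivative identity: int F * phi'' = (-1)^2 int sign * phi.
# Integrate strictly inside the support to avoid dividing by zero at the edges.
a, b = -0.7 + 1e-9, 1.3 - 1e-9
lhs = quad(lambda t: F(t) * phi2(t), a, b, points=[0.0], limit=200)[0]
rhs = quad(lambda t: np.sign(t) * phi(t), a, b, points=[0.0], limit=200)[0]
```

Both quadratures agree, confirming that $F$ has weak derivatives up to order $2$ even though its classical second derivative does not exist at the origin.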