Derivative of weighted Dirac delta function

calculus, dirac-delta, functional-analysis, real-analysis

I have a function of the form

$$f(x,y) = \sum_{i=1}^{n} w(x,y) \delta(y-g_i(x))$$
where $x \in \mathbb{R}^{n}$, $y \in \mathbb{R}$, $g_i(x)$ is an increasing continuous function and $\delta(\cdot)$ is Dirac's delta function. The coefficient $w(x,y)$ is constant along sets $\{(x,y) : y = g_i(x)\}$, so that $w(x,g_i(x)) = w_i$, where $w_i$ is a constant. Otherwise, I make no assumptions on $w$. In particular, I do not assume $w$ is differentiable on $\mathbb{R}^{n} \times \mathbb{R}$.

I have an integral of the form $I(x) =\int_{0}^{\infty} f(x,y)\mu(y)\, dy = \int_{0}^{\infty} \sum_{i=1}^{n} w(x,y) \delta(y-g_i(x)) \mu(y)\, dy$. By the sifting property of the Dirac delta function (assuming each $g_i(x) > 0$, so every spike lies inside the domain of integration), I know that
$$I(x) = \sum_{i=1}^n w_i \mu(g_i(x)) = \sum_{i=1}^{n} w(x,g_i(x)) \mu(g_i(x)).$$
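The sifting step can be checked numerically by approximating each $\delta(y - g_i(x))$ with a narrow Gaussian. This is only a sketch: the choices of $\mu$, the spike locations $g_i(x)$, and the weights $w_i$ below are hypothetical, not taken from the question.

```python
import numpy as np

def delta_approx(y, a, s=1e-4):
    # delta(y - a) approximated by a narrow Gaussian of standard deviation s
    return np.exp(-((y - a) ** 2) / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

mu = lambda y: np.cos(y)        # hypothetical smooth test function mu
a = np.array([0.4, 1.1])        # hypothetical spike locations g_i(x) at a fixed x
w = np.array([0.5, 2.0])        # the constants w_i

y = np.linspace(0.0, 3.0, 400_001)   # fine grid truncating [0, infinity)
dy = y[1] - y[0]
f = sum(wi * delta_approx(y, ai) for wi, ai in zip(w, a))

I_num = np.sum(f * mu(y)) * dy       # Riemann sum of f(x, y) mu(y) dy
I_exact = np.sum(w * mu(a))          # sifting: sum_i w_i mu(g_i(x))
print(I_num, I_exact)                # the two values agree closely
```

As the Gaussian width $s \to 0$, the numerical integral converges to the sifted sum; the residual error here is of order $s^2$.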

I am interested in computing $\frac{\partial I}{\partial x_j}$, but I don't know how to apply the product rule to a weighted combination of Dirac delta functions. My intuition tells me that I can treat each $w_i$ as a constant, so that

$$\frac{\partial I}{\partial x_j} = \sum_{i=1}^{n} w_i\, \mu'(g_i(x))\, \frac{\partial g_i}{\partial x_j}$$

Is this intuition correct? Or does the fact that $w(x,y)$ is not differentiable mean that $\frac{\partial I}{\partial x_j}$ does not exist?

Best Answer

Yes. The constraint on $w(x,y)$ means you can validly compute the gradient directly from the equation you already derived,

$$ I(x) = \sum_{i=1}^n w_i \mu(g_i(x)) $$

Since this equation holds for every $x \in \mathbb{R}^n$, and the constants $w_i$ do not depend on $x$, it alone establishes both the existence and the formula for $\nabla I(x)$, provided $\mu$ and each $g_i$ are differentiable. The integral representation, and the values of $w$ away from the curves $y = g_i(x)$, might behave strangely, but they cannot break conclusions drawn logically from this equation alone.
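As a concrete sanity check, one can compare the chain-rule gradient $\frac{\partial I}{\partial x_j} = \sum_i w_i\, \mu'(g_i(x))\, \frac{\partial g_i}{\partial x_j}$ against a finite-difference approximation of $I$. The specific $\mu$, $g_i$, and $w_i$ below are hypothetical choices for illustration only.

```python
import numpy as np

# Hypothetical example with n = 2, x in R^2:
# mu(y) = exp(-y), g_1(x) = x_1 + x_2, g_2(x) = exp(x_1) + x_2
w = np.array([0.5, 2.0])            # the constants w_i
mu = lambda y: np.exp(-y)           # smooth test function mu
dmu = lambda y: -np.exp(-y)         # its derivative mu'

def g(x):
    return np.array([x[0] + x[1], np.exp(x[0]) + x[1]])

def dg_dx(x):
    # Jacobian: row i holds the gradient of g_i
    return np.array([[1.0, 1.0],
                     [np.exp(x[0]), 1.0]])

def I(x):
    # I(x) = sum_i w_i * mu(g_i(x)), the sifted integral
    return np.sum(w * mu(g(x)))

def grad_I(x):
    # chain rule: dI/dx_j = sum_i w_i * mu'(g_i(x)) * dg_i/dx_j
    return (w * dmu(g(x))) @ dg_dx(x)

x0 = np.array([0.3, -0.7])
eps = 1e-6
fd = np.array([(I(x0 + eps * np.eye(2)[j]) - I(x0 - eps * np.eye(2)[j])) / (2 * eps)
               for j in range(2)])
print(fd, grad_I(x0))               # central differences match the formula
```

The agreement of the two gradients reflects the point of the answer: once $I(x)$ is written as a finite sum with constant weights, differentiation is ordinary calculus.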
