[Math] Gradient of a maximum

calculus, multivariable-calculus, optimization, vector-analysis

How do you compute the gradient of a function that involves a maximum? For example, I have the function:
$$ f(\vec{t}) = v(1-\exp(-\lambda\cdot \max(t_0,t_1)))$$
With $v$ and $\lambda$ constant, for which values of $\vec{t}$ is this function at a maximum?

I know I can use the chain rule to compute the partial derivatives, but I'm not sure how to deal with a term like $\nabla_{\vec{t}} \max(\vec{t})$.

If analytical solutions are difficult, is there a numerical way to approximate this?
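For reference, a central-difference approximation of the gradient is easy to sketch; the constants `v = 2.0` and `lam = 0.5` below are placeholders, and the step size `h` is an assumption:

```python
import math

v, lam = 2.0, 0.5  # placeholder constants for the example

def f(t0, t1):
    """f(t) = v * (1 - exp(-lambda * max(t0, t1)))"""
    return v * (1.0 - math.exp(-lam * max(t0, t1)))

def num_grad(t0, t1, h=1e-6):
    """Central-difference approximation of the gradient of f."""
    df_dt0 = (f(t0 + h, t1) - f(t0 - h, t1)) / (2 * h)
    df_dt1 = (f(t0, t1 + h) - f(t0, t1 - h)) / (2 * h)
    return df_dt0, df_dt1

# Away from the kink t0 == t1, only the larger coordinate contributes:
g0, g1 = num_grad(3.0, 1.0)  # here t0 > t1, so g1 should be ~0
```

Note that central differences are unreliable exactly at $t_0 = t_1$, where the max is not differentiable.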

Best Answer

It's not hard; you just need to work with piecewise definitions. Derivatives are a local notion, so all that matters is that you are not at the knife-edge case $t_0 = t_1$ (the diagonal of your domain). Whenever $t_0 \neq t_1$, you can find a sufficiently small open neighborhood of your point on which the max picks out the same argument everywhere, so for the purpose of differentiation you can simply replace $\max(t_0, t_1)$ by whichever variable is larger.
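Concretely, for the example function this gives a piecewise gradient away from the diagonal:

$$ \nabla f(\vec{t}) = \begin{cases} \left(v\lambda e^{-\lambda t_0},\; 0\right) & \text{if } t_0 > t_1,\\[4pt] \left(0,\; v\lambda e^{-\lambda t_1}\right) & \text{if } t_1 > t_0. \end{cases} $$

Assuming $v, \lambda > 0$, these components are strictly positive, so the gradient never vanishes off the diagonal: $f$ has no interior maximum and only approaches its supremum $v$ as $\max(t_0, t_1) \to \infty$.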
