How to find the slope of a curve at the origin

derivatives

What is the slope of the curve $x^3 + y^3 = 3axy$ at the origin, and how do we find it? After carrying out implicit differentiation and plugging $x=0$ and $y=0$ into the derivative, we get $\frac{0}{0}$. How does one solve these kinds of problems?
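Concretely, implicit differentiation of $x^3 + y^3 = 3axy$ gives

$$3x^2 + 3y^2\frac{dy}{dx} = 3a\left(y + x\frac{dy}{dx}\right) \quad\Longrightarrow\quad \frac{dy}{dx} = \frac{ay - x^2}{y^2 - ax},$$

and at $(0,0)$ both the numerator and the denominator vanish, which is the indeterminate form $\frac{0}{0}$ mentioned above.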

Best Answer

As $(x,y)$ gets close to $(0,0)$, the behavior of the solution set is determined by the terms of smallest total degree.

For small $x$ and $y$, the values of $x^3$ and $y^3$ are much smaller than $3axy$, so near the origin the solution set is approximately the solution set of $3axy = 0$, which is just the two coordinate axes. So the curve crosses itself at the origin, passing through it once horizontally and once vertically (see the computation below).
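This is the standard rule for an algebraic curve through the origin: its tangent lines at the origin are found by setting the lowest-degree part of the equation equal to zero. Here (assuming $a \neq 0$) the lowest-degree part is $3axy$, so the tangents satisfy

$$3axy = 0 \iff x = 0 \ \text{ or } \ y = 0,$$

i.e. the origin is a double point whose two tangent lines have slope $0$ and slope undefined (vertical).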

(This is also why implicit differentiation can't work at the origin: the solution set simply doesn't look like a single straight line there under any magnification.)
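One way to check this explicitly is the standard parametrization of this curve (the folium of Descartes), obtained by substituting $y = tx$:

$$x^3 + t^3x^3 = 3atx^2 \;\Longrightarrow\; x = \frac{3at}{1+t^3}, \qquad y = \frac{3at^2}{1+t^3}.$$

At $t = 0$ the curve passes through the origin with slope $\left.\dfrac{dy/dt}{dx/dt}\right|_{t=0} = 0$ (horizontal), and as $t \to \infty$ it returns to the origin with slope tending to $\infty$ (vertical), matching the picture above.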
