[Math] Why are singular conics reducible

algebraic-geometry

I'm currently working through Rick Miranda's book Algebraic Curves and Riemann Surfaces, and I've been stuck on a problem in the first chapter.
I think Bézout's theorem would solve it, but I would like something more elementary, which I believe exists.

Let $X$ be an affine plane curve of degree 2, that is, a curve defined by a quadratic polynomial $f(z,w)$. Suppose that $X$ is singular. Show that $f(z,w)$ then factors as a product of linear factors.

UPDATE
So far I've done the following: set $f(x,y) = ax^2 + bxy + ey^2 + cx + dy + f_0$ (writing $f_0$ for the constant term to avoid a clash with the name $f$). Say that $p=(m,n)$ is a point on the curve, and that it is singular. Set $z = x - m$, $w = y - n$; then $g(z,w) = f(z+m, w+n)$ has a singular point at $(0,0)$. Taking the partial derivatives at $(0,0)$ and solving for the coefficients, we find that a singular conic should (after this translation) be of the form $az^2 + bzw + cw^2$, which is reducible into linear factors. However, I'm not completely sure this method is correct, so any tips would be helpful as to whether I'm on the right path or not.
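The step of killing the low-degree terms can be written out explicitly; here is a sketch (with $g$ denoting the translated polynomial, an auxiliary name not in the original problem):

```latex
% Translate the singular point p = (m,n) to the origin: set z = x - m,
% w = y - n, and let g(z,w) = f(z + m, w + n). Singularity of the curve
% at the origin means
%   g(0,0) = 0,   g_z(0,0) = 0,   g_w(0,0) = 0.
% Writing g = g_0 + g_1 + g_2 with g_n homogeneous of degree n, the first
% condition forces g_0 = 0 and the two vanishing partials force g_1 = 0,
% so only the quadratic part survives:
\[
  g(z,w) \;=\; g_2(z,w) \;=\; a z^2 + b z w + c w^2 .
\]
```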

Best Answer

Hint: Make a linear change of coordinates so that (one of) the singular point(s) is located at $(0,0)$. (Check that this is okay in the context of this particular problem.) Now consider the Taylor series expansion of $f(z,w)$, i.e. write $f = f_0 + f_1 + f_2 + \cdots$, where $f_n$ is homogeneous of degree $n$ in $z$ and $w$. For which degrees $n$ is $f_n$ non-zero? What does this tell you?
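Once the hint reduces $f$ to its homogeneous quadratic part, the factorization over $\mathbb{C}$ is immediate; a sketch of that last step (the names $\alpha, \beta$ are mine, not from the problem):

```latex
% If a \neq 0, treat a z^2 + b z w + c w^2 as a quadratic in z (with w as
% a parameter) and factor over \mathbb{C} via the quadratic formula:
\[
  a z^2 + b z w + c w^2 \;=\; a\,(z - \alpha w)(z - \beta w),
  \qquad
  \alpha, \beta \;=\; \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.
\]
% Check: \alpha + \beta = -b/a and \alpha\beta = c/a, matching the
% coefficients of zw and w^2. If a = 0, then
%   b z w + c w^2 = w\,(b z + c w)
% is already a product of linear factors. Either way f is reducible.
```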
