The discriminant of a monic polynomial is the product $\prod_{i < j} (x_i - x_j)^2$ of the squares of the differences of its roots (taken in an algebraic closure, e.g. $\mathbb{C}$); cf. the Wikipedia article on the discriminant. Consequently, if the roots are all real and distinct, the discriminant must be positive.
(If the polynomial is not monic, the factor $a_0^{2n-2}$ is thrown in, where $a_0$ is the leading coefficient and $n$ the degree; this factor is positive for a real polynomial, being an even power of a nonzero real.)
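For example, in the monic quadratic case $x^2+bx+c$ with roots $x_1, x_2$, this reduces to the familiar
$$(x_1-x_2)^2 = (x_1+x_2)^2 - 4x_1x_2 = b^2 - 4c,$$
which is positive exactly when the two roots are real and distinct.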
Note: I added a proof that for odd $n$ the only real root is $-1$.
Generalizing Milo Brandt's answer (which I thought of before I saw his), this applies to a polynomial of any even degree. If the polynomial is of degree $2n$, then, using his argument, we need to find out how many real roots $p(x) = x^{2n}+x^{2n-1}+\cdots+x+1$ can have.
But $p(x) = \frac{x^{2n+1}-1}{x-1}$ has no real roots: since $2n+1$ is odd, the numerator and denominator are both negative for $x < 1$ and both positive for $x > 1$, so the quotient is positive there, and at $x = 1$, their common root, $p(1) = 2n+1 > 0$.
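For instance, with $n=1$ this says that
$$p(x) = x^2+x+1 = \frac{x^3-1}{x-1}$$
has no real roots, which agrees with its discriminant $1^2 - 4\cdot 1 = -3$ being negative.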
For odd $n$, the only real root is $-1$. The case $n=3$ shows what happens; I will then give the proof for general odd $n$.
$$x^3+x^2+x+1
=\frac{x^4-1}{x-1}
=\frac{(x^2+1)(x^2-1)}{x-1}
=\frac{(x^2+1)(x+1)(x-1)}{x-1}
=(x^2+1)(x+1)$$
for $x \ne 1$ (and hence for all $x$, since both sides are polynomials). Since $x^2+1 > 0$, the only real root is $x=-1$.
For general odd $n$, since $n+1$ is even, write $n+1 = 2^km$ where $k \ge 1$ and $m$ is odd. Then, just as for $n=3$ above,
$$\begin{aligned}
x^n+x^{n-1}+\cdots+x+1
&=\frac{x^{n+1}-1}{x-1}\\
&=\frac{x^{2^km}-1}{x-1}\\
&=\frac{(x^{2^{k-1}m}+1)(x^{2^{k-1}m}-1)}{x-1}\\
&=\frac{(x^{2^{k-1}m}+1)(x^{2^{k-2}m}+1)(x^{2^{k-2}m}-1)}{x-1}\\
&\;\;\vdots\\
&=\frac{(x^{2^{k-1}m}+1)(x^{2^{k-2}m}+1)\cdots(x^{2m}+1)(x^m+1)(x^m-1)}{x-1}\\
&=(x^{2^{k-1}m}+1)(x^{2^{k-2}m}+1)\cdots(x^{2m}+1)(x^m+1)\,\frac{x^m-1}{x-1}.
\end{aligned}$$
Since $m$ is odd, $\frac{x^m-1}{x-1}$ has no real roots, as proved above. All the factors $x^{2^jm}+1$ with $j \ge 1$ are at least $1$, since their exponents are even.
Finally, since $m$ is odd, the only real root of $x^m+1$ is $x=-1$. Therefore the whole polynomial has $-1$ as its only real root.
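To see the general factorization in action, take $n=5$, so $n+1 = 6 = 2^1\cdot 3$, i.e. $k=1$ and $m=3$:
$$x^5+x^4+x^3+x^2+x+1 = (x^3+1)\,\frac{x^3-1}{x-1} = (x^3+1)(x^2+x+1),$$
and the only real root comes from the factor $x^3+1$, namely $x=-1$.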
Best Answer
First, solve the equation for its two roots, hence
$x_1 = \frac{-b -\sqrt{b^2-4ac}}{2a}, \quad x_2 = \frac{-b +\sqrt{b^2-4ac}}{2a}$. (Assume these roots exist in the real numbers; otherwise there are fewer than two anyway.)
Now you can write your equation as $ax^2 +bx+c = a(x-x_1)(x-x_2)$.
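This factorization can be checked directly from Vieta's formulas, $x_1+x_2 = -\frac{b}{a}$ and $x_1x_2 = \frac{c}{a}$:
$$a(x-x_1)(x-x_2) = ax^2 - a(x_1+x_2)x + ax_1x_2 = ax^2+bx+c.$$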
Assume there is an $x_3 \neq x_1, x_2$ with $ax_3^2+bx_3+c=0$.
A real product is zero if and only if (at least) one of its factors is zero. Therefore, since $a \neq 0$,
$$a(x-x_1)(x-x_2) = 0 \Leftrightarrow (x=x_1 \vee x=x_2).$$
It follows that $ax_3^2+bx_3+c \neq 0$, which contradicts the assumption.
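As a concrete instance, take $a=1$, $b=-5$, $c=6$: then $x_1 = 2$, $x_2 = 3$, and $x^2-5x+6 = (x-2)(x-3)$; for any $x_3$ other than $2$ or $3$, both factors are nonzero, so $x_3$ cannot be a root.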