[Math] When should we *not* use Gaussian elimination to find eigenvectors

eigenvalues-eigenvectors, gaussian-elimination, linear-algebra

This is a matrix, $A$, that I came across while studying eigenvectors from here.

$$A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$$

where $AX=0$

I tried solving it using Gaussian elimination, as suggested here, but I kept getting $y = 0, z = 0$. So I then solved it by just finding the ratio, as shown here, which gave me the correct answer of $y = 2, z = -1$.
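As a quick check (taking $X = (y, z)$, which is how the question appears to use the variables), substituting these values into $AX = 0$ gives

$$1\cdot 2 + 2\cdot(-1) = 0, \qquad 2\cdot 2 + 4\cdot(-1) = 0,$$

so $(y, z) = (2, -1)$ is indeed a non-trivial solution.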

I understood that Gaussian elimination failed because row transformations do not preserve eigenvalues/eigenvectors, but I'm confused as to when we can apply it and when we can't.

I'm a newbie to linear algebra, so it is possible my understanding is wrong.

Edit:
This (image) is where I actually got confused. When I tried to find the eigenvectors using Gaussian elimination, I kept getting $y = 0, z = 0$, whereas the ratio method gave me the correct values.

That's why I was wondering why Gaussian elimination didn't work here, and how to know when not to apply it.

Best Answer

If you know what the eigenvalues are, for instance by finding the roots of the characteristic polynomial, then you can apply Gaussian elimination to find solutions to the system $$(M-\lambda \operatorname{Id})X=0$$ where $\lambda$ is one of your eigenvalues and $X$ is an unknown eigenvector associated with it.
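To make this concrete with the matrix from the question (treating it as $M-\lambda \operatorname{Id}$ for the eigenvalue in question, which is how it is used there), Gaussian elimination gives

$$\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix} \xrightarrow{R_2 \leftarrow R_2 - 2R_1} \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix},$$

so the system reduces to the single equation $y + 2z = 0$ with $z$ free; choosing $z = -1$ gives $y = 2$, i.e. $X = (2, -1)$, matching the ratio method. The zero row signals a free parameter, not that $y = z = 0$ is the only solution.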

Given that your question is, more precisely, about when *not* to apply Gaussian elimination: as far as I can see, in the problem of finding eigenvectors there is no other situation than this one where Gaussian elimination would be useful.
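As an illustration only (my own sketch, not part of the original answer), here is a short SymPy script for the procedure described above, assuming the question's matrix is taken as $M$ itself, so that $\lambda = 0$ is the eigenvalue producing the eigenvector $(2, -1)$ up to scaling:

```python
# Sketch of the procedure in the answer: find the eigenvalues first, then
# apply Gaussian elimination (row reduction) to (M - lambda*Id) X = 0 and
# read off the eigenvectors from the resulting null space.
from sympy import Matrix, eye, symbols

M = Matrix([[1, 2], [2, 4]])    # the matrix from the question
lam = symbols("lambda")

# Step 1: eigenvalues from the characteristic polynomial det(M - lambda*Id) = 0.
char_poly = (M - lam * eye(2)).det()
print(f"characteristic polynomial: {char_poly}")   # lambda**2 - 5*lambda
eigenvalues = sorted(M.eigenvals())                 # [0, 5]

# Step 2: for each eigenvalue, row-reduce (M - lambda*Id) and take the null space.
for ev in eigenvalues:
    reduced, pivots = (M - ev * eye(2)).rref()      # Gaussian elimination (RREF)
    kernel = (M - ev * eye(2)).nullspace()          # non-trivial solutions
    print(f"lambda = {ev}: rref = {reduced.tolist()}, eigenvector = {kernel[0].T.tolist()}")
```

For $\lambda = 0$ this prints the eigenvector $(-2, 1)$, which is the same eigenvector as $(2, -1)$ up to a scalar multiple.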