If $\rm Y$ is symmetric, then it is diagonalizable, its eigenvalues are real, and its eigenvectors can be chosen to form an orthonormal basis. Hence, $\rm Y$ has an eigendecomposition $\rm Y = Q \Lambda Q^{\top}$, where the columns of the orthogonal matrix $\rm Q$ are the eigenvectors of $\rm Y$ and the diagonal entries of the diagonal matrix $\Lambda$ are the eigenvalues of $\rm Y$.
If $\rm Y$ is also positive semidefinite, then all its eigenvalues are nonnegative, which means that we can take their square roots. Hence,
$$\rm Y = Q \Lambda Q^{\top} = Q \Lambda^{\frac 12} \Lambda^{\frac 12} Q^{\top} = \left( Q \Lambda^{\frac 12} \right) \underbrace{\left( Q \Lambda^{\frac 12} \right)^{\top}}_{=: {\rm V}} = {\rm V}^{\top} {\rm V}$$
Note that the rows of $\rm V$ are the (transposed) eigenvectors of $\rm Y$, each multiplied by the square root of the corresponding (nonnegative) eigenvalue.
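As a quick numerical sanity check, the factorization above can be reproduced with NumPy (a minimal sketch; the example matrix $\rm Y$ is made up, constructed as a Gram matrix so it is symmetric positive semidefinite):

```python
import numpy as np

# Hypothetical example: build a symmetric PSD matrix Y as a Gram matrix
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Y = A.T @ A

# Eigendecomposition Y = Q diag(lam) Q^T (eigh is for symmetric matrices)
lam, Q = np.linalg.eigh(Y)
lam = np.clip(lam, 0.0, None)  # guard against tiny negative round-off

# V := (Q Lambda^{1/2})^T, i.e. scale the columns of Q by sqrt(eigenvalues),
# then transpose, so that Y = V^T V
V = (Q * np.sqrt(lam)).T

print(np.allclose(V.T @ V, Y))  # the factorization recovers Y
```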
Best Answer
Another method is to check that no negative pivots appear during row reduction (after taking into account the possibility of 0's on the diagonal). The procedure can be written recursively as follows:
1) If $A$ is $1 \times 1$, then it is positive semidefinite iff $A_{11} \ge 0$.
Otherwise:
2) If $A_{11} < 0$, then $A$ is not positive semidefinite.
3) If $A_{11} = 0$, then $A$ is positive semidefinite iff the first row of $A$ is all 0 and the submatrix obtained by deleting the first row and column is positive semidefinite.
4) If $A_{11} > 0$, for each $j > 1$ subtract $A_{j1}/A_{11}$ times row 1 from row $j$, and then delete the first row and column. Then $A$ is positive semidefinite iff the resulting matrix is positive semidefinite.