Note that the identity matrix is a diagonal matrix with non-negative diagonal elements, hence its singular value decomposition is simply
$$I_n = I_n \cdot I_n \cdot I_n^\ast,$$
so all its singular values are $1$, whence $\lVert I_n\rVert_2 = 1$.
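As a quick numerical check (a minimal NumPy sketch; the size $n = 4$ is an arbitrary choice, not from the discussion above):

```python
import numpy as np

n = 4  # arbitrary size for illustration
I = np.eye(n)

# Singular values of the identity are all 1 ...
print(np.linalg.svd(I, compute_uv=False))   # [1. 1. 1. 1.]

# ... so the spectral (two-)norm is 1.
print(np.linalg.norm(I, 2))                 # 1.0
```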
Am I supposed to use the singular value decomposition each time I want to calculate a two-norm?
Not necessarily. Note that an alternative way to define the two-norm is
$$\lVert A\rVert_2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2.$$
With the singular value decomposition
$$A = U\Sigma V^\ast,$$
we obtain
$$\begin{align}
\sup_{\lVert x\rVert_2 = 1} \lVert Ax \rVert_2 &= \sup_{\lVert x\rVert_2 = 1} \lVert U\Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert x\rVert_2 = 1} \lVert \Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2,
\end{align}$$
since $U$ and $V^\ast$ are norm-preserving, and it is not hard to see that $\sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2$ is the largest singular value.
This supremum is in general not easy to find directly, but from the same definition we also get
$$\lVert A\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \langle Ax, Ax\rangle = \sup_{\lVert x\rVert_2 = 1} \langle x, A^\ast Ax\rangle,$$
and $A^\ast A$ is a positive semidefinite Hermitian matrix, so $\lVert A\rVert_2^2$ is the largest eigenvalue of $A^\ast A$, which is sometimes easier to compute.
However, computing operator norms like $\lVert A\rVert_2$ is in general not a trivial task.
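For concreteness, here is a small sketch (NumPy; the random complex matrix and its size are arbitrary assumptions) comparing the characterizations above: the largest singular value, the square root of the largest eigenvalue of $A^\ast A$, and the built-in two-norm.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))

# Largest singular value of A.
sigma_max = np.linalg.svd(A, compute_uv=False).max()

# Square root of the largest eigenvalue of A* A (Hermitian, so eigvalsh applies).
lam_max = np.linalg.eigvalsh(A.conj().T @ A).max()

print(sigma_max)                 # largest singular value
print(np.sqrt(lam_max))          # same value
print(np.linalg.norm(A, 2))      # same value again
```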
Regarding the Frobenius norm,
$$\lVert I_n \rVert_F = \sqrt{\sum_{i=1}^n \sum_{j=1}^n \delta_{ij}^2} = \sqrt{n},$$
where $\delta_{ij}$ is the Kronecker delta: $\delta_{ij} = 1$ if $i = j$ and $\delta_{ij} = 0$ if $i \neq j$.
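Again as a quick check (NumPy sketch; $n = 4$ is arbitrary):

```python
import numpy as np

n = 4
I = np.eye(n)

print(np.linalg.norm(I, 'fro'))  # 2.0
print(np.sqrt(n))                # 2.0, matching sqrt(n)
```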
For any square $A$, $\rho(A)\leq\|A\|_2$, where $\rho(A)$ is the spectral radius of $A$; equality holds if $A$ is normal (normality is sufficient but not necessary for equality). Beyond this general inequality, $\rho(A)$ and $\|A\|_2$ can be completely unrelated. Consider, e.g.,
$$
A_\alpha := \begin{pmatrix}0 & \alpha\\ 0 & 0\end{pmatrix}
$$
with $\rho(A_\alpha)=0$ but $\|A_\alpha\|_2=|\alpha|$. All the eigenvalues are zero but the 2-norm can be an arbitrary non-negative number (depending on $\alpha$).
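A short sketch of this example (NumPy; the value $\alpha = 7.5$ is an arbitrary choice):

```python
import numpy as np

alpha = 7.5  # arbitrary choice
A = np.array([[0.0, alpha],
              [0.0, 0.0]])

# Spectral radius: largest |eigenvalue| -- both eigenvalues are 0.
print(np.abs(np.linalg.eigvals(A)).max())   # 0.0

# Two-norm: largest singular value, which equals |alpha|.
print(np.linalg.norm(A, 2))                 # 7.5
```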
Best Answer
Given a symmetric matrix, you have a complete set of eigenvalues and orthogonal eigenvectors. Any vector can be represented as a linear combination of the eigenvectors. Multiply your matrix by an arbitrary unit vector decomposed into the eigenvectors. Then note that the maximum length of the resultant vector is achieved when the input vector is along the eigenvector associated with the largest eigenvalue in absolute value.
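A small numerical illustration of this argument (a NumPy sketch; the random symmetric matrix is an assumption for demonstration): the two-norm coincides with the largest eigenvalue in absolute value, and $\lVert Ax\rVert_2$ attains that value at the corresponding unit eigenvector.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2                      # a random symmetric matrix

vals, vecs = np.linalg.eigh(S)         # real eigenvalues, orthonormal eigenvectors
k = np.argmax(np.abs(vals))            # eigenvalue largest in absolute value

print(np.abs(vals[k]))                 # largest |eigenvalue|
print(np.linalg.norm(S, 2))            # equals the two-norm
print(np.linalg.norm(S @ vecs[:, k]))  # ||S v|| for that unit eigenvector: same value
```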