Note that the identity matrix is a diagonal matrix with non-negative diagonal elements, hence its singular value decomposition is simply
$$I_n = I_n \cdot I_n \cdot I_n^\ast,$$
all its singular values are $1$, whence $\lVert I_n\rVert_2 = 1$.
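As a quick numerical sanity check (a sketch assuming NumPy is available), the singular values of the identity are all $1$, and so is its 2-norm:

```python
import numpy as np

I = np.eye(4)
# Singular values of the identity, sorted in descending order: all ones.
sigma = np.linalg.svd(I, compute_uv=False)
print(sigma)                 # [1. 1. 1. 1.]
print(np.linalg.norm(I, 2))  # 1.0
```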
Am I supposed to compute a singular value decomposition each time I want to calculate a two-norm?
Not necessarily. Note that an alternative way to define the two-norm is
$$\lVert A\rVert_2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2.$$
With the singular value decomposition
$$A = U\Sigma V^\ast,$$
we obtain
$$\begin{align}
\sup_{\lVert x\rVert_2 = 1} \lVert Ax \rVert_2 &= \sup_{\lVert x\rVert_2 = 1} \lVert U\Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert x\rVert_2 = 1} \lVert \Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2,
\end{align}$$
since $U$ and $V^\ast$ are norm-preserving, and it is not hard to see that $\sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2$ is the largest singular value.
This supremum is in general not easy to evaluate directly, but the same characterization gives
$$\lVert A\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \langle Ax, Ax\rangle = \sup_{\lVert x\rVert_2 = 1} \langle x, A^\ast Ax\rangle,$$
and $A^\ast A$ is a positive semidefinite hermitian matrix, thus $\lVert A\rVert_2^2$ is the largest eigenvalue of $A^\ast A$, which sometimes is easier to compute.
However, computing operator norms like $\lVert A\rVert_2$ is in general not a trivial task.
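To illustrate the equivalence just derived (a sketch assuming NumPy is available), the largest singular value of $A$ and the square root of the largest eigenvalue of $A^\ast A$ agree:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# ||A||_2 as the largest singular value of A.
via_svd = np.linalg.svd(A, compute_uv=False)[0]
# ||A||_2 as the square root of the largest eigenvalue of A^T A.
# A^T A is symmetric positive semidefinite, so eigvalsh applies
# and returns real eigenvalues in ascending order.
via_eig = np.sqrt(np.linalg.eigvalsh(A.T @ A)[-1])

print(via_svd, via_eig)  # the two values agree up to rounding
```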
Regarding the Frobenius norm,
$$\lVert I_n \rVert_F = \sqrt{\sum_{i=1}^n \sum_{j=1}^n \delta_{ij}^2} = \sqrt{n},$$
where $\delta_{ij}$ is the Kronecker symbol, $\delta_{ij} = 1$ if $i = j$, and $\delta_{ij} = 0$ if $i\neq j$.
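Numerically (again assuming NumPy), the Frobenius norm of $I_n$ is indeed $\sqrt{n}$:

```python
import numpy as np

n = 5
I = np.eye(n)
# Frobenius norm: square root of the sum of squared entries; for I_n this is sqrt(n).
fro = np.linalg.norm(I, 'fro')
print(fro)  # 2.23606797... = sqrt(5)
```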
For any square $A$, $\rho(A)\leq\|A\|_2$, where $\rho(A)$ is the spectral radius of $A$; equality holds if $A$ is normal (though normality is not necessary for equality). Beyond this general inequality, $\rho(A)$ and $\|A\|_2$ can be completely unrelated. Consider, e.g.,
$$
A_\alpha:=\pmatrix{0&\alpha\\0&0}
$$
with $\rho(A_\alpha)=0$ but $\|A_\alpha\|_2=|\alpha|$. All the eigenvalues are zero, but the 2-norm can be an arbitrary non-negative number (depending on $\alpha$).
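This gap is easy to see numerically (a sketch assuming NumPy), here with $\alpha = 3.7$:

```python
import numpy as np

alpha = 3.7
A = np.array([[0.0, alpha],
              [0.0, 0.0]])

# A is nilpotent: both eigenvalues are 0, so the spectral radius is 0.
eigs = np.linalg.eigvals(A)
print(eigs)                  # [0. 0.]
# Yet the 2-norm equals |alpha|.
print(np.linalg.norm(A, 2))  # 3.7
```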
Best Answer
As pointed out by Bungo in the comments, all eigenvalues of $A^TA$ are non-negative and real. So there is no need to take absolute values before you compare them.
The 2-norm of a linear transformation $A$ is (assuming finite dimensions) the maximal value of $\|Av\|$ among all unit vectors $v$ in the domain of $A$. We have
$$ \|Av\| = \sqrt{v^TA^TAv}. $$
The spectral theorem (applicable since $A^TA$ is a symmetric matrix) says that the domain of $A^TA$ (which is also the domain of $A$) has an orthonormal basis consisting of eigenvectors of $A^TA$. Thus if we decompose $v$ into this basis, we see that the largest possible value of $v^TA^TAv$ is exactly the largest eigenvalue of $A^TA$.
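The maximum is attained at a unit eigenvector for the largest eigenvalue of $A^TA$, which can be checked directly (a sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Orthonormal eigenbasis of the symmetric matrix A^T A;
# eigh returns eigenvalues in ascending order.
w, V = np.linalg.eigh(A.T @ A)
v = V[:, -1]  # unit eigenvector for the largest eigenvalue

# ||Av|| for this v attains the maximum over all unit vectors,
# and equals the 2-norm of A.
print(np.linalg.norm(A @ v))
print(np.linalg.norm(A, 2))
```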