Note that the identity matrix is a diagonal matrix with non-negative diagonal elements, hence its singular value decomposition is simply
$$I_n = I_n \cdot I_n \cdot I_n^\ast,$$
all its singular values are $1$, whence $\lVert I_n\rVert_2 = 1$.
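As a quick numerical sanity check (a NumPy sketch, assuming NumPy is available):

```python
import numpy as np

n = 4
I = np.eye(n)

# The SVD of the identity has all singular values equal to 1 ...
singular_values = np.linalg.svd(I, compute_uv=False)
print(singular_values)        # [1. 1. 1. 1.]

# ... so the spectral (two-)norm is 1.
print(np.linalg.norm(I, 2))   # 1.0
```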
Am I supposed to compute a singular value decomposition each time I want to calculate a two-norm?
Not necessarily. Note that an alternative way to define the two-norm is
$$\lVert A\rVert_2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2.$$
With the singular value decomposition
$$A = U\Sigma V^\ast,$$
we obtain
$$\begin{align}
\sup_{\lVert x\rVert_2 = 1} \lVert Ax \rVert_2 &= \sup_{\lVert x\rVert_2 = 1} \lVert U\Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert x\rVert_2 = 1} \lVert \Sigma V^\ast x\rVert_2\\
&= \sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2,
\end{align}$$
since $U$ and $V^\ast$ are unitary and hence norm-preserving, and it is not hard to see that $\sup_{\lVert y\rVert_2 = 1} \lVert \Sigma y\rVert_2$ is the largest singular value.
This supremum is in general not easy to evaluate directly, but it gives another characterization:
$$\lVert A\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \lVert Ax\rVert_2^2 = \sup_{\lVert x\rVert_2 = 1} \langle Ax, Ax\rangle = \sup_{\lVert x\rVert_2 = 1} \langle x, A^\ast Ax\rangle,$$
and $A^\ast A$ is a positive semidefinite Hermitian matrix, so $\lVert A\rVert_2^2$ is the largest eigenvalue of $A^\ast A$, which is sometimes easier to compute.
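To illustrate (a NumPy sketch with a random matrix, made up for illustration), the largest eigenvalue of $A^\ast A$ does match the squared largest singular value:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Largest singular value of A (np.linalg.svd returns them in descending order).
sigma_max = np.linalg.svd(A, compute_uv=False)[0]

# Largest eigenvalue of A^T A (np.linalg.eigvalsh returns them in ascending order).
lam_max = np.linalg.eigvalsh(A.T @ A)[-1]

print(np.isclose(sigma_max**2, lam_max))   # True
```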
However, computing operator norms like $\lVert A\rVert_2$ is in general not a trivial task.
Regarding the Frobenius norm,
$$\lVert I_n \rVert_F = \sqrt{\sum_{i=1}^n \sum_{j=1}^n \delta_{ij}^2} = \sqrt{n},$$
where $\delta_{ij}$ is the Kronecker symbol, $\delta_{ij} = 1$ if $i = j$, and $\delta_{ij} = 0$ if $i\neq j$.
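This is easy to confirm numerically (a NumPy sketch):

```python
import numpy as np

n = 7
# The Frobenius norm sums the squares of all entries; for I_n that gives n.
fro = np.linalg.norm(np.eye(n), 'fro')
print(np.isclose(fro, np.sqrt(n)))   # True
```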
Notice that if $|a_{km}| = \max_{i,j} |a_{ij}|$, then $\lVert Ae_m\rVert_\infty = |a_{km}|$, where $e_m$ is the column vector with $1$ in the $m$-th entry and $0$ elsewhere.
Moreover, $\lVert Ax\rVert_\infty \leq |a_{km}|$ whenever $x \geq 0$ and $\lVert x\rVert_1 = 1$, since each entry of $Ax$ is then a convex combination of the entries of one row of $A$.
Therefore
$$\max_{\lVert x\rVert_1 = 1,\ x\geq 0} \lVert Ax\rVert_\infty = \max_{i,j} |a_{ij}|.$$
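A small worked example (a NumPy sketch; the matrix is made up for illustration):

```python
import numpy as np

A = np.array([[1., -3., 2.],
              [0.,  4., -1.]])

# Locate the entry of largest absolute value: |a_km| = 4 at (k, m) = (1, 1).
k, m = np.unravel_index(np.argmax(np.abs(A)), A.shape)

e_m = np.zeros(A.shape[1])
e_m[m] = 1.0                              # unit vector e_m, with 1-norm 1

# ||A e_m||_inf picks out the largest entry of column m, which is |a_km|.
print(np.linalg.norm(A @ e_m, np.inf))    # 4.0
print(np.abs(A).max())                    # 4.0
```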
Suppose the maximum sum $\sum_{j=1}^n |a_{ij}|$ is attained in row $i$. For each $j$ let $x_j=1$ if $a_{ij}\geq 0$ and $x_j=-1$ if $a_{ij}<0$, and take $x=(x_1,...,x_n)$, so that $\|x\|_\infty = 1$. That way $a_{ij}x_j=|a_{ij}|$ for each $j$, and hence for each row $k$ we get:
\begin{align*} \left|\sum_{j=1}^n a_{kj}x_j\right| &\leq \sum_{j=1}^n \left|a_{kj}x_j\right|\\ &=\sum_{j=1}^n \left|a_{kj}\right|\\ &\leq \sum_{j=1}^n \left|a_{ij}\right|\\ &=\left|\sum_{j=1}^n a_{ij}x_j \right| \end{align*}
Hence $\|Ax\|_\infty=\sum_{j=1}^n |a_{ij}|$, and since $\|x\|_\infty = 1$, this shows that the maximum absolute row sum is attained.
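The sign-vector construction above can be checked numerically (a NumPy sketch with a random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))

# Row i with the largest absolute row sum.
i = np.argmax(np.abs(A).sum(axis=1))

# x_j = sign(a_ij), so that a_ij * x_j = |a_ij| and ||x||_inf = 1.
x = np.sign(A[i])
x[x == 0] = 1.0                    # zero entries can take either sign

# ||Ax||_inf attains the maximum absolute row sum, i.e. the induced inf-norm.
print(np.isclose(np.linalg.norm(A @ x, np.inf), np.abs(A).sum(axis=1).max()))   # True
print(np.isclose(np.linalg.norm(A, np.inf), np.abs(A).sum(axis=1).max()))       # True
```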