If $A$ is singular, then $0$ is an eigenvalue of $A$, so $1$ is an eigenvalue of $I-A$. Hence, if the matrix norm is the one induced by the Euclidean norm (i.e. the largest singular value), then $\| I-A\| \ge 1$, since the largest singular value of a matrix is at least as large as the absolute value of any of its eigenvalues. This proves the claim by contradiction.
Since $\| I-A\|<1$, the Neumann series for $A^{-1}=(I-(I-A))^{-1}$ converges, and $$\|A^{-1}\|=\Bigl\|\sum_{i=0}^{\infty}(I-A)^i\Bigr\|\le\sum_{i=0}^{\infty}\|I-A\|^i=\frac{1}{ 1 - \| I - A \|}.$$
A sufficient condition for the convergence of the geometric series $\frac{1}{ 1 - x}=\sum_{i=0}^{\infty}x^i$ is $|x|<1$; the Neumann series above is its matrix analogue, converging when $\|I-A\|<1$.
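As a quick numerical sanity check of the bound (using NumPy; the specific $2\times 2$ matrix below is an arbitrary example chosen so that $\|I-A\|_2<1$):

```python
import numpy as np

# Arbitrary example matrix with ||I - A|| < 1 in the induced 2-norm.
A = np.array([[1.0, 0.2],
              [0.1, 0.9]])
I = np.eye(2)

r = np.linalg.norm(I - A, 2)        # largest singular value of I - A
assert r < 1                        # hypothesis of the Neumann-series bound

inv_norm = np.linalg.norm(np.linalg.inv(A), 2)
bound = 1.0 / (1.0 - r)
assert inv_norm <= bound            # ||A^{-1}|| <= 1/(1 - ||I - A||)
```

Here `np.linalg.norm(·, 2)` returns the induced 2-norm, i.e. the largest singular value, matching the norm used in the argument above.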
If $A$ is real and symmetric, then $A$ is orthogonally diagonalisable with real eigenvalues. That is, there exists an orthonormal basis of eigenvectors $v_1, \ldots, v_n$, with respective eigenvalues $\lambda_1, \ldots, \lambda_n \in \Bbb{R}$. An arbitrary vector $x$ can be represented as a linear combination of this basis:
$$x = a_1 v_1 + \ldots + a_n v_n.$$
We have, using Pythagoras's theorem,
$$\|x\|^2 = \|a_1 v_1\|^2 + \ldots + \|a_n v_n\|^2 = |a_1|^2 + \ldots + |a_n|^2.$$
Applying $A$, we also get
\begin{align*}
\|Ax\|^2 &= \|a_1 Av_1 + \ldots + a_n Av_n\|^2 \\
&= \|a_1 \lambda_1 v_1 + \ldots + a_n \lambda_n v_n\|^2 \\
&= |a_1|^2 |\lambda_1|^2 + \ldots + |a_n|^2 |\lambda_n|^2.
\end{align*}
So, the problem of finding $\max_{\|x\| = 1} \|Ax\|^2$ is equivalent to maximising $|a_1|^2 |\lambda_1|^2 + \ldots + |a_n|^2 |\lambda_n|^2$, subject to $\|(a_1, \ldots, a_n)\| = 1$ (with the Euclidean norm). This quantity is maximised by biasing towards the largest value of $|\lambda_i|$, i.e. if $|\lambda_i| \ge |\lambda_j|$ for all $j$, then we get the maximum by setting $a_i = 1$ and each other $a_j = 0$.
In other words, any vector $x$ on the unit sphere which maximises $\|Ax\|^2$ is necessarily an eigenvector corresponding to an eigenvalue of largest absolute value. The actual maximum of $\|Ax\|^2$ is $|\lambda|^2$, where $\lambda$ is an eigenvalue of largest absolute value. Thus, the operator norm $\|A\|$ under this definition is $|\lambda|$.
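The conclusion $\|A\| = \max_i |\lambda_i|$ for a real symmetric $A$ can be verified numerically (a minimal sketch with NumPy; the random symmetric matrix is an arbitrary test case):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                 # symmetrise: A is real symmetric

op_norm = np.linalg.norm(A, 2)                     # max of ||Ax|| over ||x|| = 1
spectral = np.max(np.abs(np.linalg.eigvalsh(A)))   # largest |eigenvalue|

assert np.isclose(op_norm, spectral)
```

`eigvalsh` exploits symmetry and returns real eigenvalues, consistent with the orthogonal diagonalisation used in the argument.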
Next, let's consider $\max_{\|x\|=1}|x^\top Ax|$. Let's keep the above linear combination for $x$ as before. Remembering that $v_1, \ldots, v_n$ is orthonormal, we get
\begin{align*}
|x^\top Ax| &= |x \cdot (Ax)| \\
&= |(a_1 v_1 + \ldots + a_n v_n) \cdot (a_1 \lambda_1 v_1 + \ldots + a_n \lambda_n v_n)| \\
&= |\lambda_1 a_1^2 + \ldots + \lambda_n a_n^2|.
\end{align*}
In similar fashion, we restrict $a_1^2 + \ldots + a_n^2 = 1$. If $\lambda_i$ is an eigenvalue of largest absolute value, then
$$|\lambda_1 a_1^2 + \ldots + \lambda_n a_n^2| \le |\lambda_1| a_1^2 + \ldots + |\lambda_n| a_n^2 \le |\lambda_i|(a_1^2 + \ldots + a_n^2) = |\lambda_i|,$$
an upper bound which is again attained when $x$ is a corresponding eigenvector $v_i$. Again, the maximum is simply the largest eigenvalue in absolute value, which agrees with the previous definition.
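The two facts together, that $|x^\top A x| \le |\lambda_i|$ on the unit sphere with equality at the eigenvector, can be checked numerically (a sketch with NumPy; the matrix and the random sample are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                # real symmetric test matrix

lam, V = np.linalg.eigh(A)
i = np.argmax(np.abs(lam))
v = V[:, i]                      # unit eigenvector for the eigenvalue of largest |.|

# Equality at the eigenvector: |v^T A v| = |lambda_i|.
assert np.isclose(abs(v @ A @ v), abs(lam[i]))

# Random unit vectors never exceed the bound.
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    assert abs(x @ A @ x) <= abs(lam[i]) + 1e-12
```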
Best Answer
Hint: Use the definition of the inverse matrix.
The adjugate of the 2 × 2 matrix
$$\mathbf{A} = \begin{pmatrix} a & b\\ c & d \end{pmatrix}$$
is $$\operatorname{adj}(\mathbf{A}) = \begin{pmatrix} d & -b\\ -c & a \end{pmatrix}.$$
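Following the hint, the defining property $\mathbf{A}\,\operatorname{adj}(\mathbf{A}) = \det(\mathbf{A})\,I$ gives $\mathbf{A}^{-1} = \operatorname{adj}(\mathbf{A})/\det(\mathbf{A})$. A quick check (the particular invertible $2\times 2$ matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[3.0, 5.0],
              [1.0, 4.0]])       # arbitrary invertible 2x2 example
a, b, c, d = A.ravel()
adj = np.array([[ d, -b],
                [-c,  a]])       # adjugate of a 2x2 matrix

det = a * d - b * c
assert np.allclose(A @ adj, det * np.eye(2))          # A . adj(A) = det(A) I
assert np.allclose(np.linalg.inv(A), adj / det)       # A^{-1} = adj(A)/det(A)
```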