In what follows I assume that rank$(A)=s>k$ (otherwise, the claim follows trivially, since we can pick $B=A$ and $\sigma_{k+1}=0$).
For any $C\in M^{m\times n}(\mathbb{R})$ and nonzero $z\in\mathbb{R}^n$,
$$||C||:=\sup_{x\neq 0}\frac{||Cx||}{||x||}\geq \frac{||Cz||}{||z||}.$$
Hence, it suffices to show that for any $B\in M^{m\times n}$ with rank$(B)\leq k$, there exists some unit vector $z\in\mathbb{R}^n$ such that
$$||(A-B)z||\geq \sigma_{k+1}.$$
Then, by SVD, we have that
$$A=U\Sigma V^T = \sum_{i=1}^s\sigma_iu_iv_i^T,$$
where $\sigma_1\geq\sigma_2\geq\dots\geq \sigma_s$ and $\{u_1,\dots,u_s\}$ and $\{v_1,\dots,v_s\}$ are orthonormal sets of vectors.
Let $B\in M^{m\times n}$ be such that rank$(B)\leq k$. The nullspace of $B$ has dimension at least $n-k$, while the span of $\{v_1,\dots,v_{k+1}\}$ has dimension $k+1$. Since $(n-k)+(k+1)=n+1>n$, these two subspaces intersect nontrivially. Hence, there is some unit vector $z\in\mathbb{R}^n$ such that
$$Bz=0,\quad\quad z\in\textrm{span}\{v_1,\dots,v_{k+1}\}.$$
So,
$$(A-B)z=Az=\sum_{i=1}^s\sigma_iu_iv_i^Tz=\sum_{i=1}^{k+1}\sigma_iu_iv_i^Tz.$$
Since $\{u_1,\dots,u_s\}$ and $\{v_1,\dots,v_s\}$ are orthonormal sets of vectors, the cross terms vanish and
$$||(A-B)z||^2=\left(\sum_{i=1}^{k+1}\sigma_i z^Tv_iu_i^T\right)\left(\sum_{i=1}^{k+1}\sigma_i u_iv_i^Tz\right)=\sum_{i=1}^{k+1}\sigma_i^2(v_i^Tz)^2\geq \sigma_{k+1}^2\sum_{i=1}^{k+1}(v_i^Tz)^2=\sigma_{k+1}^2||z||^2,$$
where the inequality uses $\sigma_i\geq\sigma_{k+1}$ for $i\leq k+1$, and the final equality holds because $z\in\textrm{span}\{v_1,\dots,v_{k+1}\}$. Since $||z||=1$, this gives $||(A-B)z||\geq\sigma_{k+1}$, as required.
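A quick numerical sanity check of this bound with NumPy (the random matrix and the sizes $m=6$, $n=5$, $k=2$ are my own choices for illustration): the rank-$k$ truncated SVD attains error exactly $\sigma_{k+1}$ in the spectral norm, and an arbitrary rank-$\leq k$ matrix does no better.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 6, 5, 2

A = rng.standard_normal((m, n))
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # s is sorted descending

# Truncated SVD: best rank-k approximation in the spectral norm.
B_opt = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# The spectral-norm error equals sigma_{k+1}, i.e. s[k] with 0-based indexing.
assert np.isclose(np.linalg.norm(A - B_opt, 2), s[k])

# Any other matrix of rank <= k incurs error at least sigma_{k+1}.
B_other = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
assert np.linalg.norm(A - B_other, 2) >= s[k] - 1e-12
```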
I mistakenly assumed $\|\cdot\|$ was the $\ell^2$-norm; see below for the general situation.
Hint:
$$\|Ax\|^2=\left\|\begin{bmatrix}\lambda_1 x_1 \\ \vdots \\ \lambda_n x_n\end{bmatrix}\right\|^2 = \lambda_1^2 x_1^2 + \cdots + \lambda_n^2 x_n^2 \le (\max_i \lambda_i^2) (x_1^2 + \cdots + x_n^2) = (\max_i \lambda_i^2) \|x\|^2.$$
By looking at the definition of $\|A\|$, can you now compute $\|A\|$?
Computing $\|A^{-1}\|$ is similar, since it is also a diagonal matrix.
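The hint can be checked numerically. In this sketch (the diagonal entries are my own example values) NumPy's spectral norm of a diagonal matrix comes out as $\max_i|\lambda_i|$, and the norm of its inverse as $1/\min_i|\lambda_i|$:

```python
import numpy as np

# Example diagonal entries (assumed nonzero so the inverse exists).
lam = np.array([3.0, -7.0, 0.5, 2.0])
A = np.diag(lam)

# ||Ax||^2 = sum_i lambda_i^2 x_i^2 <= (max_i lambda_i^2) ||x||^2,
# with equality at the corresponding basis vector, so ||A|| = max_i |lambda_i|.
assert np.isclose(np.linalg.norm(A, 2), np.max(np.abs(lam)))

# A^{-1} is diagonal with entries 1/lambda_i, so ||A^{-1}|| = 1 / min_i |lambda_i|.
assert np.isclose(np.linalg.norm(np.linalg.inv(A), 2), 1 / np.min(np.abs(lam)))
```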
General situation:
- Claim 1: for any [submultiplicative] matrix norm $\|\cdot\|$, we have $\|A\| \ge \max_i |\lambda_i|$. (Proof below.)
- Since subordinate norms are submultiplicative matrix norms, this inequality holds in the setting of your question.
- Moreover, by taking $x$ to be the standard basis vectors, we see that we actually have the equality $\|A\| = \max_i |\lambda_i|$. Can you conclude from here?
Proof of Claim 1: Let $\|\cdot \|$ be a [submultiplicative] matrix norm. Let $x$ be a $\lambda_i$-eigenvector, and let $X$ be the $n \times n$ matrix whose columns are all $x$. Then
$$|\lambda_i| \|X\| = \|\lambda_i X\| = \|A X\| \le \|A\| \|X\|,$$
and since $x\neq 0$ implies $\|X\|>0$, dividing by $\|X\|$ gives $|\lambda_i| \le \|A\|$.
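Claim 1 can be sanity-checked numerically. This sketch (random matrix of my own choosing) verifies that the spectral radius $\max_i|\lambda_i|$ is bounded by two standard submultiplicative norms:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

# Spectral radius: largest eigenvalue magnitude (eigenvalues may be complex).
spectral_radius = np.max(np.abs(np.linalg.eigvals(A)))

# Both norms below are submultiplicative, so each bounds the spectral radius.
assert spectral_radius <= np.linalg.norm(A, 2) + 1e-12      # subordinate 2-norm
assert spectral_radius <= np.linalg.norm(A, 'fro') + 1e-12  # Frobenius norm
```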
An orthogonal matrix $Q$ satisfies $\|Qx\|=\|x\|$ for all $x$ (here $\|\cdot\|$ is the Euclidean norm).
Therefore the sets $\{x\in\mathbb R^n:\|x\|=1\}$ and $\{x\in\mathbb R^n:\|Qx\|=1\}$ are equal, and so
$$\|A\|=\max_{\|x\|=1}{\|Ax\|}=\max_{\|Qx\|=1}{\|AQx\|}=\max_{\|x\|=1}{\|AQx\|}=\|AQ\|$$
and $$\|A\|=\max_{\|x\|=1}{\|Ax\|}=\max_{\|x\|=1}{\|QAx\|}=\|QA\|$$
so multiplying a matrix by an orthogonal matrix (from the left or the right) does not change its norm.
Since $Q^{-1}$ is also orthogonal:
$$\kappa(A)=\|A^{-1}\|\|A\|=\|Q^{-1}A^{-1}\|\|AQ\|=\|(AQ)^{-1}\|\|AQ\|=\kappa(AQ),$$ and similarly $\kappa(A)=\kappa(QA)$.
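The invariance can be checked numerically. In this sketch (random invertible $A$ of my own choosing, with $Q$ taken from a QR factorization so that it is orthogonal), the 2-norm condition number is unchanged by orthogonal multiplication on either side:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

# A random orthogonal matrix: Q from the QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
assert np.allclose(Q.T @ Q, np.eye(4))

kappa = lambda M: np.linalg.cond(M, 2)  # condition number in the 2-norm

# Multiplying by Q on either side leaves the condition number unchanged.
assert np.isclose(kappa(A), kappa(A @ Q))
assert np.isclose(kappa(A), kappa(Q @ A))
```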