[Math] condition number of matrix plus constant times identity

eigenvalues-eigenvectors, inverse, linear-algebra, matrices

I saw this post on the eigenvalues of a matrix plus a constant times the identity matrix.

Say $A$ is an $n\times n$ matrix (real and non-singular) with eigenvalues $\lambda_1,\ldots,\lambda_n$, then the eigenvalues of $A+cI$ are $\lambda_1+c,\ldots,\lambda_n+c$.

My question is about the condition number of $A+cI$: can this also be expressed in terms of the condition number of $A$? We have
$$\kappa(A)=\sqrt{\frac{\lambda_{\mathrm{max}}\left(A^TA\right)}{\lambda_{\mathrm{min}}\left(A^TA\right)}},$$
so
$$\kappa(A+cI)=\sqrt{\frac{\lambda_{\mathrm{max}}\Big((A+cI)^T(A+cI)\Big)}{\lambda_{\mathrm{min}}\Big((A+cI)^T(A+cI)\Big)}}.$$
Now of course $\lambda(A)=\lambda(A^T)$, but is there a relation between $\lambda(A)$ and $\lambda(A^TA)$? That is really all that I'm missing to connect $\kappa(A+cI)$ with $\kappa(A)$.
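As a quick sanity check on that missing link, here is a small NumPy experiment (random matrices, purely illustrative): the eigenvalues of $A^TA$ are the squared singular values of $A$, and these generically differ from the squared eigenvalues of $A$ unless $A$ is normal (e.g. symmetric).

```python
import numpy as np

rng = np.random.default_rng(0)

# A generic non-symmetric (hence non-normal) matrix: the eigenvalues of
# A^T A are the squared *singular* values of A, which in general are not
# the squared moduli of the eigenvalues of A.
A = rng.standard_normal((4, 4))

eig_sq  = np.sort(np.abs(np.linalg.eigvals(A)) ** 2)   # |lambda_i(A)|^2
eig_AtA = np.linalg.eigvalsh(A.T @ A)                  # lambda_i(A^T A) = sigma_i(A)^2
print(eig_sq)
print(eig_AtA)   # generically different from eig_sq

# For a symmetric matrix the two coincide (normal => sigma_i = |lambda_i|):
S = A + A.T
print(np.allclose(np.sort(np.linalg.eigvalsh(S) ** 2),
                  np.linalg.eigvalsh(S.T @ S)))        # True
```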


Edit: I'll restate my question, as the current answer is not what I was looking for. I am not necessarily looking for a relation of the form $\kappa(A+cI)=f(\kappa(A))$ for some function $f$, since that apparently does not exist.

What I hope does exist, is a relation between $\lambda(A)$ and $\lambda(A^TA)$, such that I can rewrite the above and arrive at some relation between $\kappa(A)$ and $\kappa(A+cI)$. So, does that exist?

For example when $A$ is SPD,
$$\kappa(A)=\frac{\lambda_{\mathrm{max}}(A)}{\lambda_{\mathrm{min}}(A)}\quad \mbox{ and }\quad \kappa(A+cI)=\frac{\lambda_{\mathrm{max}}(A)+c}{\lambda_{\mathrm{min}}(A)+c}$$ and
$$\kappa(A)>\kappa(A+cI)\quad \mbox{ for } c>0 \quad \mathrm{ and } \quad \kappa(A)<\kappa(A+cI)\quad \mbox{ for } -\lambda_{\mathrm{min}}(A)<c<0, $$
so the condition number can move either way depending on the sign of $c$. Now I was wondering if similar results could be derived for general $A$.
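To make the "apparently does not exist" concrete, here is a minimal NumPy check (the diagonal matrices are illustrative examples): two SPD matrices with the same $\kappa(A)$ give different values of $\kappa(A+cI)$, so no function $f$ with $\kappa(A+cI)=f(\kappa(A))$ can exist.

```python
import numpy as np

# Two SPD matrices with the *same* condition number kappa = 10 ...
A1 = np.diag([1.0, 10.0])
A2 = np.diag([2.0, 20.0])
c  = 1.0

kappa = np.linalg.cond   # 2-norm condition number

print(kappa(A1), kappa(A2))         # 10.0 10.0
# ... but different condition numbers after the shift:
print(kappa(A1 + c * np.eye(2)))    # (10+1)/(1+1) = 5.5
print(kappa(A2 + c * np.eye(2)))    # (20+1)/(2+1) = 7.0
```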

Best Answer

The condition number is determined by the singular values of $A$ (which are the square roots of the eigenvalues of $A^TA$), whereas adding $cI$ is most naturally described as a shift of the eigenvalues.

In order to shift the singular values of $A$, you would instead need to add $U(cI)V=cUV$, where $A=U \,\mathrm{diag}(s_i)\, V$ is the singular value decomposition.

I think this reflects the difficulty of expressing the condition number of $A+cI$ simply.

(For real symmetric matrices $V=U^T$. For the condition number of complex matrices you should use $A^*A$ instead of $A^TA$; is the problem restricted to matrices with real entries?)

Response to the edited question:
If $A=U D V$, where $D=\mathrm{diag}(s_i)$, then $A^TA=V^T D U^TU D V=V^TD^2V$.
If you add $cI$ to $A$, then
$(A+cI)=UDV+cI=U(D+cU^T V^T)V$, and
$(A+cI)^T(A+cI)=V^T (D^T+cVU)(D+cU^T V^T) V, $ so the further $VU$ is from the identity, the further the singular values of $A+cI$ can stray from $s_i+c$.
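A small numerical illustration of this point (random matrices, seeds arbitrary): for a non-symmetric $A$ the singular values of $A+cI$ drift away from $s_i+c$, while for an SPD matrix the shift is exact.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # generic non-symmetric matrix
c = 2.0

s         = np.linalg.svd(A, compute_uv=False)                 # sigma_i(A)
s_shifted = np.linalg.svd(A + c * np.eye(4), compute_uv=False) # sigma_i(A + cI)

print(np.sort(s + c))        # the naive guess s_i + c
print(np.sort(s_shifted))    # actual values -- generally different

# For a symmetric positive definite matrix the guess is exact,
# since then singular values and eigenvalues coincide:
B  = A @ A.T + np.eye(4)                                       # SPD by construction
sb = np.linalg.svd(B, compute_uv=False)
print(np.allclose(np.sort(sb + c),
                  np.sort(np.linalg.svd(B + c * np.eye(4),
                                        compute_uv=False))))   # True
```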

I expect that there is no exact formula, but you can visualize the change in the condition number numerically with random matrices, starting from symmetric ones and adding a small amount $\mu$ of the non-symmetric part:
$A'=B+\mu C$, where $B=\tfrac12(A+A^T)$ is the symmetric part and $C=\tfrac12(A-A^T)$ is the antisymmetric part of $A$.
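A minimal NumPy sketch of that experiment (seed, matrix size and the values of $\mu$ and $c$ are all arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)          # arbitrary seed, illustrative only
A = rng.standard_normal((5, 5))
B = 0.5 * (A + A.T)                     # symmetric part
C = 0.5 * (A - A.T)                     # antisymmetric part, A = B + C
c = 1.0

# Track how kappa(A') and kappa(A' + cI) evolve as the matrix is made
# progressively less symmetric.
for mu in [0.0, 0.1, 0.5, 1.0]:
    Ap = B + mu * C
    print(f"mu={mu:3.1f}  kappa(A')={np.linalg.cond(Ap):10.3f}  "
          f"kappa(A'+cI)={np.linalg.cond(Ap + c * np.eye(5)):10.3f}")
```

At $\mu=0$ the matrix is symmetric, so $\kappa(A'+cI)$ is exactly $\max_i|\lambda_i+c| / \min_i|\lambda_i+c|$; as $\mu$ grows, the shifted condition number departs from that eigenvalue-based formula.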