Two diagonalizable matrices $A$ and $B$ will commute if they are simultaneously diagonalizable; i.e., we can write
$$A = P^{-1} D_A P$$
and
$$B=P^{-1} D_B P$$
where $D_A$ and $D_B$ are diagonal, and the diagonal entries of $D_A$ and $D_B$ are the eigenvalues of $A$ and $B$, respectively.
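A quick check of why simultaneous diagonalizability implies commutativity, using only the fact that diagonal matrices commute:
$$AB = P^{-1}D_AP\,P^{-1}D_BP = P^{-1}D_AD_BP = P^{-1}D_BD_AP = BA.$$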
As Ian pointed out in the comments, diagonalizability is not necessary for commuting: a nondiagonalizable matrix still commutes with some matrices, in particular with itself.
Now, observe the following for simultaneously diagonalizable $A$ and $B$:
$$A+B = P^{-1}D_AP + P^{-1}D_BP=P^{-1}(D_A+D_B)P.$$
Consequently, the eigenvalues of $A+B$ are given by the elements on the diagonal of $D_A+D_B$.
Note: in your example, where $B=\alpha I$, the matrices $A$ and $B$ are automatically simultaneously diagonalizable, since $P^{-1}(\alpha I)P = \alpha I$ for any invertible $P$.
Extra note: it is important to recognize that in the general case described above, there is some ambiguity in how the eigenvalues of $A+B$ pair up with those of $A$ and $B$. More precisely, the equality
$$\lambda_{A+B,i} = \lambda_{A,i}+\lambda_{B,i}$$ does not hold in general. However, given a $\lambda_{A+B,i}$, it is always possible to find a $\lambda_{A,j}$ and a $\lambda_{B,k}$ such that
$$\lambda_{A+B,i} = \lambda_{A,j}+\lambda_{B,k},$$ namely by matching eigenvalues along the common eigenbasis. Perhaps someone else can elaborate on this.
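For a concrete illustration (my own example; both matrices are already diagonal, so $P=I$): take
$$A = \begin{bmatrix}1 & 0\\ 0 & 2\end{bmatrix}, \qquad B = \begin{bmatrix}2 & 0\\ 0 & 1\end{bmatrix}, \qquad A+B = \begin{bmatrix}3 & 0\\ 0 & 3\end{bmatrix}.$$
Sorting the eigenvalues of $A$ and $B$ increasingly and adding entrywise gives $1+1=2$ and $2+2=4$, neither of which is an eigenvalue of $A+B$. The correct pairings, $3 = 1+2$ and $3 = 2+1$, come from matching eigenvalues along the common eigenbasis rather than by sorted order.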
For symmetric matrices you have the Courant-Fischer min-max theorem (with the eigenvalues ordered increasingly, $\lambda_1 \le \dots \le \lambda_n$):
$$\lambda_k(A) = \min_{\dim(U)=k} \; \max_{0 \neq x \in U} R_A(x)$$
with
$$R_A(x) = \frac{(Ax, x)}{(x,x)}.$$
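Two standard special cases of the min-max formula may help fix the ordering convention:
$$\lambda_1(A) = \min_{x \neq 0} R_A(x), \qquad \lambda_n(A) = \max_{x \neq 0} R_A(x).$$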
Now, your assertion follows, since the Rayleigh quotient is additive, $R_{A+B}(x) = R_A(x) + R_B(x)$, which is strictly larger than $\max\{R_A(x), R_B(x)\}$ whenever both quotients are positive (for instance, when $A$ and $B$ are positive definite); the min-max formula then carries this over to the eigenvalues.
This theorem is also helpful to prove other nice properties of the eigenvalues of symmetric matrices. For example:
\begin{equation*}
\lambda_k(A) + \lambda_1(B) \le \lambda_k(A+B) \le \lambda_k(A) + \lambda_n(B)
\end{equation*}
This shows the continuous dependence of the eigenvalues on the entries of the matrix, and it also implies your assertion.
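A sketch of how the min-max theorem yields this inequality (my reconstruction of the standard argument): additivity of the Rayleigh quotient, $R_{A+B}(x) = R_A(x) + R_B(x)$, together with $\lambda_1(B) \le R_B(x) \le \lambda_n(B)$ for all $x \neq 0$, gives
$$R_A(x) + \lambda_1(B) \le R_{A+B}(x) \le R_A(x) + \lambda_n(B),$$
and applying $\min_{\dim(U)=k}\max_{0 \neq x \in U}$ to all three expressions preserves both inequalities.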
For an easy example where the inequality is strict, take $T=\begin{bmatrix} 0&1\\0&0\end{bmatrix}$. Then
$$\sum_{j=1}^2|\lambda_j|^2=0+0<1=\operatorname{Tr}\left(\begin{bmatrix} 0&0\\0&1\end{bmatrix}\right)=\operatorname{Tr}(T^*T).$$
As for the inequality itself, use the Schur decomposition to get an orthonormal basis $\{e_1,\ldots,e_n\}$ with respect to which $T$ is upper triangular; the eigenvalues $\lambda_j$ are then the diagonal entries $\langle Te_j,e_j\rangle$. By Cauchy-Schwarz,
\begin{align}
\sum_{j=1}^n|\lambda_j|^2 &=\sum_{j=1}^n|\langle Te_j,e_j\rangle|^2\\
&\leq\sum_{j=1}^n\|Te_j\|^2\,\|e_j\|^2\\
&=\sum_{j=1}^n\langle T^*Te_j,e_j\rangle\\
&=\operatorname{Tr}(T^*T).
\end{align}
As mentioned by amsmath, equality occurs precisely when we have equality in Cauchy-Schwarz in the second line. This means that $Te_j$ and $e_j$ have to be collinear for all $j$, i.e., there exist numbers $\lambda_j$ with $Te_j=\lambda_je_j$. Then $T$ is diagonal with respect to an orthonormal basis, hence normal.
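For the converse direction (not spelled out above, but standard): if $T$ is normal, the spectral theorem gives an orthonormal basis of eigenvectors $e_j$ with $Te_j=\lambda_je_j$, and then
$$\operatorname{Tr}(T^*T) = \sum_{j=1}^n\langle T^*Te_j,e_j\rangle = \sum_{j=1}^n\|Te_j\|^2 = \sum_{j=1}^n|\lambda_j|^2,$$
so the inequality is an equality.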