Let $V$ be the matrix whose columns are the common eigenvectors of $A$ and $B$ (there are $n$ of them, since these are $n \times n$ matrices). Then
$$
V^{-1} A V = D_1
$$
is diagonal, as is $V^{-1} B V = D_2$. (It'd be a really good exercise for you to figure out why.) If you now compute, you get
$$
V^{-1}(cA + dB)V = (cV^{-1}A + dV^{-1}B) V = c V^{-1}AV + dV^{-1}BV = cD_1 + dD_2
$$
which is again diagonal, with entries
$$
(cD_1 + dD_2)_{ii} = c\,(D_1)_{ii} + d\,(D_2)_{ii}.
$$
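If it helps, here is a minimal numpy sketch (my own illustration, not part of the original answer) that builds two commuting symmetric matrices with a common eigenvector matrix $V$ and checks the computation above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Construct commuting symmetric matrices by choosing a common
# orthogonal eigenvector matrix V and diagonal matrices D1, D2.
V, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthogonal V
D1 = np.diag(rng.standard_normal(n))
D2 = np.diag(rng.standard_normal(n))
A = V @ D1 @ V.T
B = V @ D2 @ V.T
assert np.allclose(A @ B, B @ A)                   # A and B commute

c, d = 2.0, -3.0
M = np.linalg.inv(V) @ (c * A + d * B) @ V         # should equal c*D1 + d*D2
assert np.allclose(M, c * D1 + d * D2)
```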
This is a theorem with a name: it is called the Spectral Theorem for Hermitian (or self-adjoint) matrices. As pointed out by José Carlos Santos, it is a special case of the Spectral Theorem for normal matrices, which is just a little bit harder to prove.
Actually we can prove the spectral theorem for Hermitian matrices right here in a few lines.
We are going to have to think about linear operators rather than matrices. If $T$ is a linear operator on a finite dimensional complex inner product space $V$, its adjoint $T^*$ is another linear operator determined by $\langle T v, w\rangle = \langle v, T^* w \rangle$ for all $v, w \in V$. (Note this is a basis-free description.) $T$ is called Hermitian or self-adjoint if $T = T^*$.
Let $B$ be an $n$-by-$n$ complex matrix and $B^*$ the conjugate transpose matrix. Let $T_B$ and $T_{B^*}$ be the corresponding linear operators. Then $(T_B)^* = T_{B^*}$, so a matrix is Hermitian if and only if the corresponding linear operator is Hermitian.
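As a quick numerical sanity check (my addition, using the standard inner product $\langle v, w\rangle = \sum_i \bar v_i w_i$, which `np.vdot` computes), the conjugate transpose really does satisfy the defining identity of the adjoint:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

Bstar = B.conj().T                 # conjugate transpose of B
# np.vdot(a, b) conjugates its first argument, i.e. computes <a, b>
assert np.isclose(np.vdot(B @ v, w), np.vdot(v, Bstar @ w))
```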
Let $A$ be a Hermitian linear operator on a complex inner product space $V$ of dimension $n$. We need to consider $A$--invariant subspaces of $V$, that is, linear subspaces $W$ such that $A W \subseteq W$. We should think of such a subspace as being on an equal footing with our original space $V$. In particular, any such subspace is itself an inner product space, $A_{|W} : W \to W$ is a linear operator on $W$, and $A_{|W}$ is also Hermitian.
If $\dim W \ge 1$, then $A_{|W}$ has at least one eigenvector $w \in W$ -- because any linear operator at all acting on a (non-zero) finite dimensional complex vector space has at least one eigenvector (its characteristic polynomial has a root over $\mathbb C$).
The basic phenomenon is this: Let $W$ be any invariant subspace for $A$. Then $W^\perp$ is also invariant under $A$. The reason is that if $w \in W$ and $x \in W^\perp$, then
$$
\langle w, A x\rangle = \langle A^* w , x \rangle = \langle A w, x \rangle = 0,
$$
because $Aw \in W$ and $x \in W^\perp$. Thus $A x \in W^\perp$.
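Here is a tiny numerical illustration of this step (my own sketch, taking $W = \mathbb C v$ for an eigenvector $v$): vectors orthogonal to $v$ stay orthogonal to $v$ after applying a Hermitian $A$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2              # a random Hermitian matrix

_, vecs = np.linalg.eigh(A)
v = vecs[:, 0]                        # unit eigenvector; W = C*v is A-invariant

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x = x - np.vdot(v, x) * v             # project x onto W-perp
assert np.isclose(np.vdot(v, x), 0)       # x lies in W-perp
assert np.isclose(np.vdot(v, A @ x), 0)   # and so does A x
```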
Write $V = V_1$.
Take one eigenvector $v_1$ for $A$ in $V_1$. Then $\mathbb C v_1$ is $A$--invariant. Hence $V_2 = (\mathbb C v_1)^\perp$ is also $A$--invariant. Now just apply the same argument to $V_2$: the restriction of $A$ to $V_2$ has an eigenvector $v_2$, and the orthogonal complement $V_3$ of $\mathbb C v_2$ in $V_2$ is $A$--invariant. Continuing in this way, one gets a sequence of mutually orthogonal eigenvectors and a decreasing sequence of invariant subspaces, $V = V_1 \supset V_2 \supset V_3 \supset \dots$, such that $V_k$ has dimension $n - k + 1$. The process stops only when we reach $V_n$, which has dimension $1$; at that point $v_1, \dots, v_n$ form an orthogonal basis of eigenvectors, which is exactly what the spectral theorem asserts.
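This inductive argument translates almost line by line into numpy (a purely illustrative sketch of my own, not a production eigensolver): find one eigenvector of the restriction, pass to the orthogonal complement, and repeat.

```python
import numpy as np

def hermitian_eigenbasis(A):
    """Orthonormal eigenbasis of a Hermitian A via the deflation
    argument above (illustrative sketch, not a production solver)."""
    n = A.shape[0]
    Q = np.eye(n, dtype=complex)   # orthonormal basis of the current V_k
    basis = []
    while Q.shape[1] > 0:
        R = Q.conj().T @ A @ Q     # matrix of A restricted to V_k
        _, W = np.linalg.eigh(R)   # R is Hermitian, so eigh applies
        basis.append(Q @ W[:, 0])  # one eigenvector v_k of A inside V_k
        Q = Q @ W[:, 1:]           # orthonormal basis of (C v_k)^perp in V_k
    return np.column_stack(basis)

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2                       # random Hermitian matrix
U = hermitian_eigenbasis(A)
assert np.allclose(U.conj().T @ U, np.eye(4))  # columns are orthonormal
D = U.conj().T @ A @ U
assert np.allclose(D, np.diag(np.diag(D)))     # and they diagonalize A
```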
Best Answer
In your example, matrices $A$ and $B$ are both diagonalizable (and both have $n$ independent eigenvectors), so it's not an instance of the thing you're describing.
(Also, since $A$ and $B$ are both symmetric in this example, we know in advance that they should be diagonalizable.)
But in general, no: the fact that $A$ commutes with $B$ and $B$ is diagonalizable doesn't mean that $A$ is diagonalizable (in the same basis that diagonalizes $B$, or otherwise). For instance, any matrix (diagonalizable or otherwise) commutes with the zero matrix and the identity matrix.
Also, the Jordan form of a matrix lets us write it as $D + N$ in some basis, where $D$ is diagonal, $N$ is nilpotent (and hence not diagonalizable unless it is zero), and $D$ commutes with $N$, giving us a whole slew of counterexamples.
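To make that concrete (a small sketch of my own), the $2 \times 2$ Jordan block already gives a counterexample: it commutes with the identity, which is diagonalizable, but it is not diagonalizable itself.

```python
import numpy as np

N = np.array([[0.0, 1.0],
              [0.0, 0.0]])           # nilpotent 2x2 Jordan block
I = np.eye(2)

assert np.allclose(N @ I, I @ N)     # N commutes with the diagonalizable I

# N is not diagonalizable: its only eigenvalue is 0 (multiplicity 2),
# but ker N is one-dimensional, so there is no basis of eigenvectors.
assert np.allclose(np.linalg.eigvals(N), 0)
assert np.linalg.matrix_rank(N) == 1   # dim ker N = 2 - 1 = 1
```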