This smells very fishy. I hope you are not asking others to do your homework.
Anyway, we will first quickly check that $X$ has $n$ positive eigenvalues and $m$ negative eigenvalues. In fact, we have the following congruence relation:
$$
\begin{pmatrix}I_n &0\\-B^TA^{-1} &I_m\end{pmatrix}
\begin{pmatrix}A&B\\ B^T&0\end{pmatrix}
\begin{pmatrix}I_n&-A^{-1}B\\0&I_m\end{pmatrix}
=\begin{pmatrix}A&0\\0&-B^TA^{-1}B\end{pmatrix} = Y\ \textrm{ (say)}.
$$
So, by Sylvester's law of inertia, $X$ and $Y$ have the same numbers of positive, zero and negative eigenvalues. As $A$ is positive definite and $-B^TA^{-1}B$ is negative definite (this uses that $B$ has full column rank, so that $B^TA^{-1}B$ is positive definite), we have finished this check.
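As a quick sanity check of this inertia count, here is a minimal numerical sketch (assuming NumPy; the sizes $n=4$, $m=2$, the seed and the random matrices are arbitrary choices, not from the original problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 2

M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # positive definite
B = rng.standard_normal((n, m))      # full column rank (generically)

X = np.block([[A, B], [B.T, np.zeros((m, m))]])
evals = np.linalg.eigvalsh(X)
print("positive:", (evals > 0).sum(), "negative:", (evals < 0).sum())
# Expected output: n positive and m negative eigenvalues.
```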
Now we will show that the negative spectrum of $X$ lies inside $P$ and the positive spectrum lies inside $Q$. Put another way, the endpoints of $P$ and $Q$ give lower and upper bounds for the negative and positive eigenvalues of $X$. Among these four endpoints, three can be obtained easily, and we will tackle them first. The right endpoint of $P$ is a bit problematic, and we will deal with it later. Suppose $\lambda$ (necessarily nonzero) is an eigenvalue of $X$ and $(u^T,v^T)^T$ is a corresponding eigenvector. Then
$$
\begin{cases}
Au + Bv = \lambda u,\\
B^Tu = \lambda v.
\end{cases}
$$
Clearly $u\not=0$, or else the above equations would give $v=0$, contradicting that $(u^T, v^T)^T$ is an eigenvector. So, WLOG, we may assume that $\|u\|=1$. Substituting $v = \frac{1}{\lambda}B^Tu$ into the first equation and multiplying both sides by $\lambda u^T$ on the left, we get $\lambda^2 - \lambda u^TAu - u^TBB^Tu = 0$. Thus
$$
(\ast):\ \lambda = \frac12\left(u^TAu \pm \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right).
$$
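If you want to see $(\ast)$ at work before reading on, here is a hedged numerical check (assuming NumPy; sizes and seed are arbitrary): for every eigenpair of $X$, the eigenvalue is one of the two roots in $(\ast)$ computed from the normalized $u$-part of the eigenvector.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 2
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
B = rng.standard_normal((n, m))
X = np.block([[A, B], [B.T, np.zeros((m, m))]])

lam, W = np.linalg.eigh(X)
for k in range(n + m):
    u = W[:n, k]
    u = u / np.linalg.norm(u)                    # normalize the u-part
    a, b2 = u @ A @ u, np.sum((B.T @ u) ** 2)
    roots = 0.5 * (a + np.array([1.0, -1.0]) * np.sqrt(a**2 + 4 * b2))
    assert np.isclose(roots, lam[k]).any()       # lam[k] is one of the two roots
print("all eigenvalues satisfy (*)")
```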
So, by $(\ast)$, for each positive eigenvalue we have
\begin{align}
\lambda &= \frac12\left(u^TAu + \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right)\\
&\le \frac12\left(\max_{\|u\|=1}u^TAu + \sqrt{\max_{\|u\|=1}(u^TAu)^2 + 4\max_{\|u\|=1}\|B^Tu\|^2}\right)\\
&= \frac12\left(a_1 + \sqrt{a_1^2 + 4b_1^2}\right).
\end{align}
Also,
$$
\lambda = \frac12\left(u^TAu + \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right)
\ge u^TAu \ge \min_{\|u\|=1}u^TAu = a_n.
$$
This shows that the positive spectrum of $X$ lies inside $Q$. Next, we deal with the left endpoint of $P$, i.e. the lower end of the negative spectrum of $X$. For $a>0$ and $b\ge0$, define $f(a,b) = \frac12\left(a - \sqrt{a^2+4b^2}\right)$, so that by $(\ast)$, each negative eigenvalue of $X$ is of the form $\lambda = f\left(u^TAu, \|B^Tu\|\right)$. Since $\frac{\partial f}{\partial a} = \frac12 - \frac {a}{2\sqrt{a^2+4b^2}} \ge 0$ and $\frac{\partial f}{\partial b} = \frac {-2b}{\sqrt{a^2+4b^2}} \le 0$,
$$
\lambda \ge \min_{\|u\|=1}f\left(u^TAu, \|B^Tu\|\right)
\ge f\left(\min_{\|u\|=1}u^TAu,\ \max_{\|u\|=1}\|B^Tu\|\right) = f(a_n, b_1).
$$
This gives the left endpoint of $P$.
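Here is a small NumPy sketch checking these three easy endpoints on a random instance (sizes and seed are arbitrary; $f$ is the function defined above):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
B = rng.standard_normal((n, m))
X = np.block([[A, B], [B.T, np.zeros((m, m))]])

a = np.sort(np.linalg.eigvalsh(A))[::-1]               # a_1 >= ... >= a_n
b = np.sort(np.linalg.svd(B, compute_uv=False))[::-1]  # b_1 >= ... >= b_m
lam = np.linalg.eigvalsh(X)
pos, neg = lam[lam > 0], lam[lam < 0]

f = lambda x, y: 0.5 * (x - np.sqrt(x**2 + 4 * y**2))
print(pos.min() >= a[-1])                                          # left endpoint of Q
print(pos.max() <= 0.5 * (a[0] + np.sqrt(a[0]**2 + 4 * b[0]**2)))  # right endpoint of Q
print(neg.min() >= f(a[-1], b[0]))                                 # left endpoint of P
```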
It remains to obtain the right endpoint of $P$. This is not as easy to tackle. Unless $m=n$, the matrix $B^T$ has a nontrivial kernel and $\min_{\|u\|=1}\|B^Tu\|=0$, so if you put $\lambda\le\max f(u^TAu, \|B^Tu\|) \le f(\max u^TAu, \min\|B^Tu\|)$, you will get $\lambda\le f(\max u^TAu, 0)=0$, which is completely useless (as negative numbers are always bounded above by $0$). In other words, in maximizing $(\ast)$, we cannot optimize $u^TAu$ and $\|B^Tu\|$ separately. This looks too difficult, so I will turn to another approach.
In advanced matrix theory, we have the Courant-Fischer-Weyl min-max principle, which says that if the eigenvalues of an order-$N$ Hermitian matrix $X$ are arranged in descending order, say $\lambda_1\ge\lambda_2\ge\ldots\ge\lambda_N$, then
$$
\lambda_k = \min_{\dim(S)=N-k+1}\max_{\begin{matrix}w\in S\\ \|w\|=1\end{matrix}} w^\ast Xw.
$$
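To make the principle concrete, here is a hedged NumPy illustration (a random symmetric matrix; $N$ and $k$ are arbitrary) exhibiting a subspace $S$ that attains the minimum:

```python
import numpy as np

rng = np.random.default_rng(2)
N, k = 6, 3
M = rng.standard_normal((N, N))
X = (M + M.T) / 2                    # random symmetric matrix
lam, V = np.linalg.eigh(X)           # eigenvalues in ascending order

# In descending order, lam_k is lam[N - k]; the span of the eigenvectors
# of the N - k + 1 smallest eigenvalues attains the min in the principle.
S = V[:, : N - k + 1]
r = np.linalg.eigvalsh(S.T @ X @ S)  # Rayleigh quotient of X restricted to S
print(r.max(), lam[N - k])           # both equal lam_k (descending order)
```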
Now, for every unit vector $w\in\mathbb{R}^{n+m}$, write $w^T = (\theta u^T,\sqrt{1-\theta^2}v^T)$, where $\theta\in[0,1]$ and $u,v$ are unit vectors in respectively $\mathbb{R}^n$ and $\mathbb{R}^m$. Then
$$w^TXw = \theta^2 u^TAu + 2\theta\sqrt{1-\theta^2}u^TBv.$$
Since the eigenvalues of $X$ are invariant under orthogonal transformations, WLOG we may assume that $B$ is already a (perhaps non-square) diagonal matrix with its singular values lying on the diagonal: if $B=U\Sigma V^T$ is a singular value decomposition, conjugate $X$ by $\operatorname{diag}(U,V)$; this replaces $A$ by $U^TAU$, which has the same eigenvalues. Let $\{u_1,\ldots,u_n\}$ be the canonical basis of $\mathbb{R}^n$ and $\{v_1,\ldots, v_m\}$ be the canonical basis of $\mathbb{R}^m$. Then $B^Tu_i=b_iv_i$ for $i=1,2,\ldots, m$. Now fix $\theta\in[0,1]$ and consider the $m$-dimensional subspace $W_\theta\subset\mathbb{R}^{n+m}$ spanned by $\{(\theta u_i^T, -\sqrt{1-\theta^2}\,v_i^T)^T: i=1,2,\ldots,m\}$. Every unit vector $w\in W_\theta$ has the form $w^T = (\theta u^T,\sqrt{1-\theta^2}\,v^T)$ with $u=\sum_i\gamma_iu_i$, $v=-\sum_i\gamma_iv_i$ and $\sum_i\gamma_i^2=1$, so that $u^TBv = -\sum_i\gamma_i^2b_i\le -b_m$. Then
\begin{align}
w^TXw &= \theta^2 u^TAu + 2\theta\sqrt{1-\theta^2}u^TBv\\
&\le \theta^2 u^TAu - 2\theta\sqrt{1-\theta^2}b_m\\
&\le \theta^2 a_1 - 2\theta\sqrt{1-\theta^2}b_m.
\end{align}
Applying the min-max principle with $N=n+m$ and $k=n+1$, we get, for every fixed $\theta\in[0,1]$,
$$
\lambda_{n+1} = \min_{\dim(S)=m}\max_{\begin{matrix}w\in S\\ \|w\|=1\end{matrix}} w^T Xw
\le \max_{\begin{matrix}w\in W_\theta\\ \|w\|=1\end{matrix}} w^T Xw
\le \theta^2 a_1 - 2\theta\sqrt{1-\theta^2}b_m,
$$
and since $\theta$ is arbitrary,
$$
\lambda_{n+1} \le \min_{0\le\theta\le 1}\left\{\theta^2 a_1 - 2\theta\sqrt{1-\theta^2}b_m\right\}.
$$
The final step is to show that
$$
\min_{0\le\theta\le 1}\left\{\theta^2 a_1 - 2\theta\sqrt{1-\theta^2}b_m\right\} = \frac 12\left(a_1-\sqrt{a_1^2+4b_m^2}\right).
$$
This can be proved by the completing-square method and first-year calculus/geometry (for instance, substituting $\theta=\cos\phi$ turns the expression into $\frac{a_1}2 + \frac{a_1}2\cos2\phi - b_m\sin2\phi$), and I will leave it to you.
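If you prefer to check the identity numerically before proving it, here is a minimal NumPy sketch (the values of $a_1$ and $b_m$ are made up):

```python
import numpy as np

a1, bm = 3.0, 1.5                                # arbitrary sample values
theta = np.linspace(0.0, 1.0, 100001)
g = theta**2 * a1 - 2 * theta * np.sqrt(1 - theta**2) * bm
print(g.min())                                   # numerical minimum over [0, 1]
print(0.5 * (a1 - np.sqrt(a1**2 + 4 * bm**2)))   # the claimed closed form
# The two values agree (up to grid resolution).
```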
I've found a partial solution, namely when $a = b = 0$. The thing to realize is that $D$ and $E$ actually commute so they share a common set of eigenvectors. We can then make a change of basis to this eigenbasis $(x,y) \to (X,Y)$ so that the difference equation becomes, in this basis,
$$
\begin{pmatrix}
r E_1-\lambda+r^{-1}D_1 & 0 \\
0 & r E_2-\lambda+r^{-1}D_2
\end{pmatrix}
\begin{pmatrix}
X \\ Y
\end{pmatrix}
=
\begin{pmatrix}
0 \\0
\end{pmatrix}
$$
where $E_i, D_i$ are the eigenvalues of $E, D$ with common eigenvector $(x_i, y_i)$. Our difference equation then "uncouples" into two regular difference equations. The boundary conditions are still the same $X_0 = Y_0 = X_{N+1} = Y_{N+1} = 0$ and so we really have two copies of the regular tridiagonal Toeplitz matrix problem. If $a \neq 0, b \neq 0$ then we can't apply the same trick, because $C$ doesn't commute with $D$ or $E$.
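Here is a minimal NumPy sketch of the "commuting matrices share an eigenbasis" step, with made-up symmetric $2\times2$ matrices standing in for $D$ and $E$ (not the matrices of the original problem); when $D$ has distinct eigenvalues, its eigenbasis automatically diagonalizes anything that commutes with it:

```python
import numpy as np

# Made-up commuting symmetric stand-ins for D and E.
D = np.array([[2.0, 1.0], [1.0, 2.0]])
E = np.array([[5.0, 3.0], [3.0, 5.0]])
assert np.allclose(D @ E, E @ D)     # they commute

_, V = np.linalg.eigh(D)             # orthonormal eigenbasis of D
print(np.round(V.T @ D @ V, 10))     # diagonal: the eigenvalues D_i
print(np.round(V.T @ E @ V, 10))     # also diagonal: the eigenvalues E_i
```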
Best Answer
If $v$ is an eigenvector of $A_1$, then $\pmatrix{v\\ 0}$ is an eigenvector of $A$.
There might not be more eigenvectors: consider the following example with $n=1$, $A_1=A_2=B=1$. Then $$ A = \pmatrix{1 & 1 \\ 0 & 1} $$ only has a one-dimensional eigenspace for the eigenvalue $1$.
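A quick NumPy confirmation of the deficient eigenspace in this example:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])
print(np.linalg.eigvals(A))                  # both eigenvalues are 1
# rank(A - I) = 1, so the eigenspace of 1 is only one-dimensional.
print(np.linalg.matrix_rank(A - np.eye(2)))
```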