Use a proof by induction on $m$:
$(a)$ You have your base case $m = 2$: $P(2)$,
$\quad$ though for a base case it would suffice to note that the claim is (trivially) true for $m = 1$.
$\quad$ Either way, the process used in class to prove the $m = 2$ case will be helpful in the
$\quad$ inductive step of the proof.
$(b)$ Your inductive hypothesis $P(k)$ would be to assume that this is true for $m = k$.
$\quad$ That is, assume the truth of: $$P(k):\quad
\det \begin{bmatrix}
A_1 & * & \cdots & * \\
0 & A_2 & \cdots & * \\
\vdots & & \ddots & \vdots \\
0 & \cdots & 0 & A_k
\end{bmatrix}
= \prod_{i=1}^k \det A_i
$$
$(c)$ Then take the inductive step: use the inductive hypothesis to prove that $P(k+1)$ is true, i.e. that the claim holds for $m = k + 1$.
That is, assuming $P(k)$ is true, prove:
$$
\det \left[\begin{array}{cccc|c}
A_1 & * & \cdots & * & * \\
0 & A_2 & \cdots & * & * \\
\vdots & & \ddots & \vdots & \vdots \\
0 & \cdots & 0 & A_k & * \\
\hline
0 & 0 & \cdots & 0 & A_{k+1}
\end{array}\right]
= \prod_{i=1}^{k+1} \det A_i
$$
Note that to do this, you can partition the matrix into two blocks on the diagonal:
$(1)$ one of which is a block triangular matrix with the $k$ sub-blocks $A_i$, $1 \le i \le k$, along its diagonal, whose determinant you know from the inductive hypothesis (having assumed its truth), and
$(2)$ the other consisting of the single block $A_{k+1}$.
Here is where you can use the proof used in class for $m = 2$.
That is: $$\prod_{i=1}^{k+1} \det A_i = \left(\prod_{i=1}^{k} \det A_i\right)\cdot \det(A_{k+1})
$$
Then you will have shown that the determinant of a partitioned (block) triangular matrix is the product of the determinants of the blocks on its diagonal.
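If you want a quick numerical sanity check of the claim (a sketch of my own, not part of the proof), here it is in plain Python; the helper `det` (cofactor expansion) and the example blocks are illustrative choices:

```python
# Sanity check: the determinant of a block upper-triangular matrix
# equals the product of the determinants of its diagonal blocks.

def det(M):
    """Determinant by cofactor expansion along the first row
    (fine for the small matrices used here)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

# Diagonal blocks A_1, A_2, A_3 (sizes 2, 1, 2); entries chosen arbitrarily.
A1 = [[1, 2], [3, 4]]   # det(A1) = -2
A2 = [[5]]              # det(A2) =  5
A3 = [[2, 1], [7, 6]]   # det(A3) =  5

# The full 5x5 matrix: blocks on the diagonal, arbitrary "*" entries
# above them, zeros below.
M = [[1, 2, 9, 8, 7],
     [3, 4, 6, 5, 4],
     [0, 0, 5, 3, 2],
     [0, 0, 0, 2, 1],
     [0, 0, 0, 7, 6]]

assert det(M) == det(A1) * det(A2) * det(A3) == -50
print(det(M))  # -50
```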
Best Answer
Yes, it's always true, but note that there is a premise to fulfill: $C$ and $D$ have to commute, that is, we need $CD=DC$. If this condition is violated, the formula may fail to hold.
More generally, when $A,B,C,D$ are square matrices of equal size with entries in a commutative ring (this includes, but isn't limited to, the cases where $A,B,C,D$ are real or complex matrices), we have $$ \det \pmatrix{A&B\\ C&D}= \begin{cases} \det(AD-BC) & \text{ if } CD=DC,\\ \det(DA-CB) & \text{ if } AB=BA,\\ \det(DA-BC) & \text{ if } BD=DB,\\ \det(AD-CB) & \text{ if } AC=CA. \end{cases} $$ This was first proved by M. H. Ingraham (see his paper A note on determinants) and later proved independently by John Silvester (see his paper). Essentially, suppose $AC=CA$ (the last case above). Then $$ \pmatrix{I&0\\ -C&A+xI}\pmatrix{A+xI&B\\ C&D}=\pmatrix{A+xI&B\\ AC-CA&(A+xI)D-CB}. $$ By assumption, $AC-CA=0$, so the RHS is block-triangular and $$\det(A+xI)\,\det\pmatrix{A+xI&B\\ C&D}=\det(A+xI)\,\det\left((A+xI)D-CB\right).$$ As $\det(A+xI)$ is a nonzero polynomial in $x$, we can divide both sides by it to obtain $\det\pmatrix{A+xI&B\\ C&D}=\det\left((A+xI)D-CB\right)$. Putting $x=0$, the assertion follows. The proofs of the other three cases are similar.
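As a concrete numerical illustration of the last case (my own sketch; the helper names `det`, `matmul`, `matsub` are not standard notation): if $AC=CA$, then $\det\pmatrix{A&B\\C&D}=\det(AD-CB)$.

```python
# Check det[[A, B], [C, D]] = det(AD - CB) for a pair with AC = CA.

def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matsub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A = [[1, 2], [3, 4]]
C = [[2, 0], [0, 2]]        # scalar matrix: commutes with every A
B = [[0, 1], [5, 2]]
D = [[3, 1], [1, 3]]
assert matmul(A, C) == matmul(C, A)   # the premise AC = CA holds

# Assemble the 4x4 block matrix [[A, B], [C, D]] row by row.
M = [A[0] + B[0], A[1] + B[1], C[0] + D[0], C[1] + D[1]]

assert det(M) == det(matsub(matmul(A, D), matmul(C, B))) == 40
```

Note that $BC \ne CB$ is allowed here; only the pair named in the premise has to commute.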
Remark. Those who are familiar with the Schur complement may recognise that the same block matrix decomposition technique is also used in the derivation of the Schur complement. In fact, if the matrix entries are taken from a field, we can use the Schur complement to prove the above formula. E.g. suppose again that $AC=CA$. Let $x$ be an indeterminate and extend the underlying field $F$ to $F(x)$, the field of all rational functions in $x$. Then $A+xI$ is invertible and it commutes with $C$. Therefore, by using the Schur complement, we get $$\begin{aligned} \det\pmatrix{A+xI&B\\ C&D} &=\det(A+xI)\det\left(D-C(A+xI)^{-1}B\right)\\ &=\det(A+xI)\det\left(D-(A+xI)^{-1}CB\right)\\ &=\det\left((A+xI)D-CB\right) \end{aligned}$$ and the assertion again follows by putting $x=0$.
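The Schur-complement identity itself is easy to check numerically when $A$ is already invertible (so no extension to $F(x)$ is needed). Here is a small sketch over the rationals; the helpers `det`, `matmul`, `matsub`, `inv2` are my own illustrative names:

```python
# Check det[[A, B], [C, D]] = det(A) * det(D - C A^{-1} B) for invertible A.

from fractions import Fraction  # exact rational arithmetic

def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matsub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def inv2(M):
    """Inverse of a 2x2 matrix, exactly, via the adjugate formula."""
    d = det(M)
    return [[Fraction(M[1][1], d), Fraction(-M[0][1], d)],
            [Fraction(-M[1][0], d), Fraction(M[0][0], d)]]

A = [[1, 2], [3, 4]]          # invertible: det(A) = -2
B = [[0, 1], [5, 2]]
C = [[2, 0], [0, 2]]
D = [[3, 1], [1, 3]]

M = [A[0] + B[0], A[1] + B[1], C[0] + D[0], C[1] + D[1]]
schur = matsub(D, matmul(matmul(C, inv2(A)), B))   # D - C A^{-1} B

assert det(M) == det(A) * det(schur) == 40
```

Since this $C$ also commutes with $A$, the result agrees with $\det(AD-CB)$ from the formula above, as it must.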