I'm going to be even more concrete than you and fill in the choice of $X$ in a way that I hope will be illuminating.
The note in the linked document is about the specific case of tuples, taking advantage of types $\overline{m}$ (where $m : \mathbb{N}$) that are inhabited by exactly $m$ elements; $\overline{m} = \{0, 1, \dots, m-1\}$. Altenkirch has previously demonstrated that $\overline{m+n}$ has as many elements as $\overline{m} + \overline{n}$, where the $+$ on the left is adding numbers and the $+$ on the right is the disjoint union of types. Similarly, multiplication of numbers corresponds here to the type-theoretic product, and as you observe, the function type $\overline{n}\rightarrow\overline{m}$ has the same number of elements as the type $\overline{m^n}$, namely $m^n$.
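If it helps to see this machine-checked, here is a minimal sketch in Lean 4 (a recent toolchain with the built-in `omega` tactic is assumed, no Mathlib; the names `sumToFin` and `finToSum` are mine, not Altenkirch's) of the two maps underlying the first correspondence, with `Fin m` playing the role of $\overline{m}$. The product and function-type cases work similarly.

```lean
-- `Fin m` is Lean's type with exactly m elements: {0, 1, …, m-1}.
-- Disjoint union ↔ addition: the two maps below are mutually inverse,
-- so `Fin m ⊕ Fin n` has as many elements as `Fin (m + n)`.
def sumToFin {m n : Nat} : Fin m ⊕ Fin n → Fin (m + n)
  | .inl i => ⟨i.val, by have := i.isLt; omega⟩      -- left summand keeps its index
  | .inr j => ⟨m + j.val, by have := j.isLt; omega⟩  -- right summand is shifted past m

def finToSum {m n : Nat} (k : Fin (m + n)) : Fin m ⊕ Fin n :=
  if h : k.val < m then
    .inl ⟨k.val, h⟩
  else
    .inr ⟨k.val - m, by have := k.isLt; omega⟩
```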
We now move on to the dependent product and sum.
Altenkirch's arithmetic claims for these are that for $m : \mathbb{N}$ and $f : \overline{m} \rightarrow \mathbb{N}$, we have:
- The $\Sigma$-type $\Sigma x:\overline{m}\mathbin{.}f\,x$ has $f(0) + f(1) + \cdots + f(m-1)$ elements, and
- the $\Pi$-type $\Pi x:\overline{m}\mathbin{.}f\,x$ has $f(0) \times f(1) \times \cdots \times f(m-1)$ elements.
(I've changed the notation a bit just because the original has some deliberate but potentially confusing overlap between numbers and their corresponding finite types.)
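To make the two claims concrete, here is a small Lean 4 sketch (again no Mathlib; `SigmaOverFin`, `PiOverFin`, and `fibreSize` are names I made up for illustration) of the two dependent types, together with one inhabitant of each in a concrete case.

```lean
-- The dependent sum and product over Fin m, where f : Fin m → Nat
-- gives the size of the fibre over each index.
abbrev SigmaOverFin (m : Nat) (f : Fin m → Nat) : Type :=
  (x : Fin m) × Fin (f x)   -- claimed to have f 0 + f 1 + ⋯ + f (m-1) elements

abbrev PiOverFin (m : Nat) (f : Fin m → Nat) : Type :=
  (x : Fin m) → Fin (f x)   -- claimed to have f 0 * f 1 * ⋯ * f (m-1) elements

-- Concrete case: m = 3 with fibres of sizes 1, 2, 3.
def fibreSize (x : Fin 3) : Nat := x.val + 1

-- An element of the Σ-type: the index 2 paired with the value 1 in its 3-element fibre.
example : SigmaOverFin 3 fibreSize := ⟨2, ⟨1, by decide⟩⟩

-- An element of the Π-type: choose 0 in every fibre (each fibre x.val + 1 is nonempty).
example : PiOverFin 3 fibreSize := fun x => ⟨0, Nat.succ_pos x.val⟩
```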
Your types of Booleans and Alphas are equivalent to $\overline{2}$ and $\overline{3}$.
When you form your $\Pi x:\mathrm{Boolean}\mathbin{.}X\,x$, this stands for the type of functions that take a Boolean and return a value whose type may vary depending on whether that Boolean was True or False. Here, $X : \mathrm{Boolean} \rightarrow \mathbf{Type}$ is what tells us the type of the result. A function $\phi$ of this $\Pi$-type is determined by the pair of values $(\phi(\mathrm{True}), \phi(\mathrm{False}))$ it takes in $X\;\mathrm{True} \times X\;\mathrm{False}$. For example, if $X\;\mathrm{True}$ is $\overline{7}$ and $X\;\mathrm{False}$ is $\overline{4}$, then a perfectly good function would be the one for which $\phi(\mathrm{True}) = 5$ and $\phi(\mathrm{False}) = 1$. We can see that for this $X$, the total number of possible functions is $7 \times 4$; and for a general $X$ it will be the product of the sizes of $X\;\mathrm{True}$ and $X\;\mathrm{False}$. That is what Altenkirch asserted in the special case of $m = 2$, and hopefully the behaviour for more general $m$ will now be clearer.
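Here is that example as a Lean 4 sketch (`Bool` standing in for your Boolean type; `X` and `phi` are just the names from the paragraph above):

```lean
-- X assigns a result type to each Boolean.
def X : Bool → Type
  | true  => Fin 7
  | false => Fin 4

-- A dependent function: the type of its result depends on its input.
def phi : (x : Bool) → X x
  | true  => (5 : Fin 7)   -- must land in X true  = Fin 7
  | false => (1 : Fin 4)   -- must land in X false = Fin 4
```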
Turning to $\Sigma$-types, you propose $\Sigma x : \mathrm{Boolean}\mathbin{.}X x$ with $X$ as above. Something of this type consists of a Boolean $x$, together with a value in $X\;x$. So it is either $(\mathrm{True}, y)$ where $y : X\;\mathrm{True}$, or it is $(\mathrm{False}, z)$ where $z : X\;\mathrm{False}$. Again following the example just above, if $X\;\mathrm{True}$ is $\overline{7}$ and $X\;\mathrm{False}$ is $\overline{4}$, then our complete repertoire of elements is:
$$
(\mathrm{True}, 0), (\mathrm{True}, 1), (\mathrm{True}, 2),(\mathrm{True}, 3),(\mathrm{True}, 4),(\mathrm{True}, 5),(\mathrm{True}, 6)\\
(\mathrm{False}, 0), (\mathrm{False}, 1), (\mathrm{False}, 2), (\mathrm{False}, 3)
$$
There are evidently $7 + 4$ inhabitants of this type. For a general $X$, we would get the sum of the sizes of $X\;\mathrm{True}$ and $X\;\mathrm{False}$, consistent with Altenkirch's statement in the case $m=2$.
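The same elements can be written as Lean 4 terms (the definition of `X` is repeated from the previous sketch so this snippet stands alone; `p` and `q` are illustrative names):

```lean
def X : Bool → Type
  | true  => Fin 7
  | false => Fin 4

-- Two of the 7 + 4 inhabitants listed above: (True, 5) and (False, 1).
def p : (x : Bool) × X x := ⟨true, (5 : Fin 7)⟩
def q : (x : Bool) × X x := ⟨false, (1 : Fin 4)⟩
```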
Yes, your argument is correct.
I will also address your more general question about the connection between propositional logic and mathematical proofs. For the most part, everyday mathematics is written in English (or another natural language) with a few symbols, so your proof is essentially:

> If $6$ divides $x$ then $x = 6k = 3(2k)$ for some integer $k$. That shows $x$ is also a multiple of $3$.
The formal argument with the truth table for implication is called for only if that is what your instructor requires at this stage of your education.
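For contrast, here is roughly what that one-line argument looks like when formalized, as a Lean 4 sketch (the statement, phrased with an explicit witness $k$, is my own rendering of it):

```lean
-- If x = 6 * k for some integer k, then x is also a multiple of 3,
-- witnessed by m = 2 * k.
example (x k : Int) (h : x = 6 * k) : ∃ m : Int, x = 3 * m :=
  ⟨2 * k, by omega⟩   -- omega closes the linear-arithmetic goal x = 3 * (2 * k)
```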
If you allow quantification over $\mathsf{Type}$, then $A \times B$ and $A + B$ have the following definitions as $\Pi$-types: $$ A + B := \Pi_{(C: \mathsf{Type})} (A \to C) \to (B \to C) \to C \\ A \times B := \Pi_{(C: \mathsf{Type})} (A \to (B \to C)) \to C $$ where every $\to$ could be further replaced by a $\Pi$-type, as you also illustrated. When it comes to $A \to B$ as a $\Sigma$-type, then (apart from the case where $A$ is some finite type$^*$) I am less convinced that it will work out. One thing to note is that $\Sigma$-types can also be defined in terms of $\Pi$-types. Given $P : A \to \mathsf{Type}$ we can define $$ \Sigma_{(a :A)} P(a) := \Pi_{(C : \mathsf{Type})} ( \Pi_{(a : A)} (P(a) \to C)) \to C $$
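These encodings can be written out directly; here is a sketch in Lean 4 (the primed names are mine). One caveat: Lean's universes are not impredicative, so quantifying over `Type` pushes the encoded types up into `Type 1`, but the shapes are exactly the ones displayed above.

```lean
-- Church-style encodings of +, ×, and Σ as Π-types over Type.
def Sum' (A B : Type) : Type 1 :=
  (C : Type) → (A → C) → (B → C) → C

def Prod' (A B : Type) : Type 1 :=
  (C : Type) → (A → B → C) → C

def Sigma' (A : Type) (P : A → Type) : Type 1 :=
  (C : Type) → ((a : A) → P a → C) → C

-- The expected introduction forms all exist:
def inl'  {A B : Type} (a : A) : Sum' A B := fun _ f _ => f a
def inr'  {A B : Type} (b : B) : Sum' A B := fun _ _ g => g b
def pair' {A B : Type} (a : A) (b : B) : Prod' A B := fun _ f => f a b
def mk'   {A : Type} {P : A → Type} (a : A) (pa : P a) : Sigma' A P :=
  fun _ f => f a pa
```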
$^*$ If we refer to $\mathsf{n}$ as the type with $n$ terms, then (depending on the exact meaning of $\equiv$) you could probably do something like: $$ \mathsf{(n+1)} \to B \equiv \Pi_{(x \, : \, \mathsf{(n+1)})} B \equiv B \times \Pi_{(x \, : \, \mathsf{n})} B \equiv \Sigma_{(b : B)} \, \Pi_{(x \, : \, \mathsf{n})} B \equiv \Sigma_{(b : B)} \, (\mathsf{n} \to B) $$ and then continue this until we reach $\mathsf{0} \to B \equiv \mathsf{1}$. This is really just imitating exponentiation of a number $a : \mathbb{N}$ by a natural number: $$ a^4 = a \cdot a^3 = \sum_{i = 1}^a a^3 = \sum_{i = 1}^a a \cdot a^2 = \dots = \sum_{i = 1}^a \sum_{j = 1}^a\sum_{k = 1}^a\sum_{l = 1}^a 1 $$
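The first step of that chain, $\mathsf{(n+1)} \to B \equiv B \times (\mathsf{n} \to B)$, can be exhibited as a pair of mutually inverse maps; here is a Lean 4 sketch with `Fin n` as the $n$-element type (`split` and `join` are names I chose). Since that particular $\Sigma$ is non-dependent, $\Sigma_{(b : B)} (\mathsf{n} \to B)$ is just the product $B \times (\mathsf{n} \to B)$, so this is the whole content of the step.

```lean
-- A function out of Fin (n + 1) is the same data as its value at 0
-- together with a function out of Fin n.
def split {B : Type} {n : Nat} (f : Fin (n + 1) → B) : B × (Fin n → B) :=
  (f 0, fun i => f i.succ)

def join {B : Type} {n : Nat} (p : B × (Fin n → B)) : Fin (n + 1) → B
  | ⟨0, _⟩     => p.1                                 -- the value at 0
  | ⟨j + 1, h⟩ => p.2 ⟨j, Nat.lt_of_succ_lt_succ h⟩   -- the values at 1, …, n
```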