First, this construction of the common probability space is valid only if the experiments are independent. If they are, then by definition (and, from the theoretical point of view, only by definition) we have
$$P(A_1\cap A_2 \cap \cdots \cap A_n)=P_1(A_1)P_2(A_2)\cdots P_n(A_n) \ \ \ \ (*)$$
if the occurrence of $A_j$ can be decided by observing only the $j^{th}$ experiment; that is, when $A_j\in F_j$ for all $j$.
However, when we consider $\Omega_1 \times \Omega_2 \times \cdots \times \Omega_n$, we may have to compute the probability of events that cannot be written in the form of $(*)$; that is, events whose probability cannot be calculated as a product of probabilities.
The best approach is to consider the simplest possible example: flipping a coin $n$ times. The event that exactly one tails occurred during the $n$ experiments is not an event whose occurrence could be decided by observers who can see only one of the experiments. Let this event be denoted by $T$. That is, let
$$T=\{\text{exactly one tails occurred during the experiments}\}.$$
So, when we consider $F_1 \times F_2 \times \cdots \times F_n$, we talk about events like the one mentioned above.
If, for instance, $n=3$ and $$\Omega_1=\{h_1,t_1\},\Omega_2=\{h_2,t_2\},\Omega_3=\{h_3,t_3\}$$ and $$F_1=\{\emptyset, \{h_1\},\{t_1\},\{h_1,t_1\}\},$$ $$F_2=\{\emptyset, \{h_2\},\{t_2\},\{h_2,t_2\}\},$$ $$F_3=\{\emptyset, \{h_3\},\{t_3\},\{h_3,t_3\}\}$$ then
$$T\not \in F_j, \quad j=1,2,3,$$
since the elements of $T$ are triplets, not single outcomes; on the other hand,
$$T=\{(t_1,h_2,h_3),(h_1,t_2,h_3),(h_1,h_2,t_3)\}\in F_1\times F_2 \times F_3.$$
So, $F_1\times F_2 \times F_3$ consists of those events that can be defined if all the experiments are taken into account.
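The event $T$ above can be checked by direct enumeration. The sketch below builds the product sample space and picks out the triplets with exactly one tails; the encoding of outcomes as `(face, coin index)` pairs is my own illustrative choice, not part of the original construction.

```python
from itertools import product

# Hypothetical encoding: outcome ("t", 1) stands for t_1, ("h", 2) for h_2, etc.
omegas = [[("h", j), ("t", j)] for j in (1, 2, 3)]

# Omega_1 x Omega_2 x Omega_3: all 2^3 = 8 triplets.
sample_space = list(product(*omegas))

# T = "exactly one tails occurred during the three flips".
T = [w for w in sample_space if sum(f == "t" for f, _ in w) == 1]
print(len(sample_space), len(T))  # 8 triplets in total, 3 of them in T
```

The three triplets found are exactly $(t_1,h_2,h_3)$, $(h_1,t_2,h_3)$, $(h_1,h_2,t_3)$, and no single product $A_1\times A_2\times A_3$ equals this set.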
Many questions arise when one wants to construct these sets and when one wants to extend the probabilities given in the form $(*)$ to the events generated in $F_1 \times F_2 \times \cdots \times F_n$.
To understand all the details you will have to study measure theory, the theoretical basis of the calculations related to probability. Go google or wiki!
EDITED (Equation 2)
Let's stick to the example of flipping a coin three times. Say $A_1=\{t_1\}$, $A_2=\{t_2\}$, $A_3=\{t_3\}$; that is, all three experiments resulted in tails. Let $P_1(A_1)=P_2(A_2)=P_3(A_3)=\frac {1}{2}$.
We have a specific event (set) now:
$$\{(t_1,t_2,t_3)\}\subset\Omega_1 \times \Omega_2 \times \Omega_3.$$
A possible notation for this event is $$\{(t_1,t_2,t_3)\}=A_1\times A_2 \times A_3.$$
The fact that the first result was tails ($A_1$) can be expressed in $\Omega_1 \times \Omega_2 \times \Omega_3$ as
$$"A_1"=\{(t_1,t_2,t_3),(t_1,t_2,h_3),(t_1,h_2,t_3),(t_1,h_2,h_3)\}.$$
Obviously, the event that the first result was tails now has two representations: the first is $A_1=\{t_1\}$ (in $\Omega_1$) and the second is $"A_1"$ (in $\Omega_1 \times \Omega_2 \times \Omega_3$). I used quotation marks because, theoretically, $A_1$ and $"A_1"$ are two very different things: two different representations of the same event. So saying "the same event" is quite dangerous if one does not understand the theory behind it.
Said all that, we have $$P_1(A_1)=P("A_1")=P(\{(t_1,t_2,t_3),(t_1,t_2,h_3),(t_1,h_2,t_3),(t_1,h_2,h_3)\})=\frac{1}{2}.$$
You can write the same thing, if you like, the following way:
$$\{(t_1,t_2,t_3),(t_1,t_2,h_3),(t_1,h_2,t_3),(t_1,h_2,h_3)\}=A_1\times\Omega_2 \times \Omega_3$$
or
$$P(\{(t_1,t_2,t_3),(t_1,t_2,h_3),(t_1,h_2,t_3),(t_1,h_2,h_3)\})=P(A_1\times\Omega_2 \times \Omega_3)=\frac{1}{2}.$$
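The identity $P(A_1\times\Omega_2\times\Omega_3)=\frac{1}{2}$ can be verified by counting triplets. This is a minimal sketch, assuming the uniform measure on the 8 triplets; the `"ht"` encoding of heads and tails is my own shorthand.

```python
from itertools import product
from fractions import Fraction

# Uniform measure on the 8 triplets of Omega_1 x Omega_2 x Omega_3.
sample_space = list(product("ht", repeat=3))

def P(event):
    return Fraction(len(event), len(sample_space))

# "A_1" as the cylinder set A_1 x Omega_2 x Omega_3: first coordinate is tails.
A1_cylinder = [w for w in sample_space if w[0] == "t"]
print(P(A1_cylinder))  # 1/2, matching P_1(A_1)
```

Four of the eight triplets start with tails, so the cylinder set indeed has probability $\frac{4}{8}=\frac{1}{2}$.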
Apply the notation described above to $A_2$ and $A_3$, and use the assumption that the individual experiments are independent; then you will have
$$P(A_1\times A_2 \times A_3)=P(\{(t_1,t_2,t_3)\})=P_1(A_1)P_2(A_2)P_3(A_3)=\frac{1}{2^3}.$$
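The product-rule computation above can be confirmed numerically under the same assumed uniform measure on the 8 triplets:

```python
from itertools import product
from fractions import Fraction

sample_space = list(product("ht", repeat=3))

def P(event):
    # Uniform measure: each of the 8 triplets has probability 1/8.
    return Fraction(len(event), len(sample_space))

# A_1 x A_2 x A_3 = {(t_1, t_2, t_3)}: the single all-tails triplet.
all_tails = [w for w in sample_space if w == ("t", "t", "t")]
print(P(all_tails))  # 1/8 = (1/2)^3, the product rule
```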
EDITED 2
I've just realized that it may be a problem to see why
$$A_1 \times A_2 \times A_3=$$
$$=(A_1 \times \Omega_2 \times \Omega_3) \cap (\Omega_1 \times A_2 \times \Omega_3) \cap (\Omega_1 \times \Omega_2 \times A_3).$$
If you think in terms of triplets like $(t_1,h_2,t_3)$ then you will understand why $$P("A_1"\cap "A_2" \cap "A_3")=P(A_1 \times A_2 \times A_3).$$
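Thinking in terms of triplets can itself be mechanized. The sketch below (with my own 0-based coordinate indexing) intersects the three cylinder sets and recovers the product set:

```python
from itertools import product

sample_space = set(product("ht", repeat=3))

# Cylinder "A_j": the j-th coordinate is tails (0-based index here).
def cylinder(j):
    return {w for w in sample_space if w[j] == "t"}

# Intersecting the three cylinders pins every coordinate to tails,
# which is exactly the product set A_1 x A_2 x A_3.
intersection = cylinder(0) & cylinder(1) & cylinder(2)
print(intersection)  # the single all-tails triplet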
EDITED:
How to justify the product rule (in the case of classical probabilities).
In the case of a classical probability field, the number of elements of $\Omega$ is finite, say $\Omega=\{1,2,\cdots,N\}$. The possible events are modeled by the class of all subsets of $\Omega$, and the single-element events are equally likely: $P(\{i\})=\frac{1}{N}, \ i=1,2, \cdots, N.$
When we model two experiments of this kind, we consider $\Omega \times \Omega=\{(i,j): i,j=1,2,\cdots, N\}$, a set of $N^2$ elements (and we consider all the possible subsets of $\Omega \times \Omega$).
This is not yet a classical probability space because we have not yet MADE A DECISION about the probabilities of the simplest events of the kind of $\{(i,j)\}$. If we DECLARE that the new probability space is still a classical one then, by definition, we say that $$P'(\{(i,j)\})=P'(\{(k,l)\})=\frac{1}{N^2}=\frac{1}{N}\times \frac{1}{N}=P(\{i\})P(\{j\}).$$
So, from the point of view of the philosophy behind this act of creation, it is not the product rule that we use, but the theoretically still non-justifiable hypothesis that the product space is a classical one. Experiments can falsify this hypothesis, but it can never be verified apodictically.
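The declaration $P'(\{(i,j)\})=\frac{1}{N^2}$ and its agreement with the singleton product can be spelled out in a few lines; the choice $N=6$ (a die roll) is only an illustrative assumption.

```python
from fractions import Fraction

N = 6  # hypothetical example: Omega = {1, ..., N}, e.g. one die roll

# Single-experiment classical probability: P({i}) = 1/N.
p_single = Fraction(1, N)

# DECLARING the product space classical assigns each pair (i, j)
# probability 1/N^2, which coincides with the product of singletons:
p_pair = Fraction(1, N**2)
print(p_pair == p_single * p_single)  # True: the product rule by declaration
```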
I have spoken!
Best Answer
If $\pi_j:\Omega\to\Omega_j$ denotes the $j$th projection map, i.e.
$$ \pi_j(\omega_1,\ldots,\omega_n)=\omega_j, $$
then the $j$th marginal distribution of $P$ is the probability measure
$$ P_j'(A):=P(\pi_j^{-1}(A)),\quad A\in \mathcal{F}_j, $$
where $\mathcal{F}_j$ is the sigma-algebra on $\Omega_j$. But for $A\in\mathcal{F}_j$ we have
$$ \pi_j^{-1}(A)=\{(\omega_1,\ldots,\omega_n)\in\Omega\mid \omega_j\in A\}=\Omega_1\times\cdots\times A\times\cdots \times\Omega_n, $$
where $A$ is in the $j$th position, and hence
$$ \begin{align} P_j'(A)&=P(\Omega_1\times\cdots\times A\times\cdots \times\Omega_n)\\ &=P_1(\Omega_1)\cdots P_j(A)\cdots P_n(\Omega_n)=P_j(A). \end{align} $$
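The marginal-via-preimage construction can be sketched for the three-coin example, again assuming the uniform product measure; the function names here are my own.

```python
from itertools import product
from fractions import Fraction

sample_space = list(product("ht", repeat=3))

def P(event):
    # Uniform product measure on the 2^3 triplets.
    return Fraction(len(event), len(sample_space))

# j-th marginal P_j'(A) = P(pi_j^{-1}(A)): pull A back along the projection.
def marginal(j, A):
    preimage = [w for w in sample_space if w[j] in A]
    return P(preimage)

# Each marginal recovers the single-coin probability P_j({t}) = 1/2.
print(all(marginal(j, {"t"}) == Fraction(1, 2) for j in range(3)))  # True
```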