Kolmogorov: a question on the definition of independence from his book

math-history, probability-theory

I'm reading Kolmogorov's *Foundations of the Theory of Probability*, page 9, which explains independence, and I have questions. Kolmogorov gives the pictured Definition I for independence. The definition is similar to what more modern books provide, with the exception of the inclusion of the experiment number.

The definition is confusing because an experiment $\mathfrak{A}^{(n)}$ is a decomposition of events: a set of pairwise disjoint sets whose union is $E$. Kolmogorov typically uses $q_1, q_2, \dots, q_n$ to denote elementary events (points of the sample space). From what I read, the definition involves the probability of an intersection of one set taken from each experiment's decomposition.

My question is: is the above explanation correct? If so, Definition I would always give $0$, because an intersection of elementary events $A^{(1)}_{q_1} \cap A^{(2)}_{q_2} \cap \dots \cap A^{(n)}_{q_n}$ would be empty, so its probability would be $0$. I think I have read something wrong, because there should be the possibility that these events intersect.

Appreciate any guidance on understanding of Kolmogorov's grammatical and symbolic explanation of independence mentioned here.


Best Answer

$A_{q_i}^{(i)}$ isn't an elementary event; it is a set of elementary events, i.e. an event in its own right.

Each experiment $\mathfrak{A}^{(i)}$ is a decomposition of $E$ into sets $A_j^{(i)}$. Sets such as $A_{q_1}^{(1)}$ and $A_{q_2}^{(2)}$ come from different decompositions, so they can intersect; only sets from the *same* decomposition must be disjoint.
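As a sanity check, here is a minimal sketch (not from Kolmogorov's book; the two-coin-toss space and all names are my own choice) showing two decompositions of a sample space $E$ whose members intersect, while the product formula of Definition I still holds:

```python
from fractions import Fraction
from itertools import product

# Sample space E: two tosses of a fair coin.
# Elementary events q are pairs (first toss, second toss).
E = set(product("HT", repeat=2))

# Experiment 1: decomposition of E by the FIRST toss.
exp1 = {x: {q for q in E if q[0] == x} for x in "HT"}
# Experiment 2: decomposition of E by the SECOND toss.
exp2 = {y: {q for q in E if q[1] == y} for y in "HT"}

def P(A):
    """Uniform probability measure on E."""
    return Fraction(len(A), len(E))

# Each experiment really is a decomposition: sets from the SAME
# experiment are disjoint and their union is E.
assert exp1["H"] & exp1["T"] == set()
assert exp1["H"] | exp1["T"] == E

# Sets from DIFFERENT decompositions are NOT elementary events
# and can intersect:
A1_H = exp1["H"]   # "first toss is H"  -> {(H,H), (H,T)}
A2_T = exp2["T"]   # "second toss is T" -> {(H,T), (T,T)}
assert A1_H & A2_T == {("H", "T")}   # nonempty intersection

# Independence in Kolmogorov's sense: for every choice of one set
# from each decomposition, P(A1 ∩ A2) = P(A1) * P(A2).
for A1 in exp1.values():
    for A2 in exp2.values():
        assert P(A1 & A2) == P(A1) * P(A2)
```

So an intersection like $A^{(1)}_{q_1} \cap A^{(2)}_{q_2}$ is generally nonempty precisely because the two factors come from different decompositions; the probability on the right-hand side of Definition I need not be $0$.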