OK, your problem with the 1st point is your interpretation of it. As I read it, it means that every outcome is equally likely, i.e. if you generate 100 true-random numbers from 0-9, you expect about 10 of each digit (the values are uniformly distributed over a defined interval or set). These probabilities never change, so your view that it might not give any 9's in the first 100 outcomes is, in my opinion, wrong. And even if we suppose no 9's came in the first 100 outcomes (purely as an assumption), you still can't say what will happen in the next generation, since by your own account it must at least follow point 2.

In the real world nothing is truly random; anything that is too complicated or too difficult to describe as a series is generally termed random. In the mathematical study of probability we assume true randomness, even though nothing physical actually has it. There is the butterfly effect, because of which even a butterfly flapping its wings in a different direction could eventually lead to a storm. So every minor thing counts, even the air a kilometer away from the place where the coin is tossed, and hence no accurate prediction can be made. If we were given a godly (infinitely powerful) computer that could calculate all of that, we could predict the future.

But there is something more: even if you take real-world variables, repeat the experiment many times (say $n$ times), and then estimate the probability of an event, that estimate approaches the true-random variable's probability as $n \to \infty$. For example, I just estimated the expected number of draws in the Coupon Collector's problem with a computer (yes, computers are not truly random), repeated it a crore (ten million) times (yes, computers are fast), and got about 14.6999, which is really 14.7. Think what would happen if we repeated it many, many more times (infinitely many, to be exact, though that is debatable). There are many other examples of such calculations which you can easily find.

I think that explains my view of randomness. :D
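For concreteness, here is a minimal sketch of the kind of simulation I described. I'm assuming 6 coupon types, since that is the case whose exact expectation is $6H_6 = 14.7$; I also use far fewer repetitions than a crore, but the estimate already lands near 14.7:

```python
import random

def coupon_collector_draws(n_types, rng=random):
    """Number of uniform draws needed to see every coupon type at least once."""
    seen = set()
    draws = 0
    while len(seen) < n_types:
        seen.add(rng.randrange(n_types))  # pseudo-random, not truly random
        draws += 1
    return draws

# Exact expectation for 6 types: 6 * (1 + 1/2 + ... + 1/6) = 14.7.
trials = 100_000
random.seed(1)
mean = sum(coupon_collector_draws(6) for _ in range(trials)) / trials
print(mean)  # close to 14.7
```

More trials shrink the statistical error roughly like $1/\sqrt{n}$, which is why ten million repetitions got four correct digits.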
The idea is to regard each choice of uniformly distributed random number $U_i \sim \operatorname{Uniform}(0,1)$ where $i = 1, 2, \ldots, N$, as associated with a Bernoulli trial $B_i \sim \operatorname{Bernoulli}(p)$, where $$B_i = \begin{cases}1, & 0 \le U_i < t \\ 0, & t \le U_i \le 1. \end{cases}$$ That is to say, $B_i = 1$ if and only if the random number $U_i$ is less than the threshold $t$.
Therefore the success parameter $p$ of the Bernoulli trial is $$\Pr[B_i = 1] = \Pr[U_i < t] = \frac{t}{1-0} = t, \quad 0 \le t \le 1.$$ The total of all these (independent) Bernoulli trials is $$T = B_1 + B_2 + \cdots + B_N \sim \operatorname{Binomial}(N, t).$$
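This construction is easy to check by simulation; a short Python sketch with hypothetical values $N = 1000$ and $t = 0.3$:

```python
import random

N, t = 1_000, 0.3
random.seed(0)
u = [random.random() for _ in range(N)]  # U_i ~ Uniform(0, 1)
T = sum(1 for ui in u if ui < t)         # count of successes B_i = 1
print(T)  # near the binomial mean N*t = 300
```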
Now, the probability that at least $n$ out of $N$ of the $U_i$s are less than $t$ is simply the probability that $T \ge n$: $$\Pr[T \ge n] = \sum_{k=n}^N \Pr[T = k] = \sum_{k=n}^N \binom{N}{k} t^k (1-t)^{N-k}.$$ This sum furnishes the exact probability of the desired event for a given $N$, $n$, and $t$.
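A direct Python implementation of this sum, shown on a small hypothetical example ($N = 10$, $n = 3$, $t = 0.25$):

```python
from math import comb

def prob_at_least(N, n, t):
    """Exact Pr[T >= n] for T ~ Binomial(N, t)."""
    return sum(comb(N, k) * t**k * (1 - t)**(N - k) for k in range(n, N + 1))

# Probability that at least 3 of 10 uniforms fall below 0.25.
p = prob_at_least(10, 3, 0.25)
print(p)  # about 0.4744
```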
However, if $\min(n,N-n)$ is very large, then this sum (or its complementary probability) will have many terms and be unwieldy to compute. In such a case, the normal approximation (with continuity correction) to the binomial distribution is desirable: $$\Pr[T \ge n] \approx \Pr\left[ Z \ge \frac{n - Nt - \frac{1}{2}}{\sqrt{Nt(1-t)}} \right] = 1 - \Phi \left( \frac{n - Nt - \frac{1}{2}}{\sqrt{Nt(1-t)}}\right),$$ where $Z \sim \operatorname{Normal}(0,1)$ is a standard normal random variable, and $\Phi(z) = \Pr[Z \le z]$ is the cumulative distribution function of $Z$ whose values can be looked up in a standard normal table.
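Instead of a table lookup, $\Phi$ can be evaluated with the error function; a Python sketch (the values $N = 10{,}000$, $n = 2{,}550$, $t = 0.25$ are made up for illustration):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF, Phi(z), via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def prob_at_least_normal(N, n, t):
    """Normal approximation (with continuity correction) to Pr[T >= n]."""
    z = (n - N * t - 0.5) / sqrt(N * t * (1 - t))
    return 1.0 - phi(z)

p = prob_at_least_normal(10_000, 2_550, 0.25)
print(p)  # about 0.126
```

For $N$ of this size the continuity-corrected approximation typically agrees with the exact binomial sum to about three decimal places.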
For your example of $N = 10^6$, $n = 50$, $t = 0.75$, the probability of such an event is about $1 - 4.00871338379431228438 \times 10^{-601806}$. The normal approximation gives $1 - 4.23635959294153500 \times 10^{-651360},$ which is also very close to $1$.
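Numbers this extreme underflow ordinary floating point, but the complementary probability $\Pr[T < 50]$ can still be evaluated in log space via `lgamma` and the log-sum-exp trick; a Python sketch:

```python
from math import exp, fsum, lgamma, log

def log_binom_pmf(N, k, t):
    """Natural log of Pr[T = k] for T ~ Binomial(N, t), via lgamma."""
    return (lgamma(N + 1) - lgamma(k + 1) - lgamma(N - k + 1)
            + k * log(t) + (N - k) * log(1 - t))

def log_prob_less_than(N, n, t):
    """Natural log of Pr[T < n], summed stably with log-sum-exp."""
    logs = [log_binom_pmf(N, k, t) for k in range(n)]
    m = max(logs)
    return m + log(fsum(exp(L - m) for L in logs))

N, n, t = 10**6, 50, 0.75
log10_p = log_prob_less_than(N, n, t) / log(10)
print(log10_p)  # about -601805.4, i.e. Pr[T < 50] is on the order of 10^-601806
```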
Best Answer
One minus the probability that they are all different: $$1 - \left( 1- \frac{1}{n} \right) \left( 1- \frac{2}{n} \right) \cdots \left( 1- \frac{k-1}{n} \right),$$ where $n=2,147,000,001$ (since you include $0$) and $k=1,000,000.$
See the birthday problem for more information.
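As a quick numerical check, the product above can be evaluated in log space; a Python sketch using the $n$ and $k$ from the answer:

```python
from math import expm1, fsum, log1p

n = 2_147_000_001  # number of possible values (including 0)
k = 1_000_000      # numbers generated

# log Pr[all k values distinct] = sum_{i=1}^{k-1} log(1 - i/n);
# log1p keeps precision since each i/n is tiny.
log_all_distinct = fsum(log1p(-i / n) for i in range(1, k))
p_collision = -expm1(log_all_distinct)  # 1 - Pr[all distinct], computed stably
print(p_collision)  # so close to 1 that it rounds to 1.0 in double precision
```

The exponent is roughly $-k^2/(2n) \approx -233$, so a repeat among the million numbers is a near certainty.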