I would say you have a chance of around 1 in 147 quintillion. More specifically, there are 67 games to pick, since all but one of the 68 teams must lose. Thus, assuming you pick each game with the flip of a coin, you've got a probability of $1/2^{67} = 1/147573952589676412928$ of picking everything correctly.
The computation is fairly simple. I'm assuming that the odds of picking a single game correctly are 50-50 or a probability of $1/2$. I'm also assuming that the choices are independent of one another so that their joint probabilities can be computed via multiplication. We then simply multiply 67 one-halves together to get the result.
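If you want to check the arithmetic, here is a minimal sketch of that fair-coin model (the only inputs are the 67 games and the 50-50 assumption from above):

```python
# Probability of a perfect 68-team bracket under a fair-coin model:
# 67 independent games, each picked correctly with probability 1/2.
games = 67
outcomes = 2 ** games          # number of equally likely brackets
probability = 1 / outcomes     # chance of getting all 67 picks right

print(outcomes)                # 147573952589676412928
```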
Your first link points to a USA Today article which in turn points to a page on Geekosystem which shows an image of a DePaul mathematics professor standing in front of a whiteboard with the computation $2^{63} = 9223372036854775808$. That suggests that the 9 quintillion number refers to the smaller 64-team bracket we had prior to the play-in games. My computation arises from the larger bracket that includes the four play-in games.
Edit: It might be worth mentioning that most online bracket challenge games start after the play-in games and, therefore, use the 64-team bracket.
Of course, some folks might be better than 50-50 at picking basketball games. Let's suppose you can pick a winner three fourths of the time. (Note that, earlier this month, Sports Illustrated's Seth Davis was 36 of 50 or just 72%.) Then, your odds would be $(3/4)^{67}$ or about 1 in 234 million. Still not very good.
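The three-fourths figure is just as quick to check, still assuming the 67 picks are independent:

```python
# Odds against a perfect bracket for a picker who is right 75% of the time,
# keeping the assumption that the 67 games are independent.
p_correct = 3 / 4
odds_against = 1 / p_correct ** 67

print(odds_against)   # roughly 2.35e8, i.e. about 1 in 234 million
```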
I find this line of reasoning rather too speculative, I'm afraid. :-)
First, you assert that $P(I) = 1/2$. I don't see any symmetry that justifies this kind of application of the principle of indifference; it's not as though the only difference between being selected for an interview and not being selected is the opportunity to go to the interview. From the job poster's perspective, they have limited time to interview candidates, whether five people apply for the posting, or five hundred.
But suppose we put that aside for the moment. You then assert that $P(J \mid I) = 1/4$, on the assumption that once half of the possibilities have been eliminated (presumably, the $\neg I$ portion), only half of the half remain, or $1/4$. But $P(J \mid I)$ is already a conditional probability—it expresses the probability of the more specific compound event $I$ and $J$ as a fraction of the probability of the condition $I$! If you are to apply the principle of indifference again, you should treat the two possibilities equally; that is, $P(J \mid I) = P(\neg J \mid I)$. And since, by excluded middle, $P(J \mid I)+P(\neg J \mid I) = 1$ necessarily, it should be the case that
$$
P(J \mid I) = P(\neg J \mid I) = \frac{1}{2}
$$
Instead, you have
$$
P(J \mid I) = \frac{1}{4}
$$
which implies
$$
P(\neg J \mid I) = \frac{3}{4}
$$
This has no more basis than the assertion that $P(I) = 1/2$, and violates the principle of indifference to boot. (Of course, I said that the principle of indifference shouldn't be applied here, but since you appear to want to do it...)
In other words, when you arrive at $1/4$, you are determining the probability that you get the interview and you get the job (and that's $P(I, J)$), not the probability that you get the job given that you got the interview (that's $P(J \mid I)$). If you want to analyze the problem that way, there's nothing wrong with that (modulo indifference), but then you should just leave that value alone, and not multiply, again, the probability that you get the interview. That's already been accounted for in the joint probability.
I think the most we can say about it is as follows: Suppose the probability of being selected for a given job posting is $\sigma$. Then, by independence, the probability of being selected for at least one of two job postings is $1-(1-\sigma)^2$; the probability of being selected for at least one of three job postings is $1-(1-\sigma)^3$; and the probability of being selected for at least one of $k$ job postings is $1-(1-\sigma)^k$. Even independence seems rather daring, but I can see a case for it better than I can for indifference.
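The "at least one" formula is easy to sanity-check numerically; here is a sketch in which the value $\sigma = 0.1$ is purely illustrative, not taken from the problem:

```python
# Probability of at least one success in k independent trials,
# each succeeding with probability sigma.
def at_least_one(sigma: float, k: int) -> float:
    return 1 - (1 - sigma) ** k

sigma = 0.1                    # illustrative guess, not from the problem
print(at_least_one(sigma, 1))  # ~0.1 for a single posting
print(at_least_one(sigma, 3))  # 1 - 0.9**3, i.e. ~0.271 for three postings
```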
Best Answer
This is a problem in which the complementary approach will be the most fruitful - let's instead consider how likely you are to not get a job. We know that, for an event $A$,
$$P(A) = 1 - P(\text{not} \; A)$$
That is to say, more relevant to your case,
$$P(\text{getting at least one job offer}) = 1 - P(\text{getting no job offers})$$
Since getting (or not getting) any one job doesn't affect your chances at any other job (i.e., the outcomes are independent), we know
$$\begin{align} P(\text{getting no offers}) &= (1 - P(\text{getting job \#1})) \\ &\times (1 - P(\text{getting job \#2})) \\ &\times (1 - P(\text{getting job \#3})) \\ &\;\;\vdots \\ &\times (1 - P(\text{getting job \#10})) \end{align}$$
With these two facts in mind, you should find it easy to complete.
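For concreteness, here is a minimal sketch putting the two facts together; the per-job probabilities below are placeholders, so substitute your own estimates:

```python
# P(at least one offer) = 1 - product of (1 - p_i) over independent applications.
# The probabilities below are placeholders, not real estimates.
from math import prod

p_offer = [0.1] * 10               # hypothetical: 10 jobs, 10% chance at each

p_no_offers = prod(1 - p for p in p_offer)
p_at_least_one = 1 - p_no_offers

print(p_at_least_one)              # ~0.651 with these placeholder numbers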