OP asks:
I used two very approximate and incorrect methods, and arrived at the same result... this implies that this result cannot be too far from the truth. But what is the truth?
One can easily find exact solutions with a computer algebra system ... which is helpful for the comparison you seek. In particular, if $p = \frac12$, then $X$ and $Y$ have joint pmf, say $f(x,y)$:
f = (2^-t Binomial[t, x]) ( 2^-t Binomial[t, y]);
domain[f] = {{x, 0, t}, {y, 0, t}} && {t > 0} && {Discrete};
... where, in your setup, $t = 4n$. One can then find exact solutions for any given value of $t$, or your $n$. In particular, $E[\min(X,Y)]$ for $n$ = 1 to 10 can be calculated exactly as:
sol = Table[ Expect[Min[x, y], f /. t -> 4 n], {n, 1, 10}]
which returns:
$$\left\{\frac{93}{64},\frac{26333}{8192},\frac{10554795}{2097152},\frac{1846943453}{268435456},\frac{1202081373695}{137438953472},\frac{186920529770667}{17592186044416},\frac{56357790507521559}{4503599627370496},\frac{8307059966383480541}{576460752303423488},\frac{9629671370833818977607}{590295810358705651712},\frac{1376773263601616247805695}{75557863725914323419136}\right\}$$
where I am using the Expect function from the mathStatica add-on to Mathematica.
To a few decimal places, the above exact solution is:
{1.45313, 3.21448, 5.03292, 6.8804, 8.74629, 10.6252, 12.5139, 14.4105, 16.3133, 18.2214}
... whereas your approximate solution of $2 n-\sqrt{\frac{n}{\pi }}$ yields:
{1.43581, 3.20212, 5.02279, 6.87162, 8.73843, 10.618, 12.5073, 14.4042, 16.3074, 18.2159}
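The exact values can also be cross-checked without any computer algebra system, by brute-force summation over the joint pmf. Here is a minimal Python sketch (function names are mine), assuming, as above, that $X$ and $Y$ are independent $\text{Binomial}(t, \frac12)$ with $t = 4n$:

```python
from fractions import Fraction
from math import comb, pi, sqrt

def expected_min(t):
    """Exact E[min(X, Y)] for independent X, Y ~ Binomial(t, 1/2)."""
    total = Fraction(0)
    for x in range(t + 1):
        for y in range(t + 1):
            # Joint pmf: product of the two binomial pmfs, each 2^-t Binomial[t, .]
            p = Fraction(comb(t, x) * comb(t, y), 4 ** t)
            total += min(x, y) * p
    return total

for n in range(1, 4):
    exact = expected_min(4 * n)
    approx = 2 * n - sqrt(n / pi)
    print(n, exact, float(exact), approx)
```

For $n = 1$ this returns $\frac{93}{64} \approx 1.453125$, agreeing with the mathStatica output above.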
In summary: your approximate solution is not only elegant, but appears to work remarkably well! However, your stated exact solution $E[M]$ appears to be incorrect.
If you would like to solve it yourself, $E[\min(X,Y)]$ can also be expressed as the following sum:
$$ \sum_{y=0}^t \text{expr} $$
where:
expr = 2^(-1 - 2 t) Binomial[t, y] (2^t t - 2 Binomial[t, 1 + y] Hypergeometric2F1[1, 1 - t + y, 2 + y, -1] - 2 Binomial[t, 2 + y] Hypergeometric2F1[2, 2 - t + y, 3 + y, -1])
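Whenever the accompanying binomial coefficient is non-zero, the second argument of each $_2F_1$ above is a non-positive integer, so the hypergeometric series terminates and the sum can be evaluated in exact rational arithmetic. A hypothetical Python translation of the sum (my own function names, not mathStatica code) would be:

```python
from fractions import Fraction
from math import comb

def hyp2f1_terminating(a, b, c, z):
    """2F1(a, b; c; z) when b is a non-positive integer, so the series is finite."""
    assert b <= 0
    total, term = Fraction(0), Fraction(1)
    for n in range(-b + 1):
        total += term
        # Ratio of consecutive hypergeometric series terms
        term *= Fraction((a + n) * (b + n) * z, (c + n) * (n + 1))
    return total

def expected_min_sum(t):
    """E[min(X, Y)] via the closed-form sum over y given in the answer."""
    total = Fraction(0)
    for y in range(t + 1):
        bracket = Fraction(2 ** t * t)
        if comb(t, y + 1):  # skip terms whose binomial factor vanishes
            bracket -= 2 * comb(t, y + 1) * hyp2f1_terminating(1, 1 - t + y, 2 + y, -1)
        if comb(t, y + 2):
            bracket -= 2 * comb(t, y + 2) * hyp2f1_terminating(2, 2 - t + y, 3 + y, -1)
        total += Fraction(comb(t, y), 2 ** (1 + 2 * t)) * bracket
    return total
```

For $t = 4$ this reproduces $\frac{93}{64}$, matching the table of exact values above.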
Note:
This answer uses the mathStatica package for Mathematica. As disclosure, I should add that I am one of the authors.
This seems like a straightforward problem in conditional probability. Let $W_1$ be the event of winning the first game, $W_2$ be the event of winning the second game, and $F_b, F_i, F_m$ be the events of facing a beginner, an intermediate player, and a master.
We are given the unconditional probabilities $P(F_b) = P(F_i) = P(F_m) = \frac13$, and the conditional probabilities
$$
P(W_1 \mid F_b) = P(W_2 \mid F_b) = \frac{9}{10}
$$
$$
P(W_1 \mid F_i) = P(W_2 \mid F_i) = \frac{5}{10}
$$
$$
P(W_1 \mid F_m) = P(W_2 \mid F_m) = \frac{3}{10}
$$
We are asked to find $P(W_2 \mid W_1)$. You have already observed that $P(W_1) = \frac{17}{30}$. From Bayes's theorem, we can write
$$
P(F_b \mid W_1) = \frac{P(W_1 \mid F_b)P(F_b)}{P(W_1)} = \frac{9}{17}
$$
$$
P(F_i \mid W_1) = \frac{P(W_1 \mid F_i)P(F_i)}{P(W_1)} = \frac{5}{17}
$$
$$
P(F_m \mid W_1) = \frac{P(W_1 \mid F_m)P(F_m)}{P(W_1)} = \frac{3}{17}
$$
By the law of total probability, we have
\begin{align}
P(W_2 \mid W_1) & = P(W_2 \mid W_1, F_b) P(F_b \mid W_1) \\
& + P(W_2 \mid W_1, F_i) P(F_i \mid W_1) \\
& + P(W_2 \mid W_1, F_m) P(F_m \mid W_1)
\end{align}
Because $W_1$ and $W_2$ are conditionally independent given the opponent, we can rewrite this as
\begin{align}
P(W_2 \mid W_1) & = P(W_2 \mid F_b) P(F_b \mid W_1) \\
& + P(W_2 \mid F_i) P(F_i \mid W_1) \\
& + P(W_2 \mid F_m) P(F_m \mid W_1) \\
& = \frac{9}{10} \times \frac{9}{17}
+ \frac{5}{10} \times \frac{5}{17}
+ \frac{3}{10} \times \frac{3}{17} \\
& = \frac{115}{170} = \frac{23}{34}
\end{align}
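As a quick sanity check, the whole computation can be carried out in exact arithmetic; here is a small Python sketch (variable names are my own):

```python
from fractions import Fraction

# Uniform prior over opponent types, and P(win a game | opponent type)
priors = {"beginner": Fraction(1, 3), "intermediate": Fraction(1, 3), "master": Fraction(1, 3)}
win = {"beginner": Fraction(9, 10), "intermediate": Fraction(5, 10), "master": Fraction(3, 10)}

# P(W1) by total probability
p_w1 = sum(win[o] * priors[o] for o in priors)          # 17/30

# P(W1 and W2): W1 and W2 are conditionally independent given the opponent
p_w1_w2 = sum(win[o] ** 2 * priors[o] for o in priors)  # 115/300

p_w2_given_w1 = p_w1_w2 / p_w1
print(p_w2_given_w1)  # 23/34
```

Note that $P(W_2 \mid W_1) = \frac{23}{34} > \frac{17}{30} = P(W_2)$: winning the first game makes it more likely the opponent is weak, which in turn makes a second win more likely.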
P.S. I realize it's just a framing device, but I can't believe a single chess player could win $90$ percent against a beginner, $50$ percent against an intermediate player, and $30$ percent against a master.
When you are talking about two random variables, you need to think about their joint distribution - so, rather than talking about $P(X=i)$, you need to talk about $P(X=i\text{ and }Y=j)$, or, as we usually write it, $P(X=i,Y=j)$.
If it helps, think of it as randomly choosing a vector with two components - then calling the first component $X$ and the second component $Y$. You can think of $X$ and $Y$ as the separate outcomes of two experiments - which may or may not be related. So, $X$ could be how much you win in the first hand of poker, and $Y$ how much you win in the second. Then $X+Y$ is how much you won in the first two hands together.
With this in hand, for a function $f(x,y)$, we can define (for variables that take discrete values)
$$ \mathbb{E}[f(X,Y)]=\sum_{x,y}f(x,y)\cdot P(X=x, Y=y). $$
So, in your particular case,
$$ \mathbb{E}[X+Y]=\sum_{x,y}(x+y)P(X=x,Y=y)=\sum_{x,y}xP(X=x,Y=y)+\sum_{x,y}yP(X=x,Y=y). $$
Consider the first of these sums. Note that
$$ \sum_{x,y}xP(X=x,Y=y)=\sum_{x}x\sum_{y}P(X=x,Y=y). $$
The inner sum here is precisely $P(X=x)$: the event "$X=x$" is the same as the event "$X=x$ and $Y$ takes any value", whose probability is exactly this sum. So,
$$ \sum_{x,y}xP(X=x,Y=y)=\sum_{x}x\sum_{y}P(X=x,Y=y)=\sum_{x}xP(X=x)=\mathbb{E}[X]. $$
Similarly,
$$ \sum_{x,y}yP(X=x,Y=y)=\mathbb{E}[Y], $$
and combining these gives the formula
$$ \mathbb{E}[X+Y]=\mathbb{E}[X]+\mathbb{E}[Y]. $$
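To see the identity concretely, here is a small numerical check on a joint pmf I made up for illustration. Note that this particular $X$ and $Y$ are *not* independent, which underlines that linearity of expectation needs no independence assumption:

```python
from fractions import Fraction

# An arbitrary joint pmf P(X=x, Y=y); the four probabilities sum to 1
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 2),
}

# E[X + Y] computed directly from the joint distribution
e_sum = sum((x + y) * p for (x, y), p in joint.items())

# E[X] and E[Y] computed from the joint distribution, summing out the other variable
e_x = sum(x * p for (x, _), p in joint.items())
e_y = sum(y * p for (_, y), p in joint.items())

print(e_sum, e_x + e_y)  # the two values agree, as the argument above predicts
```

Here $e_x = \frac58$, $e_y = \frac34$, and both `e_sum` and `e_x + e_y` come out to $\frac{11}{8}$, even though $P(X=1,Y=1)=\frac12 \neq \frac58\cdot\frac34$.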