For clarity, I prefer to separate the sample size from the size of the support of the distribution. That is, I will assume that each $X_i$ is uniform over $\{ 1, 2, \ldots, U \}$; the special case $U = n$ that the question is concerned with is not much simpler anyway. $\newcommand{\e}{\mathrm{e}} \newcommand{\Xmin}{X_\min}$
For $1 \leqslant r \leqslant U$, the probability that the minimum $\Xmin$ is at least $r$ is equal to
$$
\Pr(\Xmin \geqslant r) = \frac{(U-r+1)^n}{U^n}.
$$
This is because the event $\Xmin \geqslant r$ happens iff $X_i \geqslant r$ for all $i \in \{ 1, 2, \ldots, n \}$. For any fixed $i$, the probability of the latter event occurring is $\frac{U-(r-1)}{U}$. And as usual, by independence, the probability of all the $n$ events occurring simultaneously is just the product of the individual probabilities.
Therefore, by the tail-sum formula for the expectation of a non-negative integer-valued random variable (see the Wikipedia article on expected value),
$$
\mathbf{E}(\Xmin) = \sum_{r \geqslant 1} \Pr(\Xmin \geqslant r) = \sum_{1 \leqslant r \leqslant U} \frac{(U-r+1)^n}{U^n} = \frac{1}{U^n} \sum_{s = 1}^{U} s^n, \tag{$\ast$}
$$
by re-indexing the sum.
The final expression $(\ast)$ is essentially the sum of the $n^{th}$ powers of the first $U$ natural numbers. I don't think this can be simplified much further. :-)
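As a sanity check on $(\ast)$, here is a minimal Python sketch (not part of the argument; the values of `U`, `n`, and `trials` are arbitrary illustrative choices) comparing the formula with a brute-force Monte Carlo estimate:

```python
import random

def exact_expected_min(U, n):
    """E[min] of n i.i.d. uniform draws from {1, ..., U}, via (*): (1/U^n) * sum_{s=1}^U s^n."""
    return sum(s**n for s in range(1, U + 1)) / U**n

def simulated_expected_min(U, n, trials=200_000):
    """Monte Carlo estimate of the same expectation."""
    total = sum(min(random.randint(1, U) for _ in range(n)) for _ in range(trials))
    return total / trials

U, n = 10, 3
print("formula   :", exact_expected_min(U, n))      # 3.025 for U = 10, n = 3
print("simulation:", simulated_expected_min(U, n))  # should be close to the formula
```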
Some asymptotics. We can say a bit more about the original question (the special case $U = n$) as $n \to \infty$.
In this case, for constant $r$, the probability that $\Xmin \geqslant r$ is equal to
$$
\Big( 1 + \frac{1-r}{n} \Big)^n \to \e^{1-r}.
$$
Of course, if $r$ is not a constant but grows with $n$ (i.e., $r \to \infty$ as $n \to \infty$), then this probability goes to $0$. Thus, for fixed $r$, the probability that the minimum is exactly $r$ converges to
$$
\e^{1-r} - \e^{-r} = (\e-1) \cdot \e^{-r},
$$
which is a geometric distribution (starting at $1$, with success probability $1 - \frac{1}{\e}$). The expected value of the minimum therefore approaches
$$
\sum_{r \geqslant 1} \e^{1-r} = \frac{1}{1 - \frac{1}{\e}} = \frac{\e}{\e - 1}.
$$
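To see how quickly the exact value from $(\ast)$ (with $U = n$) approaches $\e/(\e-1) \approx 1.582$, here is a small Python sketch (the particular values of $n$ are arbitrary):

```python
import math

def exact_expected_min(U, n):
    # E[min] from (*): (1/U^n) * sum_{s=1}^U s^n
    return sum(s**n for s in range(1, U + 1)) / U**n

limit = math.e / (math.e - 1)  # ≈ 1.58198
for n in (5, 20, 100, 500):
    print(n, exact_expected_min(n, n), limit)
```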
For clarity, suppose that the dice have ID numbers $1,2,3,4$. Let $X_i$ be the result on die $i$. Let $Y$ be the sum of the three largest of the $X_i$, and let $W$ be the minimum of the $X_i$.
Then $Y=X_1+X_2+X_3+X_4-W$. By the linearity of expectation, it follows that
$$E(Y)=E(X_1)+E(X_2)+E(X_3)+E(X_4)-E(W).$$
The linearity of expectation is a very useful result. Note that linearity always holds: independence is not required.
The expectation of the minimum can be calculated by first finding the distribution of the minimum $W$.
The minimum is $1$ unless the dice all show a number $\ge 2$. The probability of this is $1-\left(\frac{5}{6}\right)^4$. We rewrite this as $\frac{6^4-5^4}{6^4}$.
The minimum is $2$ if all the dice are $\ge 2$ but not all are $\ge 3$. The probability of this is $\frac{5^4-4^4}{6^4}$.
The minimum is $3$ if all results are $\ge 3$ but not all are $\ge 4$. This has probability $\frac{4^4-3^4}{6^4}$.
And so on. Now use the ordinary formula for expectation. We get that the expectation of $W$ is
$$\frac{1}{6^4}\left(1(6^4-5^4)+ 2(5^4-4^4)+3(4^4-3^4)+4(3^4-2^4)+5(2^4-1^4)+6(1^4-0^4) \right).$$
We leave you the task of computing. Before computing, simplify!
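If you want to check your answer afterwards, here is a minimal Python sketch (purely a verification aid, not part of the argument) that enumerates all $6^4$ equally likely outcomes and compares the brute-force average of the minimum with the displayed expression:

```python
from itertools import product
from fractions import Fraction

# Brute force: average of the minimum over all 6^4 equally likely outcomes.
brute = Fraction(sum(min(roll) for roll in product(range(1, 7), repeat=4)), 6**4)

# The displayed expression: (1/6^4) * sum_j j * ((7-j)^4 - (6-j)^4) for j = 1, ..., 6.
formula = Fraction(sum(j * ((7 - j)**4 - (6 - j)**4) for j in range(1, 7)), 6**4)

print(brute, formula, brute == formula)
```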
Generalization: Suppose we toss $k$ "fair" $(n+1)$-sided dice, with the numbers $0$ to $n$ written on them. For $i=1$ to $k$, let $X_i$ be the number showing on the $i$-th die. Let $S$ be the sum of the dice. Then $S=X_1+\cdots+X_k$. The expectation of $X_i$ is $\frac{0+1+\cdots +n}{n+1}$. By the usual expression for the sum of consecutive integers, $E(X_i)=\frac{n}{2}$ and therefore $E(S)=\frac{kn}{2}$.
The analysis of the minimum $W$ goes along the same lines as the earlier one. The probability that the minimum is $j$ is $\frac{(n+1-j)^k -(n-j)^k}{(n+1)^k}$. If we use the ordinary formula for expectation, and simplify, we find that
$$E(W)=\frac{1^k+2^k+\cdots+n^k}{(n+1)^k}.$$
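The general formula is just as easy to test numerically; a minimal Python sketch, with dice labelled $0$ to $n$ as above (the particular $n$ and $k$ values are arbitrary):

```python
from itertools import product
from fractions import Fraction

def expected_min_formula(n, k):
    """E[W] = (1^k + 2^k + ... + n^k) / (n+1)^k for k fair dice labelled 0, ..., n."""
    return Fraction(sum(j**k for j in range(1, n + 1)), (n + 1)**k)

def expected_min_bruteforce(n, k):
    """Average of the minimum over all (n+1)^k equally likely outcomes."""
    return Fraction(sum(min(roll) for roll in product(range(n + 1), repeat=k)), (n + 1)**k)

for n, k in [(5, 3), (6, 4)]:
    print(n, k, expected_min_formula(n, k), expected_min_formula(n, k) == expected_min_bruteforce(n, k))
```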
A nice way to find $E(W)$: The following is a useful general result. Let $X$ be a random variable that only takes non-negative integer values. Then
$$E(X)=\sum_{i=1}^\infty \Pr(X\ge i).$$
We apply that to the case of the random variable $W$, which is the minimum of $X_1,\dots,X_4$. The probability that $W\ge i$ in that case is $\frac{(7-i)^4}{6^4}$ for $1 \le i \le 6$.
The same procedure works for the more general situation you asked about.
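For the four six-sided dice, this route takes one line; a small Python check (illustrative only), summing the tail probabilities $\Pr(W \ge i) = (7-i)^4/6^4$ for $i = 1, \ldots, 6$:

```python
from fractions import Fraction

# Tail-sum formula: E[W] = sum_{i>=1} P(W >= i), with P(W >= i) = (7-i)^4 / 6^4 here.
expected_min = sum(Fraction((7 - i)**4, 6**4) for i in range(1, 7))
print(expected_min)  # agrees with the distribution-based expression above
```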
For any two independent nonnegative random variables $X$ and $Y$, the random variable $Z=\min(X,Y)$ has expectation $$ E[Z]=\int_0^\infty P[Z\geqslant t]\,\mathrm dt =\int_0^\infty P[X\geqslant t]\,P[Y\geqslant t]\,\mathrm dt. $$ In your case, one can decompose the integral on the RHS as the sum of the integrals on the intervals $(0,1)$, $(1,2)$, and $(2,3)$, and compute $P[X\geqslant t]$ and $P[Y\geqslant t]$ separately for $t$ in each of these intervals. This yields:
Thus, $E[Z]=$ $1$ $+$ $______$ $+$ $______$ $=$ $______$.
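To illustrate the recipe without filling in the blanks, here is a minimal Python sketch using two distributions chosen purely for illustration, namely $X$ uniform on $(1,3)$ and $Y$ uniform on $(2,3)$; these are assumptions for the example, not the $X$ and $Y$ of your question. It integrates $P[X\geqslant t]\,P[Y\geqslant t]$ over each of the three intervals:

```python
def surv_X(t):  # hypothetical: X ~ Uniform(1, 3), so P[X >= t] is 1, (3-t)/2, or 0
    return 1.0 if t <= 1 else (3 - t) / 2 if t < 3 else 0.0

def surv_Y(t):  # hypothetical: Y ~ Uniform(2, 3), so P[Y >= t] is 1, 3-t, or 0
    return 1.0 if t <= 2 else 3 - t if t < 3 else 0.0

def integrate(f, a, b, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

pieces = [integrate(lambda t: surv_X(t) * surv_Y(t), a, b) for a, b in [(0, 1), (1, 2), (2, 3)]]
print(pieces, sum(pieces))  # the three interval contributions and E[min(X, Y)] for these illustrative choices
```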