I think this can be done by induction on $k$.
The base case $k=1$ is trivial: $\mathbb{P}(X_1\leq x)=x=\frac{x^1}{1!}$ since $X_1$ is uniform on $[0,1]$.
So assume the statement is true for all $0\leq x\leq 1$ and for a fixed $k$. We want to show it for $k+1$.
I suspect there is a more elegant way of doing the induction step directly with an integral, but I am struggling to make that formally correct, so I will write down a more elementary version.
Pick an arbitrary natural number $N$. Note that if the event $X_1+\dots+X_{k+1}\leq x$ occurs, then there is some $1\leq n\leq N$ such that $X_{k+1}\in [\frac{(n-1)x}{N},\frac{nx}{N}]$, and in that case $X_1+\dots+X_k\leq x-X_{k+1}\leq x-\frac{(n-1)x}{N}$. Therefore, by the union bound and the independence of $X_{k+1}$ from $X_1,\dots,X_k$, setting $P:=\mathbb{P}(X_1+\dots+X_k+X_{k+1}\leq x)$, we have
$P\leq \sum_{n=1}^N \mathbb{P}(X_{k+1}\in [\frac{(n-1)x}{N},\frac{nx}{N}])\cdot\mathbb{P}(X_1+\dots+X_k\leq x-\frac{(n-1)x}{N})$.
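As a sanity check on this discretized bound (not part of the proof), here is a small Monte Carlo sketch in Python, assuming numpy is available; both sides of the inequality are estimated empirically for the illustrative choices $k=2$, $N=10$, $x=0.7$.

```python
import numpy as np

rng = np.random.default_rng(0)
k, N, x, trials = 2, 10, 0.7, 10**6

# Left-hand side: P(X_1 + ... + X_{k+1} <= x), estimated empirically.
lhs = np.mean(rng.random((trials, k + 1)).sum(axis=1) <= x)

# Right-hand side: sum over n of
#   P(X_{k+1} in [(n-1)x/N, nx/N]) * P(X_1 + ... + X_k <= x - (n-1)x/N);
# the first factor equals x/N because X_{k+1} is uniform on [0,1].
partial_sums = rng.random((trials, k)).sum(axis=1)
rhs = sum((x / N) * np.mean(partial_sums <= x - (n - 1) * x / N)
          for n in range(1, N + 1))

print(lhs, rhs)  # roughly 0.057 <= 0.066, consistent with the bound
```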
Applying the induction hypothesis and the fact that $X_{k+1}$ is uniformly distributed, this means
\begin{equation*}
\begin{split}
P & \leq \sum_{n=1}^N \frac{x}{N}\,\mathbb{P}\Big(X_1+\dots+X_k\leq x-\frac{(n-1)x}{N}\Big) \\
& = \frac{x}{N} \sum_{n=1}^{N} \mathbb{P}\Big(X_1+\dots+X_k\leq \frac{nx}{N}\Big) \\
& \leq \frac{x}{N}\sum_{n=1}^{N} \Big(\frac{n}{N}\Big)^k\frac{x^k}{k!} \\
& = \frac{x^{k+1}}{k!} \cdot \frac{1}{N} \sum_{n=1}^{N} \Big(\frac{n}{N}\Big)^k,
\end{split}
\end{equation*}
where the equality in the second line is the reindexing $n\mapsto N-n+1$, using $x-\frac{(n-1)x}{N}=\frac{(N-n+1)x}{N}$.
Note however that $\frac{1}{N} \sum_{n=1}^{N} \big(\frac{n}{N}\big)^k$ tends to $\int_0^1 t^k\,dt =\frac{1}{k+1}$ as $N\to\infty$: the map $t\mapsto t^k$ is Riemann integrable, and the expression is exactly its upper Riemann sum for the equidistant partition of $[0,1]$ with step size $\frac{1}{N}$.
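To see the convergence numerically (it is from above, since $t\mapsto t^k$ is increasing), here is a tiny illustrative snippet with $k=3$:

```python
# (1/N) * sum_{n=1}^N (n/N)^k is an upper Riemann sum of t^k on [0, 1],
# so it decreases towards 1/(k+1) as N grows.
k = 3
for N in (10, 100, 1000, 10000):
    riemann = sum((n / N) ** k for n in range(1, N + 1)) / N
    print(N, riemann)  # 0.3025, 0.25502..., 0.25050..., 0.25005... -> 1/4
```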
Since we chose $N$ arbitrarily, we can pass to the limit $N\to\infty$ and obtain $P\leq\frac{x^{k+1}}{k!}\cdot\frac{1}{k+1}=\frac{x^{k+1}}{(k+1)!}$, which completes the induction.
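Finally, here is a quick Monte Carlo check of the statement itself (again a sketch, assuming numpy). The estimates come out essentially equal to $x^k/k!$ for $0\le x\le 1$, which suggests the bound is in fact an equality there.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
trials = 10**6

# Compare the empirical P(X_1 + ... + X_k <= x) with the bound x^k / k!.
for k in (1, 2, 3, 5):
    for x in (0.3, 0.7, 1.0):
        estimate = np.mean(rng.random((trials, k)).sum(axis=1) <= x)
        print(k, x, estimate, x**k / factorial(k))
```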
Best Answer
Let $G=\big|\{\,i : U_i<\gamma\,\}\big|$ and let $V$ be the value of the number you choose in step $3$. Then
$$\mathbb{P}\big(G=g\big)={m\choose g}\gamma^g(1-\gamma)^{m-g}\ ,$$
and given that $G=g$, there are $m-g$ of the variates $U_1,U_2,\dots,U_m$ that are (independently) uniformly distributed over the interval $[\gamma,1]$, and the minimum of them will be greater than $x\in[\gamma,1]$ if and only if all $m-g$ of them are. Therefore
$$\mathbb{P}\big(V> x\,|\,G=g\big)=\left(\frac{1-x}{1-\gamma}\right)^{m-g}$$
for $x\in[\gamma,1]$ and $g=0,1,\dots, m-1$ (when $g=m$, all the variates fall below $\gamma$ and $V$ is undefined). Therefore
\begin{align}
\mathbb{P}\big(V> x\big)&=\sum_{g=0}^{m-1}\mathbb{P}\big(V> x\,|\,G=g\big)\,\mathbb{P}\big(G=g\big)\\
&=\sum_{g=0}^{m-1}{m\choose g}\gamma^g(1-\gamma)^{m-g}\left(\frac{1-x}{1-\gamma}\right)^{m-g}\\
&=\sum_{g=0}^{m-1}{m\choose g}\gamma^g(1-x)^{m-g}\\
&=(1+\gamma-x)^m-\gamma^m\ ,
\end{align}
where the last step follows from the binomial theorem after adding and subtracting the missing $g=m$ term, $\gamma^m$. Hence
$$ \mathbb{P}\big(\{V\le x\}\cup\{V\ \text{is undefined}\}\big)=1+\gamma^m-(1+\gamma-x)^m\ . $$
If we take the probability, $\gamma^m$, of $V$'s being undefined as negligible, as the OP indicates in a comment, we get
$$ \mathbb{P}\big(V\le x\big)=1-(1+\gamma-x)^m $$
for the cumulative distribution function of $V$, and
$$ p_V(x)=m(1+\gamma-x)^{m-1} $$
for its density function.
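For what it's worth, here is a quick Monte Carlo sketch of this in Python (assuming numpy, and assuming, as the conditional computation above suggests, that $V$ is the minimum of those $U_i$ lying in $[\gamma,1]$, undefined when there are none). The empirical frequency of $\{V\le x\}\cap\{V\ \text{defined}\}$ should match $1-(1+\gamma-x)^m$ exactly, since that probability equals $1-\gamma^m-\big((1+\gamma-x)^m-\gamma^m\big)$.

```python
import numpy as np

rng = np.random.default_rng(2)
m, gamma, trials = 10, 0.05, 10**6

U = rng.random((trials, m))
masked = np.where(U >= gamma, U, np.inf)  # discard variates below gamma
V = masked.min(axis=1)                    # np.inf marks the undefined runs

for x in (0.1, 0.25, 0.5):
    # np.mean(V <= x) estimates P(V <= x and V is defined).
    print(x, np.mean(V <= x), 1 - (1 + gamma - x) ** m)
```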