Let $F$ denote the cdf of the uniform distribution over $[0,1]$, the common cdf of $X_1$ and $X_2$. Now, because of the independence of $X_1$ and $X_2$, we have
$$F_{X_1+2X_2}(y)=P(X_1+2X_2<y)=\int_0^1P(x+2X_2<y)\ dx$$
$$=\int_0^1P\left(X_2<\frac{y-x}2\right)\ dx=\int_0^1F\left(\frac{y-x}2\right)\ dx.$$
Introducing the new variable $u=\frac{y-x}2$ we get $dx=-2\,du$; the limit $x=0$ becomes $u=\frac y2$ and $x=1$ becomes $u=\frac{y-1}{2}$, so after swapping the limits of integration to absorb the minus sign we obtain
$$F_{X_1+2X_2}(y)=2\int^{\frac y2}_{\frac {y-1}{2}}F(u) \ du=\begin{cases}
0&\text{ if }&y<0\\
2\int_0^{\frac y2}u\ du&\text{ if }&0\leq y<1\\
2\int_{\frac {y-1}{2}}^{\frac y2} u\ du&\text{ if }&1\le y\leq2\\
2\int_{\frac {y-1}{2}}^{1} u\ du+2\int_1^{\frac y2}1\ du&\text{ if }&2<y\leq3\\
1&\text{ if }&y>3.
\end{cases}.$$
Finally
$$F_{X_1+2X_2}(y)=\begin{cases}
0&\text{ if }&y<0\\
\frac14 y^2&\text{ if }&0\leq y<1\\
\frac12y-\frac14&\text{ if }&1\le y\leq2\\
-\frac14y^2+\frac32y-\frac54&\text{ if }&2<y\leq3\\
1&\text{ if }&y>3
\end{cases}.$$
The density is:
$$f_{X_1+2X_2}(y)=\begin{cases}
0&\text{ if }&y<0\\
\frac12y&\text{ if }&0\leq y<1\\
\frac12&\text{ if }&1\le y\leq2\\
-\frac12y+\frac32&\text{ if }&2<y\leq3\\
0&\text{ if }&y>3
\end{cases}.$$
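As a sanity check (not part of the derivation), one can compare a Monte Carlo estimate against the piecewise cdf just obtained; a quick sketch in Python, with all names mine:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Samples of X1 + 2*X2 with X1, X2 independent uniform on [0, 1]
y = rng.uniform(0, 1, n) + 2 * rng.uniform(0, 1, n)

def F(t):
    """Piecewise cdf of X1 + 2*X2 derived above."""
    if t < 0:
        return 0.0
    if t < 1:
        return t * t / 4
    if t <= 2:
        return t / 2 - 0.25
    if t <= 3:
        return -t * t / 4 + 1.5 * t - 1.25
    return 1.0

# Compare the empirical cdf with the formula at a few points.
for t in (0.5, 1.0, 1.7, 2.5):
    print(f"t={t}: empirical {np.mean(y <= t):.3f}, formula {F(t):.3f}")
```

With this many samples the empirical and exact values agree to about two decimal places at every test point.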
EDIT
Some may find the solution above overcomplicated, so let's see what the simple textbook solution would be.
The pdf of $X_1$, say $f_{X_1}$, is $1$ over the interval $[0,1]$ and $0$ elsewhere. The pdf of $2X_2$, say $f_{2X_2}$, is $\frac12$ over $[0,2]$ and $0$ outside it.
So, we simply have to compute the following convolution:
$$f_{X_1+2X_2}(y)=\int_{-\infty}^{\infty}f_{X_1}(y-x)f_{2X_2}(x)\ dx.\tag1$$
So far, so straightforward. However, when computing this integral, all the boring technical details come up anyway. I chose the first solution because, in my opinion, the same technicalities emerge there in a way that stays closer to, and is better explained by, the spirit of the original question.
EDIT 2
This is to override my EDIT above.
Let's see how intuitive it can be to compute such a convolution. Consider the following figures.
Obviously, if $y<0$ or $y>3$, then the integral of the product is $0$. (See figures (a) and (e).)
If $0\leq y\leq 1$, then the overlap grows linearly, so the integral of the product equals $\frac12y$. (See figure (b).)
If $1<y\leq 2$, then the integral equals $\frac12$. (See figure (c).)
If, however, $2<y\leq 3$, then the value of the integral decreases with $y$ from $\frac12$ to $0$, so for $2<y\leq 3$ the integral equals $-\frac12y+\frac32$. (See figure (d).)
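The sliding-overlap picture can also be verified numerically. The sketch below (grid step and sample points are my choices) discretizes the two densities and compares a Riemann-sum convolution with the piecewise density above:

```python
import numpy as np

h = 0.001                          # grid step
x = np.arange(0, 3, h)
f1 = np.where(x <= 1, 1.0, 0.0)    # pdf of X1: 1 on [0, 1]
f2 = np.where(x <= 2, 0.5, 0.0)    # pdf of 2*X2: 1/2 on [0, 2]
conv = np.convolve(f1, f2) * h     # Riemann-sum approximation of (1)

def f(t):
    """Piecewise density of X1 + 2*X2 derived above."""
    if 0 <= t < 1:
        return t / 2
    if 1 <= t <= 2:
        return 0.5
    if 2 < t <= 3:
        return -t / 2 + 1.5
    return 0.0

# conv[i] approximates the convolution at t = i*h.
for t in (0.5, 1.5, 2.5):
    print(f"t={t}: convolution {conv[int(round(t / h))]:.3f}, formula {f(t):.3f}")
```

The discretized convolution matches the triangle-plateau-triangle shape to within the grid resolution.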
Now, is this really simpler? Do we not see the danger in pretending that computing a convolution is always this simple?
Let me give a direct proof using characteristic functions. The setting is as follows:
- $(X_n)$ and $(X'_n)$ are i.i.d.
- $\tilde{X}_n = X_n - X'_n$ are symmetrized variables.
- $S_n = X_1 + \cdots + X_n$ and $\tilde{S}_n = \tilde{X}_1 + \cdots + \tilde{X}_n$.
Under this setting, we want to prove the following.
Claim. If the law of $X_1$ is not degenerate, then there exists $\epsilon > 0$ such that
$$ \inf_{n\geq 1} \mathbb{P}\left( |S_n| \geq \epsilon\sqrt{n} \right) > 0. $$
We prove the contrapositive. To this end, assume that $\inf_n \mathbb{P}\left(|S_n|\geq \epsilon \sqrt{n}\right) = 0$ for every $\epsilon > 0$. Then there exists a subsequence $(n_k)$ such that $S_{n_k}/\sqrt{n_k} \to 0$ in probability. This implies that $\tilde{S}_{n_k}/\sqrt{n_k} \to 0$ in probability as well. So, if $\varphi(t) = \mathbb{E}[\cos(t\tilde{X}_1)]$ denotes the characteristic function of $\tilde{X}_1$ (real-valued because $\tilde{X}_1$ is symmetric), then
$$ \varphi\left( \frac{t}{\sqrt{n_k}} \right)^{n_k} = \mathbb{E}[\exp\{\mathrm{i}t \tilde{S}_{n_k}/\sqrt{n_k}\}] \xrightarrow[k\to\infty]{} 1 $$
by the Portmanteau theorem. By taking $\log|\cdot|$, we have $n_k \log\left| \varphi\left( \frac{t}{\sqrt{n_k}} \right) \right| \to 0$. But since
- $\varphi\left( \frac{t}{\sqrt{n_k}} \right) = 1 - 2 \mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right]$ by the double-angle identity, and
- $\mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right] \to 0$ by the dominated convergence theorem,
it follows that
$$ n_k \mathbb{E}\left[ \sin^2\left( \frac{t\tilde{X}_1}{2\sqrt{n_k}}\right) \right] \xrightarrow[k\to\infty]{} 0. $$
Plugging in $t = 2$ and applying the monotone convergence theorem and the squeeze lemma,
$$ \mathbb{E}[\tilde{X}_1^2]
= \lim_{k\to\infty} \mathbb{E}\left[ n_k \sin^2\left( \frac{\tilde{X}_1}{\sqrt{n_k}}\right) \mathbf{1}_{\{ |\tilde{X}_1| \leq \frac{\pi}{2}\sqrt{n_k} \}} \right]
= 0, $$
and therefore $X_1$ is degenerate.
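The claim can also be illustrated numerically (this example is mine, not part of the proof): with $X_i$ uniform on $[-\frac12,\frac12]$ and $\epsilon=0.1$, the CLT suggests $\mathbb{P}(|S_n|\ge\epsilon\sqrt n)$ should stabilize near $2(1-\Phi(\epsilon\sqrt{12}))\approx0.73$ rather than vanish. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
eps, trials = 0.1, 10_000

# Non-degenerate example: X_i uniform on [-1/2, 1/2], so sd = 1/sqrt(12).
probs = {}
for n in (10, 100, 400):
    s = rng.uniform(-0.5, 0.5, (trials, n)).sum(axis=1)  # S_n per trial
    probs[n] = np.mean(np.abs(s) >= eps * np.sqrt(n))
    print(f"n={n}: P(|S_n| >= eps*sqrt(n)) ~ {probs[n]:.3f}")
```

The estimated probability stays roughly constant in $n$, consistent with the infimum being strictly positive.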
Best Answer
Prove by induction the more general result: if $0\le t\le 1$, then $$ P(S_n\le t)=\frac{t^n}{n!}, $$ where $S_n$ denotes the sum $X_1+\cdots+X_n$. The base case $n=1$ is clear. If the claim holds for $n$, then calculate for $0\le t\le 1$, with $f$ the common uniform pdf: $$ P(S_{n+1}\le t)=\int_0^1P(S_n+x\le t)f(x)\,dx\stackrel{(1)}=\int_0^t\frac{(t-x)^n}{n!}\,dx=\frac{t^{n+1}}{(n+1)!}. $$ Note that in $(1)$ the quantity $P(S_n\le t-x)$ is zero when $x>t$.
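The formula $P(S_n\le t)=\frac{t^n}{n!}$ is easy to check by simulation; a minimal sketch (the trial count and the choice $t=0.8$ are mine):

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
trials, t = 500_000, 0.8

emp = {}
for n in (1, 2, 3, 4):
    # S_n = sum of n independent uniforms on [0, 1], per trial
    s = rng.uniform(0, 1, (trials, n)).sum(axis=1)
    emp[n] = np.mean(s <= t)
    print(f"n={n}: empirical {emp[n]:.4f}, t^n/n! = {t**n / factorial(n):.4f}")
```

The empirical probabilities match $t^n/n!$ to three decimal places or better at this sample size.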