If $X$ and $Y$ are any random variables, and $a$ and $b$ are constants, then
$$E(aX+bY)=aE(X)+bE(Y).$$
If $X$ and $Y$ are independent random variables, and $a$ and $b$ are constants, then
$$\text{Var}(aX+bY)=a^2\text{Var}(X)+b^2\text{Var}(Y).$$
In our case we have $a=b=1$, but the general expressions may be useful to you later.
So you don't need to find the distribution of the random variable $X_1+X_2$; all you need are the formulas for the mean and variance of an exponential random variable $T$ with parameter $\lambda$. These are $\dfrac{1}{\lambda}$ and $\dfrac{1}{\lambda^2}$, respectively. Thus your mean and variance are respectively
$$\frac{1}{\lambda_1}+\frac{1}{\lambda_2}\quad\text{and}\quad \frac{1}{\lambda_1^2}+\frac{1}{\lambda_2^2}.$$
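As a sanity check, here is a quick Monte Carlo sketch (the rates $\lambda_1=2$, $\lambda_2=5$ are arbitrary values chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 2.0, 5.0   # arbitrary rates, chosen only for illustration
n = 10**6

# numpy's exponential takes the scale parameter 1/lambda, not the rate
s = rng.exponential(1/lam1, n) + rng.exponential(1/lam2, n)

print(s.mean(), 1/lam1 + 1/lam2)         # both close to 0.7
print(s.var(),  1/lam1**2 + 1/lam2**2)   # both close to 0.29
```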
Edit: The question was changed. It turns out that $\lambda_1=\lambda_2=\lambda$. That is just a special case of the above formulas: the mean is $\dfrac{2}{\lambda}$ and the variance is $\dfrac{2}{\lambda^2}$.
It is not clear whether you were asking also about how to compute the individual means and variances, or whether you already know these.
We need $E(T)$ and $E(T^2)$, since $\text{Var}(T)=E(T^2)-(E(T))^2$.
The expectations can be found as usual by integration. There are some shortcuts, but they tend to involve more advanced notions. In case you are interested, let me mention the term moment generating function.
Added: We compute the moment generating function $m(s)$ of our random variable $T$ (sorry about $s$, the traditional $t$ is already taken). So we want
$$E(e^{Ts})=\int_0^\infty e^{ts}\lambda e^{-\lambda t}\,dt=\int_0^\infty \lambda e^{-(\lambda-s)t}\,dt.$$
For $s<\lambda$ the integral converges and is easy to evaluate directly. We get
$$m(s)=\frac{\lambda}{\lambda-s}=\frac{1}{1-\frac{s}{\lambda}}.$$
So $m(s)$ has a very nice Taylor expansion $m(s)=1+\frac{1}{\lambda}s+\frac{1}{\lambda^2}s^2+\cdots$. Since in general $m(s)=\sum_{k\ge 0}\frac{E(T^k)}{k!}s^k$, we can pick up $E(T^k)$ for any $k$ by comparing coefficients. In particular, $E(T)=\frac{1}{\lambda}$ and $E(T^2)=\frac{2!}{\lambda^2}$, so $\text{Var}(T)=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2}$, as claimed.
We can even get the moment generating function of $X_1+X_2$, since the mgf of a sum of independent random variables is the product of their mgfs.
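If you want to verify these moment computations symbolically, here is a minimal sketch using sympy (the variable names are my own):

```python
from sympy import symbols, series
from sympy.stats import Exponential, E, variance, moment

lam = symbols('lambda', positive=True)
T = Exponential('T', lam)   # exponential random variable with rate lambda

print(E(T))          # 1/lambda
print(moment(T, 2))  # 2/lambda**2
print(variance(T))   # lambda**(-2)

# The mgf found above, m(s) = lambda/(lambda - s), expanded around s = 0;
# the coefficient of s**k is E(T**k)/k!
s = symbols('s')
print(series(lam/(lam - s), s, 0, 4))
# 1 + s/lambda + s**2/lambda**2 + s**3/lambda**3 + O(s**4)
```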
Show that $P\Bigl(\limsup\limits_{n\to\infty}X_n/\log n=1/\lambda\Bigr)=1$.
(...) $\limsup\limits_{n\to\infty}X_n$ means that $X_n$ happens infinitely often as $n$ tends to infinity, so $\limsup\limits_{n\to\infty}X_n=\bigcap_{n\in\mathbb N}\bigcup_{m>n}X_m$.
This is confusing the limsup of events with the limsup of random variables. Here, each $X_n$ is a random variable, hence $$\limsup\limits_{n\to\infty}X_n/\log n$$ is the random variable $Y$ such that, for every $\omega$ in $\Omega$,
$$
Y(\omega)=\limsup\limits_{n\to\infty}X_n(\omega)/\log n.
$$
And the task is to prove that the event $[Y=1/\lambda]$ has probability $1$.
Note that the question most probably does not faithfully reproduce the text of the exercise. For example,
$$
\text{lim(n→∞)supXn/logn}
$$
should read
$$
\limsup\limits_{n\to\infty}X_n/\log n,
$$
where $\limsup\limits_{n\to\infty}$ acts as a single operation, or, equivalently,
$$
\limsup\limits_{n\to\infty}\frac{X_n}{\log n}.
$$
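No finite simulation can prove an almost-sure statement, but a quick experiment makes the constant $1/\lambda$ plausible: for i.i.d. exponential $X_n$ the running maximum satisfies $\max_{k\le n}X_k/\log n\to 1/\lambda$ almost surely, which is closely related to the limsup claim. A sketch, with an arbitrary rate chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0          # arbitrary rate, so the limit should be 1/lam = 0.5
n = 10**6

x = rng.exponential(1/lam, n)
running_max = np.maximum.accumulate(x)

for k in (10**3, 10**4, 10**5, 10**6):
    print(k, running_max[k-1] / np.log(k))   # hovers near 0.5
```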
Ok, here goes.
Assuming $X_1$ and $X_2$ are independent exponential random variables with rate $\lambda$, and $Y=X_1X_2$, we have, by conditioning on $X_2=x$,
\begin{multline} P(Y\leq y)= P(X_1X_2\leq y) = \int_0^\infty \lambda P(X_1\leq y/x)\, e^{-\lambda x}\, dx = \lambda \int_0^\infty (1-e^{-\lambda y/x})\, e^{-\lambda x}\, dx =\\ 1- \lambda \int_0^\infty e^{-\lambda (x+y/x)}\, dx. \end{multline}
The density is the derivative of that with respect to $y$, i.e. \begin{equation} \lambda^2 \int_0^\infty \frac{1}{x}\, e^{-\lambda (x+y/x)}\, dx, \end{equation} for all $y>0$.
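For a numerical cross-check (the helper names and parameter values below are mine): the remaining integral is also known in closed form as $2K_0(2\lambda\sqrt{y})$, where $K_0$ is the modified Bessel function of the second kind, so the density equals $2\lambda^2 K_0(2\lambda\sqrt{y})$. A sketch comparing the integral, the Bessel form, and a Monte Carlo estimate:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import k0

lam, y = 1.5, 0.8   # arbitrary rate and test point

# the density as the integral derived above
f_int = lam**2 * quad(lambda x: np.exp(-lam*(x + y/x)) / x, 0, np.inf)[0]

# the same density via the Bessel closed form 2*lam**2*K0(2*lam*sqrt(y))
f_bessel = 2 * lam**2 * k0(2 * lam * np.sqrt(y))

# Monte Carlo density estimate near y
rng = np.random.default_rng(2)
prod = rng.exponential(1/lam, 10**6) * rng.exponential(1/lam, 10**6)
h = 0.01
f_mc = np.mean(np.abs(prod - y) < h) / (2*h)

print(f_int, f_bessel, f_mc)   # all three should roughly agree
```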