A random variable $X\sim T_n$ has the following density function:
$\qquad f_{t_n}(x)=\frac{1}{\sqrt{n}}\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{1}{2})\Gamma(\frac{n}{2})}\left(1+\frac{x^2}{n}\right)^{-\frac{n+1}{2}}$
1) Using the definition of $\mathrm{E}[X]$, we get
$\qquad \mathrm{E}[X]=\frac{1}{\sqrt{n}}\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{1}{2})\Gamma(\frac{n}{2})}\int_{-\infty}^{+\infty}x(1+\frac{x^2}{n})^{-\frac{n+1}{2}}dx$
First, notice that $\mathrm{E}[X]$ is undefined for $0<n\le1$, so assume $n>1$ from here on.
Now, if we look at the integrand, we see that the function $f(x)=x(1+\frac{x^2}{n})^{-\frac{n+1}{2}}$ is odd, so the integral equals $0$. This gives us the expected value of our r.v. $X$:
$\qquad \mathrm{E}[X]=0$
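As a quick numerical sanity check (not part of the derivation), we can write out the density with `scipy.special.gamma` and integrate $x\,f_{t_n}(x)$ with `scipy.integrate.quad` for a few values of $n>1$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def t_pdf(x, n):
    """Student's t density with n degrees of freedom, as written above."""
    c = gamma((n + 1) / 2) / (np.sqrt(n) * gamma(0.5) * gamma(n / 2))
    return c * (1 + x**2 / n) ** (-(n + 1) / 2)

for n in (3, 5, 30):
    mean, _ = quad(lambda x: x * t_pdf(x, n), -np.inf, np.inf)
    print(n, mean)  # vanishes, since the integrand is odd
```

The integral comes back numerically zero for every tested $n$, as the symmetry argument predicts.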
The variance is a little trickier. Using what we've shown about $\mathrm{E}[X]$, we have $\mathrm{Var}[X]=\mathrm{E}[X^2]$ (which converges only for $n>2$):
$\qquad \mathrm{Var}[X]=\frac{1}{\sqrt{n}}\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{1}{2})\Gamma(\frac{n}{2})}\int_{-\infty}^{+\infty}x^2(1+\frac{x^2}{n})^{-\frac{n+1}{2}}dx$
If we do the change of variable $y=(1+\frac{x^2}{n})^{-1}$ we get:
$\qquad \mathrm{Var}[X]=n\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{1}{2})\Gamma(\frac{n}{2})}\int_0^1y^{\frac{n}{2}-2}(1-y)^{\frac{1}{2}}\,dy$
Now recall the Beta function identity, which expresses this integral through the gamma function:
$\qquad \int_0^1x^{p-1}(1-x)^{q-1}\,dx=\frac{\Gamma(p)\Gamma(q)}{\Gamma(p+q)}$
In our case $p=\frac{n}{2}-1$ and $q=\frac{3}{2}$. Simplifying the gammas using $\Gamma(\frac{3}{2})=\frac{1}{2}\Gamma(\frac{1}{2})$ and $\Gamma(\frac{n}{2})=(\frac{n}{2}-1)\Gamma(\frac{n}{2}-1)$, we get:
$\qquad \mathrm{Var}[X]=\frac{n}{n-2}$
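To double-check the closed form, compare it with `scipy.stats.t`, which exposes the variance directly (scipy's `df` parameter is our $n$):

```python
import math
from scipy.stats import t

for df in (3, 5, 10, 30):
    assert math.isclose(t(df).var(), df / (df - 2), rel_tol=1e-12)
print("variance matches n/(n-2) for all tested df")
```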
2) Here, you probably meant $X^2\sim F_{1,p}$. That follows almost immediately from the definitions of the two distributions.
$\qquad T_p=\frac{Z}{\sqrt{\frac{1}{p}\sum_{i=1}^p Y_i^2}}$ $\qquad (1)$
where $Z\sim N(0,1)$ and $Y_i\sim N(0,1)$ for all $i=1,\dots,p$, all independent. Just square that expression and you'll get an $F_{1,p}$ random variable.
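The squared relation can be checked numerically: if $T\sim t_p$, then $P(T^2\le x)=2F_{t_p}(\sqrt x)-1$ by symmetry, and this should coincide with the $F_{1,p}$ cdf (the choice $p=7$ below is just illustrative):

```python
import numpy as np
from scipy.stats import t, f

p = 7                                  # illustrative degrees of freedom
xs = np.linspace(0.1, 10, 50)
lhs = 2 * t(p).cdf(np.sqrt(xs)) - 1    # P(T^2 <= x) via symmetry of t_p
rhs = f(1, p).cdf(xs)                  # F(1, p) cdf
assert np.allclose(lhs, rhs)
print("T^2 and F(1, p) agree on the tested grid")
```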
3) The result you want to prove uses the Strong Law of Large Numbers. Review it and notice that if
$\qquad Y=\sum_{i=1}^pY_i^2$,
then $\frac{Y}{p}\to\mathrm{E}[Y_1^2]=1$ almost surely as $p\to\infty$. Substitute in $(1)$ and you get the result.
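A small simulation of construction $(1)$ illustrates this (the seed and sample sizes below are arbitrary): by the SLLN the denominator concentrates at $1$, so the ratio behaves like a standard normal for large $p$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, m = 10_000, 100_000
z = rng.standard_normal(m)
# (1/p) * sum of p squared standard normals is chi2_p / p, which -> 1 by the SLLN
denom = rng.chisquare(p, size=m) / p
t_sample = z / np.sqrt(denom)
print(np.abs(denom - 1).max())          # the denominator stays close to 1
print(t_sample.mean(), t_sample.std())  # close to the N(0,1) moments 0 and 1
```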
First of all, the pdf of $X$ should be of the form
$$f(x)=\frac{1}{\sigma}\cdot\frac{\exp\left(-\frac{x-\mu}{\sigma}\right)}{\left[1+\exp\left(-\frac{x-\mu}{\sigma}\right)\right]^2}\,,\quad x\in\mathbb R$$
where $\mu$ is real and $\sigma$ is positive.
Normalising the pdf by taking $Y=(X-\mu)/\sigma$, we get the standard logistic pdf $$g(y)=\frac{e^{-y}}{\left(1+e^{-y}\right)^2}\,,\quad y\in\mathbb R$$
Now,
\begin{align}E(Y)&=\int_{\mathbb R}\frac{ye^{-y}}{\left(1+e^{-y}\right)^2}\,dy
\\&=\int_0^1\ln\left(\frac{z}{1-z}\right)\,dz\qquad\left[y\mapsto z\text{ such that } z=\frac{1}{1+e^{-y}}\right]
\\&=\int_0^1\ln z\,dz-\int_0^1\ln(1-z)\,dz
\\&=\int_0^1\ln z\,dz-\int_0^1\ln z\,dz\qquad\left[\text{ using }\int_0^af(x)\,dx=\int_0^af(a-x)\,dx\right]
\\&=0
\end{align}
Hence, $$\qquad E(X)=\mu$$
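The two pieces of the substituted integral can be checked with `scipy.integrate.quad` (each equals $-1$, so their difference is $0$):

```python
import numpy as np
from scipy.integrate import quad

i1, _ = quad(np.log, 0, 1)                   # integral of ln z over (0,1) is -1
i2, _ = quad(lambda z: np.log(1 - z), 0, 1)  # same value by the reflection z -> 1-z
print(i1, i2, i1 - i2)
```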
For the variance, see this post. It derives the result $$E(Y^2)=\operatorname{Var}(Y)=\frac{\pi^2}{3}$$
So we get $$\operatorname{Var}(X)=\frac{\pi^2\sigma^2}{3}$$
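These moments match scipy's logistic distribution, whose `loc`/`scale` parameters correspond to our $\mu$/$\sigma$ (the parameter values below are arbitrary):

```python
import math
from scipy.stats import logistic

mu, sigma = 1.5, 2.0      # arbitrary illustrative parameters
X = logistic(loc=mu, scale=sigma)
assert math.isclose(X.mean(), mu)
assert math.isclose(X.var(), math.pi**2 * sigma**2 / 3)
print("E(X) =", X.mean(), " Var(X) =", X.var())
```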
We can also derive the moments using the moment generating function, but the calculation is slightly more involved.
Best Answer
Let $u=x-\mu$, then we have $x=u+\mu$ and $dx=du$. By linearity we have $\operatorname{E}[X]=\operatorname{E}[U+\mu]=\operatorname{E}[U]+\mu$, hence
\begin{align} \operatorname{E}[X] &=\frac1{2\sigma}\int_{-\infty}^\infty u\ e^{-\Large\frac{|u|}{\sigma}}\ du+\mu\\ &=\frac1{2\sigma}\color{blue}{\underbrace{\color{black}{\int_{-\infty}^0 u\ e^{\Large\frac{u}{\sigma}}\ du}}_{\color{red}{\text{set}\ u=-u}}}+\frac1{2\sigma}\int_{0}^\infty u\ e^{-\Large\frac{u}{\sigma}}\ du+\mu\\ &=-\frac1{2\sigma}\int_{0}^\infty u\ e^{-\Large\frac{u}{\sigma}}\ du+\frac1{2\sigma}\int_{0}^\infty u\ e^{-\Large\frac{u}{\sigma}}\ du+\mu\\ &=\large\color{blue}{\mu} \end{align}
and
\begin{align} \operatorname{E}\left[X^2\right]&=\operatorname{E}\left[(U+\mu)^2\right]\\ &=\operatorname{E}\left[U^2\right]+2\mu\operatorname{E}\left[U\right]+\mu^2\\ &=\operatorname{E}\left[U^2\right]+2\mu\operatorname{E}\left[X-\mu\right]+\mu^2\\ &=\frac1{2\sigma}\int_{-\infty}^\infty u^2 e^{-\Large\frac{|u|}{\sigma}}\ du+\mu^2\\ &=\frac1{2\sigma}\color{blue}{\underbrace{\color{black}{\int_{-\infty}^0 u^2 e^{\Large\frac{u}{\sigma}}\ du}}_{\color{red}{\text{set}\ u=-u}}}+\frac1{2\sigma}\int_{0}^\infty u^2 e^{-\Large\frac{u}{\sigma}}\ du+\mu^2\\ &=\frac1{2\sigma}\int_{0}^\infty u^2 e^{-\Large\frac{u}{\sigma}}\ du+\frac1{2\sigma}\int_{0}^\infty u^2 e^{-\Large\frac{u}{\sigma}}\ du+\mu^2\\ &=\frac1{\sigma}\color{blue}{\underbrace{\color{black}{\int_{0}^\infty u^2 e^{-\Large\frac{u}{\sigma}}\ du}}_{\color{red}{\text{set}\ v=\frac{u}{\sigma}}}}+\mu^2\\ &=\sigma^2\int_{0}^\infty v^2 e^{-v}\ dv+\mu^2\\ &=2\sigma^2+\mu^2, \end{align}
where
$$ \Gamma(n+1)=\int_{0}^\infty v^{n} e^{-v}\ dv=n!\qquad\text{for $n$ a natural number}. $$
Thus
\begin{align} \operatorname{Var}[X]&=\operatorname{E}\left[X^2\right]-\left(\operatorname{E}[X]\right)^2=\large\color{blue}{2\sigma^2}. \end{align}
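A quick check against `scipy.stats.laplace`, whose `scale` parameter is this answer's $\sigma$ (the parameter values below are arbitrary):

```python
import math
from scipy.stats import laplace

mu, sigma = -0.7, 1.3     # arbitrary illustrative parameters
X = laplace(loc=mu, scale=sigma)
assert math.isclose(X.mean(), mu)
assert math.isclose(X.var(), 2 * sigma**2)
assert math.isclose(X.moment(2), 2 * sigma**2 + mu**2, rel_tol=1e-6)
print("E[X] =", X.mean(), " E[X^2] =", X.moment(2), " Var[X] =", X.var())
```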