A clarification of your question (there seem to me to be two related, but different, parts): you are looking for (1) distribution of a sum of $n$ independent squared $t_{\alpha }$ random variables, and
(2) the sampling distribution of the variance (or the related standard deviation) of a random sample drawn from a $t_{\alpha }$ distribution (presumably your reason for asking about (1)).
Distribution of Sum of Independent Squared $t_{\alpha }$ Variables
If $T_i\sim t_{\alpha }$ are independent random $t$ variables with $\alpha$ d.f., then it is false that $\sum _{i=1}^n T_i^2\sim F(n,\alpha )$ (which is what you seem to be claiming in your second "possible solution"). This is easily verified by comparing first moments: for $\alpha>2$, $E\left[\sum_{i=1}^n T_i^2\right]=n\alpha/(\alpha-2)$, which is $n$ times the mean $\alpha/(\alpha-2)$ of an $F(n,\alpha)$ distribution.
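This moment comparison is quick to check numerically; a minimal pure-Python sketch, using the standard formula $E[F(d_1,d_2)]=d_2/(d_2-2)$ (the values of $n$ and $\alpha$ are illustrative):

```python
# Compare the first moment of sum_{i=1}^n T_i^2 with that of F(n, alpha).
# Since T_i^2 ~ F(1, alpha), E[T_i^2] = alpha/(alpha-2) for alpha > 2.

def f_mean(d1, d2):
    """Mean of an F(d1, d2) distribution (finite for d2 > 2)."""
    return d2 / (d2 - 2)

n, alpha = 10, 6
mean_sum = n * f_mean(1, alpha)  # E[sum T_i^2] = n * alpha/(alpha-2)
mean_f = f_mean(n, alpha)        # E[F(n, alpha)] = alpha/(alpha-2)

print(mean_sum, mean_f)  # the ratio is exactly n, so the distributions differ
```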
The claim in your first "possible solution" is correct: $T_i^2\sim F(1,\alpha)$. Rather than resorting to characteristic functions, I think this result is more transparent when considering the characterisation of the $t$ distribution as the distribution of the ratio $\frac{Z}{\sqrt{U/\alpha}}$, where $Z$ is a standard normal variable and $U$ is a chi-squared variable with $\alpha$ degrees of freedom, independent of $Z$. The square of this ratio is then the ratio of two independent chi-squared variables, each scaled by its degrees of freedom, i.e. $\frac{V/1}{U/\alpha}$ with $V=Z^2$, which is a standard characterisation of an $F(1,\alpha)$ distribution (with numerator d.f. equal to 1 and denominator d.f. equal to $\alpha$).
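This characterisation is also easy to confirm by simulation: build $T$ from its defining ratio, square it, and compare against $F(1,\alpha)$. A sketch using numpy and scipy (sample sizes and $\alpha$ are illustrative):

```python
import numpy as np
from scipy import stats

alpha = 5
rng = np.random.default_rng(0)

# Build T = Z / sqrt(U/alpha) from its defining ratio, then square it.
z = rng.standard_normal(20000)
u = rng.chisquare(alpha, 20000)
t_squared = (z / np.sqrt(u / alpha)) ** 2

# Kolmogorov-Smirnov comparison of T^2 against F(1, alpha): the KS
# statistic should be small if the two distributions agree.
res = stats.kstest(t_squared, "f", args=(1, alpha))
print(res.statistic)
```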
Considering the note I made on first moments in the first paragraph above, it might seem that a better claim would be $\sum _{i=1}^n T_i^2\sim n F(n,\alpha )$ [I have slightly abused notation here by using the same expression for the distribution and for a random variable having that distribution]. Whilst the first moments match, the second central moments do not (for $\alpha>4$ the variance of the first expression is smaller than that of the second), so this claim is false too. [That being said, it is interesting to observe that $n F(n,\alpha)\to \chi _n^2$ in distribution as $\alpha \to \infty$, which is the result we expect when summing squared (standard) normal variates.]
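The variance comparison can be made explicit with the formula $\mathrm{Var}[F(d_1,d_2)] = \frac{2 d_2^2 (d_1+d_2-2)}{d_1 (d_2-2)^2 (d_2-4)}$, finite for $d_2>4$; a small pure-Python check (illustrative $n$ and $\alpha$):

```python
def f_var(d1, d2):
    """Variance of an F(d1, d2) distribution (finite for d2 > 4)."""
    return 2 * d2**2 * (d1 + d2 - 2) / (d1 * (d2 - 2) ** 2 * (d2 - 4))

n, alpha = 10, 6

# Var[sum T_i^2] = n * Var[F(1, alpha)] by independence, whereas
# Var[n F(n, alpha)] = n^2 * Var[F(n, alpha)].
var_sum = n * f_var(1, alpha)
var_nf = n**2 * f_var(n, alpha)

print(var_sum, var_nf)  # var_sum < var_nf whenever n > 1 and alpha > 4
```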
Sampling Distribution of Variance When Sampling from a $t_{\alpha }$ Distribution
Considering what I have written above, the expression you obtain for "the density of the standard deviation of n-sample T variables" is incorrect. However, even if $F(n,\alpha)$ were the correct distribution, the standard deviation is not simply the square root of the sum of squares (as you seem to have used to arrive at your $g(u)$ density). You would instead be looking for the (scaled) sampling distribution of $\sum _{i=1}^n \left(T_i-\bar{T}\right)^2=\sum _{i=1}^n T_i^2-n \bar{T}^2$. In the normal case, the LHS of this expression can be rewritten as a sum of squared normal variables (the term inside the square is a linear combination of normal variables, which is again normally distributed), which leads to the familiar $\chi^2$ distribution. Unfortunately, a linear combination of $t$ variables (even with the same degrees of freedom) is not itself $t$ distributed, so a similar approach cannot be exploited.
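The algebraic identity $\sum_{i=1}^n (T_i-\bar{T})^2 = \sum_{i=1}^n T_i^2 - n\bar{T}^2$ itself holds for any sample, which is easy to confirm numerically (the sample here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.standard_t(df=3, size=25)  # any sample will do; t draws for flavour

n = len(t)
lhs = np.sum((t - t.mean()) ** 2)   # sum of squared deviations from the mean
rhs = np.sum(t**2) - n * t.mean() ** 2  # sum of squares minus n * mean^2
print(lhs, rhs)  # agree up to floating-point error
```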
Perhaps you should reconsider what it is you wish to demonstrate? It may be possible to achieve the objective using simulations, for example. However, you indicate an example with $\alpha=3$, a situation in which only the first moment of $F(1,\alpha)$ is finite, so simulation won't help with such moment calculations.
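If simulation is an option, here is a minimal Monte Carlo sketch of the sampling distribution of the sample variance, deliberately using $\alpha=6$ rather than $\alpha=3$ so that the population variance $\alpha/(\alpha-2)$ is comfortably finite (all sizes are illustrative):

```python
import numpy as np

alpha, n, reps = 6, 20, 10000
rng = np.random.default_rng(42)

# reps independent samples of size n from a t distribution with alpha d.f.
samples = rng.standard_t(df=alpha, size=(reps, n))
sample_vars = samples.var(axis=1, ddof=1)  # unbiased sample variances

# E[s^2] equals the population variance alpha/(alpha-2) = 1.5 here;
# sample_vars approximates the full sampling distribution of s^2.
print(sample_vars.mean())
```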
Best Answer
I have derived the answer using Mathematica; the calculation results in a pdf for your transformed variable of the form:
$$\frac{2^{1-\frac{\nu}{2}}\, e^{-\frac{\nu}{2x^2}}\, x^{-1-\nu}\, \nu^{\nu/2}}{\Gamma(\nu/2)}$$
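One way to sanity-check this density (its form matches that of $\sqrt{\nu/U}$ with $U\sim\chi^2_\nu$, an assumption worth verifying against the original transformation) is to confirm that it integrates to one; a sketch using scipy:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gammaln

nu = 4.0

def pdf(x, nu):
    # The quoted density: 2^{1-nu/2} e^{-nu/(2x^2)} x^{-1-nu} nu^{nu/2} / Gamma(nu/2),
    # evaluated in log space for numerical stability.
    log_c = (1 - nu / 2) * np.log(2) + (nu / 2) * np.log(nu) - gammaln(nu / 2)
    return np.exp(log_c - nu / (2 * x**2) + (-1 - nu) * np.log(x))

total, _ = quad(pdf, 0, np.inf, args=(nu,))
print(total)  # should be very close to 1
```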
I am not sure if this represents a particular named distribution, but hope that knowing the pdf may help in some way. (For what it's worth, it is the density of $\sqrt{\nu/U}$ with $U\sim\chi^2_\nu$, i.e. the square root of a scaled inverse chi-squared variable.)
Update: the inverse cdf for this distribution is:

$$\sqrt{\frac{\nu}{2\, \operatorname{InverseGammaRegularized}\left(\nu/2,\, x\right)}}$$
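The quantile function can be checked against the cdf $P(X\le x) = Q\!\left(\nu/2,\ \nu/(2x^2)\right)$, where $Q$ is the regularized upper incomplete gamma function (Mathematica's GammaRegularized). A sketch using scipy's equivalents, `gammaincc` and `gammainccinv` (note the square root, which the pdf above implies):

```python
import numpy as np
from scipy.special import gammaincc, gammainccinv

nu = 4.0

def cdf(x, nu):
    # P(X <= x) = P(U >= nu/x^2) for U ~ chi^2_nu, i.e. the regularized
    # upper incomplete gamma Q(nu/2, nu/(2 x^2)).
    return gammaincc(nu / 2, nu / (2 * x**2))

def quantile(p, nu):
    # Invert the cdf: nu/(2 x^2) = Q^{-1}(nu/2, p), then solve for x.
    return np.sqrt(nu / (2 * gammainccinv(nu / 2, p)))

p = 0.3
x = quantile(p, nu)
print(cdf(x, nu))  # recovers p = 0.3
```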
Best,
Ben