If $f(x)\,dx$ is a probability distribution with expected value $0$ and variance $1$, and the distribution of $X_i$ is
$$f\left(\dfrac{x-\mu_i}{\sigma_i}\right)\cdot\dfrac{dx}{\sigma_i},$$
and $X_i$ are independent, then certainly the distribution of $X_1+\cdots+X_n$ has expected value $\mu_1+\cdots+\mu_n$ and variance $\sigma_1^2+\cdots+\sigma_n^2$. Also, the higher cumulants would add together in the same way. (The fourth cumulant, for example, is $\mathbb E((X-\mu)^4) - 3(\mathbb E((X-\mu)^2))^2$, and the coefficient $3$ is the only number that makes this functional additive in the sense that the fourth cumulant of a sum of independent random variables is the sum of their fourth cumulants.)
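As a quick numerical sanity check of that additivity, here is a short Python sketch (the two distributions are arbitrary choices, picked only because their fourth cumulants are easy to estimate):

```python
import numpy as np

rng = np.random.default_rng(0)

def fourth_cumulant(x):
    """Sample fourth cumulant: E[(X-mu)^4] - 3*(E[(X-mu)^2])^2."""
    c = x - x.mean()
    return np.mean(c**4) - 3 * np.mean(c**2)**2

n = 10**6
x = rng.exponential(scale=2.0, size=n)  # arbitrary choice of distribution
y = rng.uniform(-1.0, 1.0, size=n)      # independent of x

# Additivity: kappa_4(X + Y) should match kappa_4(X) + kappa_4(Y),
# up to sampling error.
print(fourth_cumulant(x) + fourth_cumulant(y))
print(fourth_cumulant(x + y))
```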
We've tacitly assumed $\sigma_i<\infty$. I think that if $\sum_{i=1}^\infty\sigma_i^2=\infty$, then as $n$ grows, the distribution of the suitably standardized sum would approach a normal distribution. (The appropriate generalization of the central limit theorem here is the Lindeberg–Feller theorem, whose hypothesis asks for somewhat more than divergence of $\sum\sigma_i^2$, so I won't state it precisely.) But what happens for small $n$ is another question, and the answer depends on what the function $f$ is.
I said above that $f(x)\,dx$ has expectation $0$ and variance $1$. But one can also have perfectly good location-scale families in which the expectation, and a fortiori the variance, do not exist. The best-known case is the Cauchy distribution. The simplest result there is that if $X_1,\dots,X_n$ are i.i.d. Cauchy, then $(X_1+\cdots+X_n)/n$ has exactly the same Cauchy distribution as $X_1$; averaging does not make the distribution narrower at all. So a lot depends on which function $f$ is.
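A quick numerical illustration of that non-narrowing, as a sketch: the interquartile range of the sample mean of standard Cauchy draws is the same for every $n$ (for a standard Cauchy the quartiles are $\pm 1$, so the IQR is $2$).

```python
import numpy as np

rng = np.random.default_rng(1)

# IQR of the sample mean of n standard Cauchy variables,
# estimated from 10,000 replications for each n.
for n in (1, 10, 1000):
    means = rng.standard_cauchy(size=(10_000, n)).mean(axis=1)
    q1, q3 = np.percentile(means, [25, 75])
    print(n, round(q3 - q1, 3))  # ~2.0 for every n
```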
First, $U = X/\theta \sim Norm(0,1)$ and $V = Y/\theta \sim Norm(0,1).$ Then $U^2$ and $V^2$ are independently distributed as $Chisq(df=1).$ Moreover,
$$U^2 + V^2 = (X^2 + Y^2)/\theta^2 = T/\theta^2 \sim Chisq(df = 2).$$
Following the comment by @sinbadh, please see Wikipedia or your text (on the chi-squared family of distributions) for proofs of two facts used above, both of which can be done using moment generating functions:
(a) If $Z \sim Norm(0,1),$ then $Z^2 \sim Chisq(df=1).$
(b) If $Q_m \sim Chisq(df=m)$ and $Q_n \sim Chisq(df=n)$ are independent,
then $Q_m + Q_n \sim Chisq(df= m+n).$
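A small simulation sketch of the result above (the scale $\theta = 2$ is an arbitrary choice), comparing sample quantiles of $T/\theta^2$ with $Chisq(df=2)$ quantiles:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
theta = 2.0  # arbitrary scale for the illustration

x = rng.normal(0.0, theta, size=100_000)
y = rng.normal(0.0, theta, size=100_000)
t = (x**2 + y**2) / theta**2  # should be Chisq(df=2)

# Compare a few sample quantiles with the chi-squared(2) quantiles.
for p in (0.25, 0.5, 0.75, 0.95):
    print(p, round(np.quantile(t, p), 3), round(stats.chi2.ppf(p, df=2), 3))
```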
A result related to your second question is that if
$X_1, X_2, \dots, X_n$ are a random sample from $Norm(\mu, \sigma^2),$ where $\mu$ is known, then the MLE of $\sigma^2$ is $\frac{1}{n}\sum_{i=1}^n (X_i-\mu)^2,$ and the MLE of $\sigma$ is
$\sqrt{\frac{1}{n}\sum_{i=1}^n (X_i-\mu)^2}.$ The derivation of the MLE is routine.
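For completeness, here is a sketch of that routine derivation. With $\mu$ known, the log-likelihood as a function of $\sigma^2$ is
$$\ell(\sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (X_i-\mu)^2,$$
and setting
$$\frac{\partial\ell}{\partial(\sigma^2)} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^n (X_i-\mu)^2 = 0$$
gives $\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^n (X_i-\mu)^2.$ The MLE of $\sigma$ then follows from the invariance property of maximum likelihood estimators.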
For a contrast between this and the case where $\mu$ is unknown,
please see this related page.
Best Answer
You want to know the distribution of
$$ Z = \sum_{i=1}^n X_iY_i, $$
where $X_i$ and $Y_i$ are all normal and independent but not identically distributed. Let's calculate the mean and variance of $Z$:
$$ \mathrm E[Z] = \sum_{i=1}^n \mathrm E[X_iY_i] = \sum_{i=1}^n \mathrm E[X_i]\mathrm E[Y_i]. $$
You know the individual means of the normals, so the above sum can be computed. Now for the variance: since all the variables are independent, the products $X_iY_i$ are independent across $i$, so the covariance terms drop out and
$$ \mathrm{Var}[Z] = \sum_{i=1}^n \mathrm{Var}[X_iY_i] = \sum_{i=1}^n \left( \mathrm E[Y_i]^2\,\mathrm{Var}[X_i]+\mathrm E[X_i]^2\,\mathrm{Var}[Y_i]+\mathrm{Var}[X_i]\,\mathrm{Var}[Y_i] \right). $$
Again, this is all in terms of quantities you know, the means and variances of the underlying distributions.
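For reference, the product-variance formula used above is a one-line consequence of independence: using $\mathrm E[(XY)^2] = \mathrm E[X^2]\,\mathrm E[Y^2]$ and $\mathrm E[X^2] = \mathrm{Var}[X] + \mathrm E[X]^2,$
$$\mathrm{Var}[XY] = \mathrm E[X^2]\,\mathrm E[Y^2] - \mathrm E[X]^2\,\mathrm E[Y]^2 = \mathrm E[Y]^2\,\mathrm{Var}[X] + \mathrm E[X]^2\,\mathrm{Var}[Y] + \mathrm{Var}[X]\,\mathrm{Var}[Y].$$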
So, IF the CLT applies, your answer is that approximately $Z\sim\mathcal{N}(\mathrm{E}[Z],\mathrm{Var}[Z])$. Rigorously proving that this holds for your particular case and your particular distributions is up to you, but my claim is that it does. I pastebinned some numerical calculations to support my claim.
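Along those lines, a minimal simulation sketch of the kind of numerical check described (the means and standard deviations below are arbitrary made-up values):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 200  # number of terms in the sum
mu_x, mu_y = rng.uniform(-2, 2, n), rng.uniform(-2, 2, n)    # arbitrary means
sd_x, sd_y = rng.uniform(0.5, 2, n), rng.uniform(0.5, 2, n)  # arbitrary sds

# Theoretical mean and variance of Z = sum_i X_i Y_i from the formulas above.
mean_z = np.sum(mu_x * mu_y)
var_z = np.sum(mu_y**2 * sd_x**2 + mu_x**2 * sd_y**2 + sd_x**2 * sd_y**2)

# Simulate Z many times and compare.
reps = 20_000
x = rng.normal(mu_x, sd_x, size=(reps, n))
y = rng.normal(mu_y, sd_y, size=(reps, n))
z = (x * y).sum(axis=1)

print(mean_z, z.mean())  # should agree closely
print(var_z, z.var())    # should agree closely
# A histogram of (z - mean_z) / np.sqrt(var_z) is close to standard normal.
```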