Simplify the term in the integral to
$T=e^{-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)} y^{k/2-2} $
find the polynomial $p(y)$ such that
$[p(y)e^{-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)}]'=p'(y)e^{-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)} + p(y) [-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)]' e^{-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)} = T$
which reduces to finding $p(y)$ such that
$p'(y) + p(y) [-\frac{1}{2}((\frac{\frac{z}{y}-\mu_x}{\sigma_x} )^2 -y)]' = y^{k/2-2}$
or
$p'(y) -\frac{1}{2} p(y) \left(\frac{2 z \mu_x }{\sigma_x^2} y^{-2} - \frac{2 z^2}{\sigma_x^2} y^{-3} -1\right)= y^{k/2-2}$
which can be done by evaluating all powers of $y$ separately
Edit after comments
The above solution won't work, as the integral diverges.
Yet, some others have worked on this type of product.
Using the Fourier transform:
Schoenecker, Steven, and Tod Luginbuhl. "Characteristic Functions of the Product of Two Gaussian Random Variables and the Product of a Gaussian and a Gamma Random Variable." IEEE Signal Processing Letters 23.5 (2016): 644-647.
http://ieeexplore.ieee.org/document/7425177/#full-text-section
For the product $Z=XY$ with $X \sim \mathcal{N}(0,1)$ and $Y \sim \Gamma(\alpha,\beta)$ they obtained the characteristic function:
$\varphi_{Z}(t) = \frac{1}{\beta^\alpha }\vert t \vert^{-\alpha} \exp \left( \frac{1}{4\beta^2t^2} \right) D_{-\alpha} \left( \frac{1}{\beta \vert t \vert } \right)$
with $D_{-\alpha}$ the parabolic cylinder function in Whittaker's notation ( http://people.math.sfu.ca/~cbm/aands/page_686.htm )
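This characteristic function can be checked numerically. A minimal sketch (not from the paper): since $X\sim\mathcal{N}(0,1)$, conditioning on $Y$ gives $\varphi_Z(t)=E[e^{-t^2Y^2/2}]$, which can be computed by quadrature and compared with the closed form. It assumes the shape–scale parametrization $Y\sim\Gamma(\alpha,\text{scale}=\beta)$, which is what makes the formula as written come out; `scipy.special.pbdv` supplies $D_v$.

```python
import numpy as np
from scipy.special import pbdv
from scipy.integrate import quad
from scipy.stats import gamma as gamma_dist

def cf_product(t, alpha, beta):
    """CF of Z = X*Y, X ~ N(0,1), Y ~ Gamma(shape=alpha, scale=beta),
    per the Schoenecker & Luginbuhl formula; pbdv returns (D_v(x), D_v'(x))."""
    at = abs(t)
    d, _ = pbdv(-alpha, 1.0 / (beta * at))
    return beta**(-alpha) * at**(-alpha) * np.exp(1.0 / (4.0 * beta**2 * t**2)) * d

def cf_numeric(t, alpha, beta):
    """Same CF by direct integration: E[e^{itXY}] = E[exp(-t^2 Y^2 / 2)]."""
    integrand = lambda y: np.exp(-0.5 * t**2 * y**2) * gamma_dist.pdf(y, a=alpha, scale=beta)
    val, _ = quad(integrand, 0, np.inf)
    return val

alpha, beta, t = 2.0, 1.5, 0.7
print(cf_product(t, alpha, beta), cf_numeric(t, alpha, beta))  # the two should agree closely
```

If the two values disagreed, the most likely culprit would be the rate-versus-scale convention for $\beta$.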
Using the Mellin transform:
Springer and Thompson have described more generally the evaluation of products of beta-, gamma- and Gaussian-distributed random variables.
Springer, M. D., and W. E. Thompson. "The distribution of products of beta, gamma and Gaussian random variables." SIAM Journal on Applied Mathematics 18.4 (1970): 721-737.
http://epubs.siam.org/doi/10.1137/0118065
They use the Mellin integral transform. The Mellin transform of $Z$ is the product of the Mellin transforms of $X$ and $Y$ (see http://epubs.siam.org/doi/10.1137/0118065 or https://projecteuclid.org/euclid.aoms/1177730201). In the studied cases of products the reverse transform of this product can be expressed as a Meijer G-function for which they also provide and prove computational methods.
They did not analyze the product of a Gaussian and a gamma-distributed variable, although you might be able to use the same techniques. If I try to do this quickly, then I believe it should be possible to obtain an H-function (https://en.wikipedia.org/wiki/Fox_H-function ), although I do not directly see the possibility to get a G-function or make other simplifications.
With $M\lbrace f_Y(x) \vert s \rbrace = 2^{s-1} \Gamma(\tfrac{1}{2}k+s-1)/\Gamma(\tfrac{1}{2}k)$
and
$M\lbrace f_X(x) \vert s \rbrace = \frac{1}{\sqrt{\pi}}2^{(s-1)/2} \sigma^{s-1} \Gamma(s/2) $ (taking $f_X$ as the density of $\vert X \vert$, since the Mellin transform integrates over $(0,\infty)$)
you get
$M\lbrace f_Z(x) \vert s \rbrace = \frac{1}{\sqrt{\pi}}2^{\frac{3}{2}(s-1)} \sigma^{s-1} \Gamma(s/2) \Gamma(\tfrac{1}{2}k+s-1)/\Gamma(\tfrac{1}{2}k) $
and the distribution of $Z$ is:
$f_Z(y) = \frac{1}{2 \pi i} \int_{c-i \infty}^{c+i \infty} y^{-s} M\lbrace f_Z(x) \vert s \rbrace ds $
which looks to me (after a change of variables to eliminate the $2^{\frac{3}{2}(s-1)}$ term) like at least an H-function
What is still left is the puzzle of expressing this inverse Mellin transform as a G-function. The occurrence of both $s$ and $s/2$ complicates this. In the separate case of a product of only Gaussian-distributed variables, the $s/2$ could be transformed into $s$ by substituting $x=w^2$. But because of the terms of the chi-squared distribution this no longer works. Maybe this is why nobody has provided a solution for this case.
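Since the Mellin transform of a density on $(0,\infty)$ at $s$ is just the moment $E[Z^{s-1}]$, the transforms above can be sanity-checked against elementary moments. A small sketch (my own check, using the $1/\sqrt{\pi}$ normalization for the half-normal transform), with only the standard library:

```python
from math import gamma, sqrt, pi

def mellin_halfnormal(s, sigma):
    # M{f_|X|}(s) = E[|X|^(s-1)] for X ~ N(0, sigma^2)
    return (1.0 / sqrt(pi)) * 2.0**((s - 1) / 2) * sigma**(s - 1) * gamma(s / 2)

def mellin_chi2(s, k):
    # M{f_Y}(s) = E[Y^(s-1)] for Y ~ chi^2_k
    return 2.0**(s - 1) * gamma(k / 2 + s - 1) / gamma(k / 2)

sigma, k = 1.7, 5
# s = 1 recovers the normalization (densities integrate to 1):
assert abs(mellin_halfnormal(1, sigma) - 1.0) < 1e-12
# s = 3 gives second moments: E[X^2] = sigma^2, E[Y^2] = k^2 + 2k
assert abs(mellin_halfnormal(3, sigma) - sigma**2) < 1e-12
assert abs(mellin_chi2(3, k) - (k**2 + 2 * k)) < 1e-12
# Independence gives the product rule for Z = |X| Y: E[Z^2] = E[X^2] E[Y^2]
assert abs(mellin_halfnormal(3, sigma) * mellin_chi2(3, k)
           - sigma**2 * (k**2 + 2 * k)) < 1e-9
print("Mellin transforms consistent with known moments")
```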
Some background
The $\chi^2_n$ distribution is defined as the distribution that results from summing the squares of $n$ independent random variables $\mathcal{N}(0,1)$, so:
$$\text{If }X_1,\ldots,X_n\sim\mathcal{N}(0,1)\text{ and are independent, then }Y_1=\sum_{i=1}^nX_i^2\sim \chi^2_n,$$
where $X\sim Y$ denotes that the random variables $X$ and $Y$ have the same distribution (EDIT: $\chi_n^2$ will denote both a chi-squared distribution with $n$ degrees of freedom and a random variable with that distribution). Now, the pdf of the $\chi^2_n$ distribution is
$$
f_{\chi^2}(x;n)=\frac{1}{2^\frac{n}{2}\Gamma\left(\frac{n}{2}\right)}x^{\frac{n}{2}-1}e^{-\frac{x}{2}},\quad \text{for } x\geq0\text{ (and $0$ otherwise).}
$$
So, indeed the $\chi^2_n$ distribution is a particular case of the $\Gamma(p,a)$ distribution with pdf
$$
f_\Gamma(x;a,p)=\frac{1}{a^p\Gamma(p)}x^{p-1}e^{-\frac{x}{a}},\quad \text{for } x\geq0\text{ (and $0$ otherwise).}
$$
Now it is clear that $\chi_n^2\sim\Gamma\left(\frac{n}{2},2\right)$.
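The identification $\chi_n^2\sim\Gamma\left(\frac{n}{2},2\right)$ can be checked pointwise by coding both pdfs above directly (a minimal sketch, standard library only):

```python
from math import gamma, exp

def chi2_pdf(x, n):
    # f_chi2(x; n) as given above, for x >= 0
    return x**(n / 2 - 1) * exp(-x / 2) / (2**(n / 2) * gamma(n / 2))

def gamma_pdf(x, p, a):
    # f_Gamma(x; a, p) as given above: shape p, scale a, for x >= 0
    return x**(p - 1) * exp(-x / a) / (a**p * gamma(p))

# chi^2_n equals Gamma(p = n/2, a = 2) pointwise:
for n in (1, 2, 3, 8):
    for x in (0.3, 1.0, 4.5):
        assert abs(chi2_pdf(x, n) - gamma_pdf(x, n / 2, 2)) < 1e-12
print("chi^2_n pdf matches Gamma(n/2, 2) pdf")
```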
Your case
The difference in your case is that you have normal variables $X_i$ with common variances $\sigma^2\neq1$. But a similar distribution arises in that case:
$$Y_2=\sum_{i=1}^nX_i^2=\sigma^2\sum_{i=1}^n\left(\frac{X_i}{\sigma}\right)^2\sim\sigma^2\chi_n^2,$$
so $Y$ follows the distribution resulting from multiplying a $\chi_n^2$ random variable with $\sigma^2$. This is easily obtained with a transformation of random variables ($Y_2=\sigma^2Y_1$):
$$
f_{\sigma^2\chi^2}(x;n)=f_{\chi^2}\left(\frac{x}{\sigma^2};n\right)\frac{1}{\sigma^2}.
$$
Note that this is the same as saying that $Y_2\sim\Gamma\left(\frac{n}{2},2\sigma^2\right)$ since $\sigma^2$ can be absorbed by the Gamma's $a$ parameter.
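The absorption of $\sigma^2$ into the scale parameter can likewise be verified numerically: $f_{\sigma^2\chi^2}(x;n)=f_{\chi^2}(x/\sigma^2;n)/\sigma^2$ should coincide with the $\Gamma\left(\frac{n}{2},2\sigma^2\right)$ density. A short sketch with illustrative values:

```python
from math import gamma, exp

def chi2_pdf(x, n):
    return x**(n / 2 - 1) * exp(-x / 2) / (2**(n / 2) * gamma(n / 2))

def gamma_pdf(x, p, a):  # shape p, scale a
    return x**(p - 1) * exp(-x / a) / (a**p * gamma(p))

sigma2 = 2.5  # sigma^2, an arbitrary test value
for n in (2, 4, 7):
    for x in (0.5, 3.0, 10.0):
        scaled = chi2_pdf(x / sigma2, n) / sigma2  # f_{sigma^2 chi^2}(x; n)
        assert abs(scaled - gamma_pdf(x, n / 2, 2 * sigma2)) < 1e-12
print("sigma^2 * chi^2_n pdf matches Gamma(n/2, 2*sigma^2) pdf")
```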
Note
If you want to derive the pdf of the $\chi^2_n$ from scratch (which also applies to the situation with $\sigma^2\neq1$ under minor changes), you can follow the first step here for the $\chi_1^2$ case using the standard transformation of random variables. Then, you may either follow the next steps or shorten the proof by relying on the convolution properties of the Gamma distribution and its relationship with the $\chi^2_n$ described above.
Best Answer
There is a mistake in $$F_Y(y) = \frac{1}{\sigma^2}\int_{-\infty}^y\sigma^2f_B(u)\,\text{d}u$$ as it should be $$F_Y(y) = \frac{1}{\sigma^2}\int_{-\infty}^{y/\sigma^2}\sigma^2f_B(u)\,\text{d}u$$ resulting in $$F_Y(y)=F_B(y/\sigma^2)$$
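The corrected identity $F_Y(y)=F_B(y/\sigma^2)$ for $Y=\sigma^2 B$, $B\sim\chi^2_k$, can be confirmed with `scipy.stats.chi2`, which exposes the scaled variable directly through its `scale` argument (illustrative values for $k$ and $\sigma^2$):

```python
from scipy.stats import chi2

k, sigma2 = 4, 2.5  # degrees of freedom and sigma^2 (illustrative values)
for y in (0.5, 2.0, 8.0):
    # F_Y(y) for Y = sigma^2 * B, B ~ chi^2_k, written two equivalent ways:
    lhs = chi2.cdf(y / sigma2, k)        # F_B(y / sigma^2)
    rhs = chi2.cdf(y, k, scale=sigma2)   # CDF of the scaled variable directly
    assert abs(lhs - rhs) < 1e-12
print("F_Y(y) = F_B(y / sigma^2) verified")
```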