Solved – Generalized Chi-Squared Distribution PDF

chi-squared-distribution, distributions, multivariate-normal-distribution

Let $\mathbf{X} \sim \mathcal{N}_n( \mathbf{m}, \mathbf{C})$ be an $n$-dimensional Gaussian vector, where $\mathbf{C} \in \mathbb{R}^{n \times n}$ is not diagonal but is positive definite, $\mathbf{C} \succ 0$, and $\mathbf{m} \neq \mathbf{0}$. Let
$$
Y = \| \mathbf{X} \|^2
$$
where $\| \cdot \|$ denotes the $L_2$ (Euclidean) norm. Then $Y$ should follow a generalized chi-squared distribution. Unfortunately, the Wikipedia page contains little information.

My question is: What is the PDF (probability density function) of $Y$ ?

Best Answer

Your question is really a special case of https://math.stackexchange.com/questions/442472/sum-of-squares-of-dependent-gaussian-random-variables/442916#442916 (with $A=I$).

But nevertheless: $X$ is multivariate normal ($n$ components) with expectation $m$ and positive definite covariance matrix $C$. We are interested in the distribution of $\| X\|^2 = X^T X$.

Define $W = C^{-1/2}(X-m)$. Then $W$ is multivariate normal with expectation zero and covariance matrix $I$, the identity matrix, and $X = C^{1/2}W + m$, so after some algebra
$$
X^T X = (W + C^{-1/2}m)^T C\, (W + C^{-1/2}m).
$$
Use the spectral theorem to write $C = P^T \Lambda P$, where $P$ is an orthogonal matrix (so $P^T P = P P^T = I$) and $\Lambda$ is a diagonal matrix with positive diagonal elements $\lambda_1, \dotsc, \lambda_n$. Write $U = PW$; then $U$ is also multivariate normal with mean zero and identity covariance matrix. With some more algebra we find that
$$
X^T X = (U+b)^T \Lambda (U+b) = \sum_{j=1}^n \lambda_j (U_j + b_j)^2,
$$
where $b = \Lambda^{-1/2} P m$, so $X^T X$ is a linear combination of independent noncentral chi-square variables, each with one degree of freedom and noncentrality $b_j^2$. Except in special cases it is hard to find a closed-form expression for its density function (for instance, if all the $\lambda_j$ are equal, it is a constant times a noncentral chi-square). For some ideas that can be used, in particular the saddlepoint approximation, see the posts Generic sum of Gamma random variables, How does saddlepoint approximation work? and, for the needed moment generating functions, What is the moment generating function of the generalized (multivariate) chi-square distribution?
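As a quick sanity check of this representation (not part of the original answer), here is a minimal Python sketch: it draws $X \sim \mathcal{N}(m, C)$ for an arbitrary non-diagonal positive-definite $C$, computes the $\lambda_j$ and $b = \Lambda^{-1/2} P m$ from the eigendecomposition of $C$, and compares the simulated distribution of $\|X\|^2$ with that of $\sum_j \lambda_j (U_j + b_j)^2$. The parameter values are made up for illustration.

```python
# Monte Carlo check of ||X||^2 = sum_j lambda_j * (U_j + b_j)^2 in distribution.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example parameters (any m != 0 and C > 0 would do).
m = np.array([1.0, -0.5, 2.0])
A = rng.normal(size=(3, 3))
C = A @ A.T + 0.5 * np.eye(3)          # positive definite, not diagonal

# Spectral decomposition C = P^T Lambda P (rows of P are eigenvectors of C).
lam, V = np.linalg.eigh(C)              # C = V @ diag(lam) @ V.T
P = V.T
b = np.diag(lam**-0.5) @ P @ m          # b = Lambda^{-1/2} P m

n_sim = 200_000

# Direct simulation of Y = ||X||^2 with X ~ N(m, C).
X = rng.multivariate_normal(m, C, size=n_sim)
Y_direct = np.sum(X**2, axis=1)

# Simulation via the representation sum_j lam_j * (U_j + b_j)^2, U_j iid N(0,1).
U = rng.standard_normal(size=(n_sim, 3))
Y_repr = np.sum(lam * (U + b)**2, axis=1)

# The two samples should agree in distribution (compare a few quantiles).
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(Y_direct, qs))
print(np.quantile(Y_repr, qs))
```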

There is a book-length treatment by Mathai and Provost https://books.google.no/books/about/Quadratic_Forms_in_Random_Variables.html?id=tFOqQgAACAAJ&redir_esc=y about quadratic forms in random variables. It gives a lot of different approximations, typically series expansions. There are also some exact (very complicated) results, but only for some special cases. I would go for the saddlepoint approximation! (I will try to come back and post some examples here, but not tonight ...)
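To give an idea of what the saddlepoint approximation could look like here (a rough sketch under my own assumptions, not the book's recipe): the cumulant generating function of $Y = \sum_j \lambda_j \chi^2_1(b_j^2)$ is $K(t) = \sum_j \big[ \lambda_j b_j^2 t/(1-2\lambda_j t) - \tfrac12 \log(1-2\lambda_j t) \big]$ for $t < 1/(2\max_j \lambda_j)$, and the first-order saddlepoint density at $y$ is $\exp(K(\hat s) - \hat s y)/\sqrt{2\pi K''(\hat s)}$, where $\hat s$ solves $K'(\hat s) = y$. The `lam` and `b` arrays are assumed to come from the previous sketch, and `scipy.optimize.brentq` is used to solve for the saddlepoint.

```python
# Sketch of a first-order saddlepoint density approximation for
# Y = sum_j lam_j * chi^2_1(delta_j) with delta_j = b_j**2 and all lam_j > 0.
import numpy as np
from scipy.optimize import brentq

def cgf(t, lam, delta):
    """Cumulant generating function K(t) of sum_j lam_j * chi^2_1(delta_j)."""
    u = 1.0 - 2.0 * lam * t
    return np.sum(delta * lam * t / u - 0.5 * np.log(u))

def cgf1(t, lam, delta):
    """First derivative K'(t)."""
    u = 1.0 - 2.0 * lam * t
    return np.sum(delta * lam / u**2 + lam / u)

def cgf2(t, lam, delta):
    """Second derivative K''(t)."""
    u = 1.0 - 2.0 * lam * t
    return np.sum(4.0 * delta * lam**2 / u**3 + 2.0 * lam**2 / u**2)

def saddlepoint_pdf(y, lam, b):
    """Approximate density of Y at y > 0 (first-order saddlepoint formula)."""
    delta = b**2
    t_max = 1.0 / (2.0 * np.max(lam))          # K(t) is defined for t < t_max
    # Solve K'(s) = y; K' is increasing from ~0 (t -> -inf) to +inf (t -> t_max).
    s = brentq(lambda t: cgf1(t, lam, delta) - y, -1e6, t_max - 1e-8)
    return np.exp(cgf(s, lam, delta) - s * y) / np.sqrt(2.0 * np.pi * cgf2(s, lam, delta))

# Example use, with lam and b from the Monte Carlo sketch above:
# ys = np.linspace(0.5, 30.0, 100)
# pdf = [saddlepoint_pdf(y, lam, b) for y in ys]
```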

There is also an R package https://CRAN.R-project.org/package=CompQuadForm with some approximations.
