For the case of elliptically contoured distributions (of which the Gaussian is a special case), the distribution of the norm of Mv is available in the literature; see, for example, [1] and [2]:
[1] M. Rangaswamy, D. D. Weiner, and A. Öztürk, "Non-Gaussian Random Vector Identification Using Spherically Invariant Random Processes," IEEE Transactions on Aerospace and Electronic Systems, 29(1), 111-124.
[2] M. Rangaswamy, D. Weiner, and A. Öztürk, "Computer Generation of Correlated Non-Gaussian Radar Clutter," IEEE Transactions on Aerospace and Electronic Systems, 31(1), 106-116.
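As an aside, the core idea behind the spherically invariant random process construction in [2] can be sketched very compactly: scale a correlated Gaussian vector by a random "texture" variable. This is only a minimal illustration, not the papers' exact algorithm; the choices of `nu`, `Sig`, and the chi-square texture (which yields a multivariate t) are my own assumptions.

```r
# Minimal SIRP-style sketch (assumed parameters, not the papers' algorithm):
# an elliptically contoured vector = Gaussian vector * random texture scale.
set.seed(1)
n   <- 100000
nu  <- 5                                        # degrees of freedom (assumption)
Sig <- matrix(c(1, 0.5, 0.5, 1), nrow = 2)      # target shape matrix (assumption)
Z   <- matrix(rnorm(2 * n), ncol = 2) %*% chol(Sig)  # N(0, Sig) samples
s   <- sqrt(nu / rchisq(n, df = nu))            # texture: inverse-chi scale
X   <- Z * s                                    # multivariate t, elliptically contoured
cov(X) * (nu - 2) / nu                          # should be close to Sig
```

For the multivariate t the covariance is $\frac{\nu}{\nu-2}\boldsymbol{\Sigma}$, hence the rescaling in the last line.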
Both the spectral and the Cholesky approach below seem to work quite well. They give slightly different estimates of the covariance matrix of the generated series, but since both factorizations are exact square roots of $\boldsymbol{\Sigma}$, the small discrepancy is more plausibly Monte Carlo sampling variability than rounding error.
Below is some R code which generates samples from $N(0, \boldsymbol{\Sigma})$ using each decomposition.
n <- 10000000                              # number of samples
X <- cbind(rnorm(n), rnorm(n))             # i.i.d. standard normal pairs
sigma <- matrix(c(0.666, -0.333, -0.333, 0.666), nrow=2)  # target covariance (symmetric)
spectral <- eigen(sigma)                   # sigma = V %*% diag(lambda) %*% t(V)
# Spectral square root: V %*% sqrt(diag(lambda))
X.spectral <- t(spectral$vectors %*% sqrt(diag(spectral$values)) %*% t(X))
# Cholesky square root: t(chol(sigma)) is lower triangular L with L %*% t(L) = sigma
X.cholesky <- t(t(chol(sigma)) %*% t(X))
cov(X.spectral)                            # both should be close to sigma
cov(X.cholesky)
So with my 10,000,000 samples, the estimated covariance matrices are
> cov(X.spectral)
           [,1]       [,2]
[1,]  0.6660626 -0.3331138
[2,] -0.3331138  0.6658130
> cov(X.cholesky)
           [,1]       [,2]
[1,]  0.6660344 -0.3328923
[2,] -0.3328923  0.6656198
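As a quick sanity check (my addition, not part of the original run), both factorizations reproduce sigma exactly at the matrix level, which supports attributing the small differences above to sampling noise rather than the algebra:

```r
# Verify that both square roots S satisfy S %*% t(S) = sigma to machine precision.
sigma <- matrix(c(0.666, -0.333, -0.333, 0.666), nrow = 2)
e <- eigen(sigma)
S.spec <- e$vectors %*% diag(sqrt(e$values))   # spectral square root
S.chol <- t(chol(sigma))                       # lower-triangular Cholesky root
max(abs(S.spec %*% t(S.spec) - sigma))         # ~ machine epsilon
max(abs(S.chol %*% t(S.chol) - sigma))         # ~ machine epsilon
```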
Best Answer
You have $$X^T X=X^T AX+X^T(I-A)X$$
By the Fisher-Cochran theorem, a necessary and sufficient condition for $X^T AX$ and $X^T(I-A)X$ to be independent chi-square variables is $p=\operatorname{rank}(A)+\operatorname{rank}(I-A)$. This holds here: since $A$ is idempotent, so is $I-A$, and $\operatorname{rank}(A)=\operatorname{tr}(A)=k$ while $\operatorname{rank}(I-A)=\operatorname{tr}(I-A)=p-k$.
So $X^T AX\sim\chi^2_k$ is independent of $X^T(I-A)X\sim \chi^2_{p-k}$.
It follows from a well-known result (also mentioned on Wikipedia) that if $U\sim\chi^2_a$ and $V\sim\chi^2_b$ are independent, then $U/(U+V)\sim\text{Beta}(a/2,b/2)$; hence $$\frac{X^TAX}{X^T AX+X^T(I-A)X}\sim\text{Beta}\left(\frac{k}{2},\frac{p-k}{2}\right)$$
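This can be checked numerically. Below is a short R sketch under my own assumed setup: $A$ is the rank-$k$ orthogonal projection onto the first $k$ coordinates (hence idempotent), with $p$, $k$, and the sample size chosen arbitrarily for illustration.

```r
# Numerical check of X'AX / X'X ~ Beta(k/2, (p-k)/2) for idempotent A.
set.seed(1)
p <- 5; k <- 2; n <- 50000                 # dimensions and sample size (assumptions)
A <- diag(c(rep(1, k), rep(0, p - k)))     # projection onto first k coordinates
X <- matrix(rnorm(n * p), ncol = p)        # rows are N(0, I_p) vectors
num <- rowSums((X %*% A) * X)              # X' A X for each row
den <- rowSums(X * X)                      # X' X for each row
ratio <- num / den
c(mean(ratio), k / p)                      # empirical mean vs Beta mean k/p
```

The Beta$(k/2,(p-k)/2)$ mean is $\frac{k/2}{p/2}=k/p$, so the two printed values should agree closely.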