# Sum of squares of dependent Gaussian random variables

Ok, so the chi-squared distribution with $n$ degrees of freedom is the sum of the squares of $n$ independent standard Gaussian random variables.

The trouble is, my Gaussian random variables are not independent. They do, however, all have zero mean and the same variance. Their covariance matrix is not diagonal (again, because they aren't independent), but all the elements along the diagonal are equal to each other because the variances are equal; in fact, the covariance matrix is a symmetric Toeplitz matrix. I'm not saying that property is important to the solution, if there is one, but if it's necessary to get anywhere, by all means use it. Is there some way to decompose this sum of squares of Gaussian random variables into, say, a sum of chi-squared random variables and possibly Gaussian random variables? In other words, I can't just square them all, add them together, and call the result a chi-squared distribution, because a chi-squared distribution is a sum of squares of independent Gaussians, and mine aren't independent.

I know how to find a linear transformation of the Gaussian random variables that yields $n$ independent Gaussians, but that's no help, because those aren't the things being squared, you see.

Let's assume you have $X=(X_1, \dots, X_n)$, a random vector with a multivariate normal distribution with expectation vector $\mu$ and covariance matrix $\Sigma$. We are interested in the quadratic form $Q(X) = X^T A X = \sum_i \sum_j a_{ij} X_i X_j$. Define $Y = \Sigma^{-1/2} X$, where we are assuming $\Sigma$ is invertible. Write also $Z = Y - \Sigma^{-1/2} \mu$, which will have expectation zero and covariance matrix the identity.
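As a concrete sketch of the whitening step $Z = \Sigma^{-1/2}(X - \mu)$, here is a numerical check in Python. The covariance matrix below is a hypothetical choice: an AR(1)-style symmetric Toeplitz matrix with unit variances, matching the setting described in the question.

```python
import numpy as np

# Hypothetical example: a symmetric Toeplitz covariance matrix with equal
# variances, as in the question (the 0.5 decay rate is an arbitrary choice).
n = 5
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Sigma^{-1/2} via the eigendecomposition (Sigma is symmetric positive definite).
w, V = np.linalg.eigh(Sigma)
Sigma_inv_half = V @ np.diag(w ** -0.5) @ V.T

# Whitening check: Sigma^{-1/2} Sigma Sigma^{-1/2} should be the identity,
# so Z = Sigma^{-1/2} (X - mu) has identity covariance.
print(np.allclose(Sigma_inv_half @ Sigma @ Sigma_inv_half, np.eye(n)))  # True
```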

Now
$$Q(X) = X^T A X = (Z+\Sigma^{-1/2} \mu)^T \Sigma^{1/2} A \Sigma^{1/2} (Z+\Sigma^{-1/2} \mu).$$
Use the spectral theorem now and write $$\Sigma^{1/2} A \Sigma^{1/2} = P^T \Lambda P$$
where $P$ is an orthogonal matrix (so that $P P^T = P^T P = I$) and $\Lambda$ is diagonal with diagonal elements $\lambda_1, \dotsc, \lambda_n$ (these are nonnegative when $A$ is positive semidefinite, as it is for a sum of squares). Write $U = P Z$, so that $U$ is multivariate normal with identity covariance matrix and expectation zero.
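This decomposition can be sketched with NumPy, again assuming $A = I$ (a plain sum of squares, as in the question) and the same hypothetical Toeplitz covariance. One notational caveat: `np.linalg.eigh` returns a factorization $M = Q \Lambda Q^T$ with orthogonal $Q$, so $P = Q^T$ in the notation above.

```python
import numpy as np

# Hypothetical setup: A = I (a plain sum of squares) and a small
# symmetric Toeplitz covariance Sigma, as in the question.
n = 5
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
A = np.eye(n)

# Sigma^{1/2} via the eigendecomposition of Sigma.
w, V = np.linalg.eigh(Sigma)
Sigma_half = V @ np.diag(np.sqrt(w)) @ V.T

M = Sigma_half @ A @ Sigma_half  # the matrix to diagonalize

# eigh returns M = Q diag(lam) Q^T with Q orthogonal, so P = Q^T here.
lam, Q = np.linalg.eigh(M)
P = Q.T

# Check the factorization M = P^T Lambda P and the orthogonality of P.
print(np.allclose(P.T @ np.diag(lam) @ P, M))  # True
print(np.allclose(P @ P.T, np.eye(n)))         # True
```

With $A = I$, the matrix being diagonalized is just $\Sigma$, so the $\lambda_j$ are the eigenvalues of the covariance matrix itself.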

We can compute
$$Q(X) = (Z+\Sigma^{-1/2} \mu)^T \Sigma^{1/2} A \Sigma^{1/2} (Z+\Sigma^{-1/2} \mu) \\ = (Z+\Sigma^{-1/2} \mu)^T P^T \Lambda P (Z+\Sigma^{-1/2} \mu) \\ = (PZ+P\Sigma^{-1/2} \mu)^T \Lambda (PZ+P\Sigma^{-1/2} \mu) \\ = (U+b)^T \Lambda (U+b)$$
where now
$$b = P \Sigma^{-1/2} \mu.$$ So:
$$Q(X) = X^T A X = \sum_{j=1}^n \lambda_j (U_j+b_j)^2$$
In your case, $\mu = 0$, so $b = 0$, and your quadratic form is a linear combination of independent chi-square variables, each with one degree of freedom. In the general case, we get a linear combination of independent noncentral chi-square variables.
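Note that $Q(X) = \sum_j \lambda_j U_j^2$ (with $\mu = 0$) holds as an exact algebraic identity for each draw, not merely in distribution, so it can be verified sample by sample. A sketch of that check, assuming $A = I$ and the same hypothetical Toeplitz covariance as before:

```python
import numpy as np

# Per-sample check of Q(X) = sum_j lambda_j U_j^2 when mu = 0.
# Hypothetical setup: A = I and an AR(1)-style Toeplitz covariance.
rng = np.random.default_rng(0)
n = 5
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
A = np.eye(n)

w, V = np.linalg.eigh(Sigma)
Sigma_half = V @ np.diag(np.sqrt(w)) @ V.T
Sigma_inv_half = V @ np.diag(w ** -0.5) @ V.T

lam, Q = np.linalg.eigh(Sigma_half @ A @ Sigma_half)
P = Q.T

X = rng.multivariate_normal(np.zeros(n), Sigma, size=1000)  # rows are draws
quad_direct = np.einsum('ij,jk,ik->i', X, A, X)             # X^T A X per draw

U = X @ Sigma_inv_half @ P.T                                # U = P Sigma^{-1/2} X
quad_spectral = (U ** 2) @ lam                              # sum_j lam_j U_j^2

print(np.allclose(quad_direct, quad_spectral))  # True
```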

If you want to work numerically with that distribution, there is a CRAN package (that is, a package for R) implementing it, called CompQuadForm.

If you want (much) more detail, there is a book dedicated to the topic, Mathai & Provost: “Quadratic forms in random variables”.