Solved – Test for significance of correlation matrix

correlation, statistical significance

If you want to test whether the correlations in a correlation matrix are jointly statistically significant, you can perform a likelihood ratio test of the hypothesis that the correlation matrix equals the identity matrix.

The ratio of the restricted to the unrestricted likelihood is $\alpha = |R|^{N/2}$, where $|R|$ is the determinant of the correlation matrix and $N$ is the sample size (Morrison, 1967).

The test statistic is therefore $-2\log(\alpha) = -N\log|R|$, which is asymptotically distributed as $\chi^2$ with $\frac{1}{2}p(p-1)$ df, where $p$ is the number of variables.

My question is: how do you compute the observed value of $-2\log(\alpha)$?

Best Answer

Do you mean something like this? In R:

R <- matrix(c(1.0, 0.1, 0.1,
              0.1, 1.0, 0.1,
              0.1, 0.1, 1.0), nrow = 3)  # example correlation matrix
N <- 100                                 # sample size

chi <- -2 * log(det(R)^(N/2))            # test statistic -2*log(alpha)
df  <- nrow(R) * (nrow(R) - 1) / 2       # p(p-1)/2 degrees of freedom
p   <- 1 - pchisq(chi, df)               # p-value
chi
p

Or in Excel, where the matrix is in cells C26:E28 and N is 100 (note the parentheses around the exponent, so the determinant is raised to the power N/2):

=-2*LN(MDETERM(C26:E28)^(100/2))

Then, with that result in cell D31:

=CHIDIST(D31,3)

You can also use the sem package (here passing the model via specifyModel's text argument rather than typing it interactively):

require(sem)
rownames(R) <- colnames(R) <- c("a", "b", "c")
mySem <- specifyModel(text = "
  a <-> a, va, NA
  b <-> b, vb, NA
  c <-> c, vc, NA
")

semFit <- sem(mySem, S = R, N = 100)
summary(semFit)

(sem gives a very slightly different answer, because it multiplies by N-1).
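To see how small that difference is, here is a minimal sketch (reusing the example 3×3 matrix and N = 100 from above) comparing the two scalings directly; both are just a constant times $-\log|R|$:

```r
# Compare the N scaling of the statistic with the (N - 1) scaling
# that sem effectively uses (example matrix and N assumed from above).
R <- matrix(c(1.0, 0.1, 0.1,
              0.1, 1.0, 0.1,
              0.1, 0.1, 1.0), nrow = 3)
N <- 100

chi_N   <- -N * log(det(R))        # equals -2*log(|R|^(N/2))
chi_Nm1 <- -(N - 1) * log(det(R))  # the scaling sem uses

chi_N    # about 2.84
chi_Nm1  # about 2.81
```

The two statistics differ only by the factor (N − 1)/N, so for moderate samples the p-values are nearly identical.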