My outcome variable is called PB, and it is a binary variable: 0 or 1.
I used a generalized linear mixed model (GLMM) because each subject ($n = 238$) was measured twice (476 observations in total).
Using a GLMM with a logit link, I built an empty model with random intercepts. This is how I did it in R:
FirstModel <- glmer(PB ~ 1 + (1 | userId),
                    data = DATA_LONG3,
                    family = binomial)
However, when I ran it, I got these warnings:
Warning messages:
1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge with max|grad| = 0.0852968 (tol = 0.002, component 1)
2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model is nearly unidentifiable: very large eigenvalue
- Rescale variables?
To give some context:
- Overall there are 476 observations ($n = 476$): 62 of them have PB = 1, and 414 have PB = 0.
- In the first measurement ($n = 238$), 23 have PB = 1 and 215 have PB = 0.
- In the second measurement ($n = 238$), 39 have PB = 1 and 199 have PB = 0.
Can someone tell me how serious this problem is for my results, and perhaps suggest a solution?
Best Answer
To obtain convergence (which, as @ShawnHemelstrand's answer points out, is not your only issue), increase the default value of `nAGQ = 1` to at least 8 to get a relatively stable result.

You gave almost everything necessary to duplicate your data for use with the model given. One just needs to find the number of `userId`s whose (first, second) response pattern is 00, 01, 10, or 11. Those counts are labeled `c00`, `c01`, `c10`, and `c11`. We find that `179 <= c00 <= 199`, and the rest of the counts are then determined by the marginal totals. It turns out that `c00 = 189` and `c00 = 190` result in the same error you had. Using `c00 = 189` we have
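A minimal sketch of that reconstruction and refit, assuming the marginal counts from the question (23 and 39 subjects with PB = 1 at the first and second measurements) and picking `c00 = 189`; apart from `DATA_LONG3`, `PB`, `userId`, and `FirstModel`, the variable names are my own:

```r
n    <- 238   # subjects
one1 <- 23    # subjects with PB = 1 at the first measurement
one2 <- 39    # subjects with PB = 1 at the second measurement
c00  <- 189   # chosen value within the feasible range

# The remaining pattern counts follow from the margins:
c11 <- one1 + one2 - (n - c00)  # PB = 1 at both measurements
c10 <- one1 - c11               # PB = 1 only at the first
c01 <- one2 - c11               # PB = 1 only at the second
stopifnot(c00 + c01 + c10 + c11 == n)

# Long format: two rows per subject, in measurement order
pattern    <- rep(c("00", "01", "10", "11"),
                  times = c(c00, c01, c10, c11))
DATA_LONG3 <- data.frame(
  userId = factor(rep(seq_len(n), each = 2)),
  PB     = as.integer(unlist(strsplit(pattern, "")))
)

# Refit using adaptive Gauss-Hermite quadrature with 8 points
# instead of the default Laplace approximation (nAGQ = 1)
if (requireNamespace("lme4", quietly = TRUE)) {
  FirstModel <- lme4::glmer(PB ~ 1 + (1 | userId),
                            data = DATA_LONG3,
                            family = binomial,
                            nAGQ = 8)
}
```

Note that `nAGQ > 1` in `glmer` is only available for models with a single scalar random effect, which is exactly the situation here (a lone random intercept per `userId`).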