Post-Hoc Test on lmer Model – How to Perform and Interpret

Tags: lme4, nlme, post-hoc, r

This is my data frame:

Group   <- c("G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G1","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G2","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3","G3")
Subject <- c("S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15","S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15","S1","S2","S3","S4","S5","S6","S7","S8","S9","S10","S11","S12","S13","S14","S15")
Value   <- c(9.832217741,13.62390117,13.19671612,14.68552076,9.26683366,11.67886655,14.65083473,12.20969772,11.58494621,13.58474896,12.49053635,10.28208078,12.21945867,12.58276212,15.42648969,9.466436017,11.46582655,10.78725485,10.66159358,10.86701127,12.97863424,12.85276916,8.672953949,10.44587257,13.62135205,13.64038394,12.45778874,8.655142642,10.65925259,13.18336949,11.96595556,13.5552118,11.8337142,14.01763101,11.37502161,14.14801305,13.21640866,9.141392359,11.65848845,14.20350364,14.1829714,11.26202565,11.98431285,13.77216009,11.57303893)

data <- data.frame(Group, Subject, Value)

Then I fit a linear mixed-effects model to compare the three Groups on "Value", with "Subject" as the random factor:

library(lme4)
library(lmerTest)
model <- lmer(Value ~ Group + (1|Subject), data = data)
summary(model)

The results are:

Fixed effects:
            Estimate Std. Error       df t value Pr(>|t|)    
(Intercept) 12.48771    0.42892 31.54000  29.114   <2e-16 ***
GroupG2     -1.12666    0.46702 28.00000  -2.412   0.0226 *  
GroupG3      0.03828    0.46702 28.00000   0.082   0.9353    

However, how do I compare Group 2 with Group 3? What is the convention in academic articles?

Best Answer

You could use emmeans::emmeans(), lmerTest::difflsmeans(), or multcomp::glht().

I prefer emmeans (previously lsmeans).

library(emmeans)
emmeans(model, list(pairwise ~ Group), adjust = "tukey")
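If you also want the estimated marginal means themselves, an equivalent two-step sketch looks like this (the lmer.df argument is how emmeans lets you pick the degrees-of-freedom method; Kenward-Roger is already its default for lmer fits, so specifying it here is just being explicit):

library(emmeans)

# Estimated marginal means per Group (Kenward-Roger df)
emm <- emmeans(model, ~ Group, lmer.df = "kenward-roger")

# All pairwise Group contrasts with Tukey adjustment
pairs(emm, adjust = "tukey")

The G2 vs G3 comparison you are after shows up as the "G2 - G3" row of the contrast table.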

The next option is difflsmeans(). Note that difflsmeans() cannot correct for multiple comparisons, and it uses the Satterthwaite method for degrees of freedom by default rather than the Kenward-Roger method that emmeans() uses by default, so it is best to specify explicitly which method you prefer.

library(lmerTest)
difflsmeans(model, test.effs = "Group", ddf = "Kenward-Roger")
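If you go this route and still want adjusted p-values, one workaround (a sketch, not part of difflsmeans() itself; it assumes the returned table has a "Pr(>|t|)" column, which may vary across lmerTest versions) is to adjust them afterwards with p.adjust():

res <- difflsmeans(model, test.effs = "Group", ddf = "Kenward-Roger")

# Apply a Holm correction to the unadjusted pairwise p-values
p.adjust(res[, "Pr(>|t|)"], method = "holm")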

The multcomp::glht() method is described in the other answer to this question, by Hack-R.
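In case that answer is not visible here, a rough sketch of the glht() approach (not Hack-R's exact code) looks like this:

library(multcomp)

# Tukey-style all-pairwise comparisons of the Group factor
summary(glht(model, linfct = mcp(Group = "Tukey")))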

Also, you can get an overall ANOVA-style p-value for the Group effect by loading lmerTest (as you did, before fitting the model) and then calling anova().

library(lmerTest)
anova(model)  # omnibus F-test for the Group effect

Just to be clear, you intended for the Value to be assessed three times for each subject, right? It looks like Group is "within-subjects", not "between-subjects."
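A quick way to check that with the data frame above is to cross-tabulate Subject and Group; every cell should be 1 if each subject was measured once in each group:

# Each Subject x Group cell should be 1 for a fully within-subjects design
xtabs(~ Subject + Group, data = data)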