# Generalized Linear Model – Difference Between LM and GLM for Gaussian Family

Tags: generalized-linear-model, lm, normal-distribution, r

Specifically, I want to know if there is a difference between lm(y ~ x1 + x2) and glm(y ~ x1 + x2, family=gaussian). I think that this particular case of glm is equal to lm. Am I wrong?

For the specific form of model mentioned in the body of the question (i.e. lm(y ~ x1 + x2) vs glm(y ~ x1 + x2, family=gaussian)), regression and the GLM are indeed the same model. However, the title question asks something slightly more general:

Is there any difference between lm and glm for the gaussian family of glm?

To which the answer is "Yes!".

They can differ because you can also specify a link function in the GLM. This lets you fit particular forms of nonlinear relationship between $$y$$ (or rather its conditional mean) and the $$x$$-variables. While you can do this with nls as well, a GLM needs no starting values, convergence is sometimes better, and the syntax is a bit simpler.

Compare, for example, these models (you have R so I assume you can run these yourself):

x1=c(56.1, 26.8, 23.9, 46.8, 34.8, 42.1, 22.9, 55.5, 56.1, 46.9, 26.7, 33.9,
37.0, 57.6, 27.2, 25.7, 37.0, 44.4, 44.7, 67.2, 48.7, 20.4, 45.2, 22.4, 23.2,
39.9, 51.3, 24.1, 56.3, 58.9, 62.2, 37.7, 36.0, 63.9, 62.5, 44.1, 46.9, 45.4,
23.7, 36.5, 56.1, 69.6, 40.3, 26.2, 67.1, 33.8, 29.9, 25.7, 40.0, 27.5)

x2=c(12.29, 11.42, 13.59, 8.64, 12.77, 9.9, 13.2, 7.34, 10.67, 18.8, 9.84, 16.72,
10.32, 13.67, 7.65, 9.44, 14.52, 8.24, 14.14, 17.2, 16.21, 6.01, 14.23, 15.63,
10.83, 13.39, 10.5, 10.01, 13.56, 11.26, 4.8, 9.59, 11.87, 11, 12.02, 10.9, 9.5,
10.63, 19.03, 16.71, 15.11, 7.22, 12.6, 15.35, 8.77, 9.81, 9.49, 15.82, 10.94, 6.53)

y = c(1.54, 0.81, 1.39, 1.09, 1.3, 1.16, 0.95, 1.29, 1.35, 1.86, 1.1, 0.96,
1.03, 1.8, 0.7, 0.88, 1.24, 0.94, 1.41, 2.13, 1.63, 0.78, 1.55, 1.5, 0.96,
1.21, 1.4, 0.66, 1.55, 1.37, 1.19, 0.88, 0.97, 1.56, 1.51, 1.09, 1.23, 1.2,
1.62, 1.52, 1.64, 1.77, 0.97, 1.12, 1.48, 0.83, 1.06, 1.1, 1.21, 0.75)

lm(y ~ x1 + x2)
glm(y ~ x1 + x2, family=gaussian)
glm(y ~ x1 + x2, family=gaussian(link="log"))
nls(y ~ exp(b0+b1*x1+b2*x2), start=list(b0=-1,b1=0.01,b2=0.1))


Note that the first pair are the same model ($$y_i \sim N(\beta_0+\beta_1 x_{1i}+\beta_2 x_{2i},\sigma^2)\,$$), and the second pair are the same model ($$y_i \sim N(\exp(\beta_0+\beta_1 x_{1i}+\beta_2 x_{2i}),\sigma^2)\,$$); the fits are essentially the same within each pair.
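To check the agreement within the second pair concretely, you can compare the fitted coefficients directly (a quick sketch using the data above; `fit_glm` and `fit_nls` are just illustrative names):

```r
# Same nonlinear mean function fitted two ways:
# - glm with a log link minimizes the Gaussian deviance (sum of squared residuals)
#   via iteratively reweighted least squares
# - nls minimizes the same sum of squares via Gauss-Newton
fit_glm <- glm(y ~ x1 + x2, family = gaussian(link = "log"))
fit_nls <- nls(y ~ exp(b0 + b1*x1 + b2*x2),
               start = list(b0 = -1, b1 = 0.01, b2 = 0.1))

# Both target the same least-squares criterion, so the estimates
# should agree to several decimal places (small differences can
# remain because the two algorithms use different convergence criteria)
coef(fit_glm)
coef(fit_nls)
```

Since both fits minimize the same residual sum of squares, the parameter estimates coincide up to numerical tolerance; the GLM simply gets there without the user supplying starting values.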

So, in relation to the title question: you can fit a substantially wider variety of Gaussian models with a GLM than with ordinary linear regression.