You have several questions bundled together. My answer is partial and focuses on link functions versus transformations, which are more different than they might seem.
I think it's important to keep the similar but not identical ideas of transformations and link functions distinct. The introductory literature I have seen does not do an especially good job on that, probably because the authors were too smart to realise that other people could get confused. A first approximation is that the link function has a loosely similar role to transformation of the response (outcome, dependent variable), but that aside the differences are crucial.
Focus on the common and relatively simple case of trying to predict $\log Y$ rather than $Y$ with some $\beta_0 + \beta_1 X$. The crucial detail is that the regression in no sense knows about the transformation. Rather, it is your decision that it would be a good idea to transform first (I will call this the "before" step). But the regression doesn't know what you did before: it is oblivious to where the data come from and just sees some $Y_\text{different}$. The assumption about the error distribution is still that it is normal. Otherwise put, in
$$\log Y = Y_\text{different} = \beta_0 + \beta_1 X + \epsilon$$
the first equality is your private knowledge and the second is what defines the regression model. Knowing that the normal assumption about the errors corresponds to a lognormal distribution on your original scale is likewise private (and such errors would be multiplicative, not additive).
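To make the "private knowledge" point concrete, here is a minimal simulation sketch in Python with NumPy (the data and coefficient values are invented for illustration, not from the original): the least-squares fit only ever sees the transformed response, and the additive normal error on the log scale is exactly a multiplicative lognormal error on the original scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 2, n)

# Additive normal error on the log scale ...
log_y = 1.0 + 0.5 * x + rng.normal(0, 0.3, n)
# ... is a multiplicative lognormal error on the original scale of Y.
y = np.exp(log_y)

# Ordinary least squares on the transformed response: the fit only ever
# sees log(y); it has no idea any transformation happened "before".
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
```

The fitted `coef` recovers the coefficients on the log scale; nothing in the fitting step records that `y` was ever exponentiated.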
Similarly, with classical regression there is often an "after" step, in which for example you reverse the transformation to get predictions of the original $Y$, and perhaps even adjust the confidence intervals to correct for side-effects of the transformation, at least to a good approximation. But that has nothing to do with the regression. Indeed, this step is not compulsory, and sometimes it is a good idea to stay on a logarithmic scale and think on that scale. (In effect, using logarithmic units of measurement such as pH or decibels is a decision of this kind, even if it would be regarded as scientific rather than statistical.)
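One side-effect of the "after" step can be sketched with a toy simulation of my own (not from the original answer): naively exponentiating the log-scale predictions estimates the conditional median of $Y$, while under lognormal errors the conditional mean needs the familiar $\exp(s^2/2)$ adjustment (Duan's smearing estimator is a more robust alternative).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
sigma = 0.5
y = np.exp(1.0 + 0.5 * x + rng.normal(0, sigma, n))

# "Before": transform, then fit by ordinary least squares.
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
resid = np.log(y) - X @ coef
s2 = resid @ resid / (n - 2)          # residual variance on the log scale

# "After": naively reversing the transformation estimates the median of Y;
# under lognormal errors the mean needs the extra exp(s2 / 2) factor.
pred_median = np.exp(X @ coef)
pred_mean = pred_median * np.exp(s2 / 2)
```

Here `pred_mean` tracks the average of `y` closely, while `pred_median` is systematically lower, which is the kind of side-effect the "after" step has to correct for.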
Contrast this with generalised linear models -- in this example, using a logarithmic link -- in which "before", fitting and "after" stages are tightly linked, indeed inseparable as far as a data analyst is concerned. The link makes the transformation of the response unnecessary, but the model fitting automatically includes the equivalent of the "after" stage, thus yielding predictions on the original scale. The invertibility of the link is naturally crucial here.
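The inseparability of the stages can be seen by writing out the standard iteratively reweighted least squares (IRLS) recipe by hand, here for a Gaussian family with log link (a sketch with made-up data, implementing the textbook algorithm rather than calling R's `glm`): the inverse link is applied inside every iteration, so the fitted values emerge already on the original scale.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
# Multiplicative mean structure with additive (not multiplicative) noise:
# the situation where a log link is preferable to a log transform.
y = np.exp(1.0 + 0.5 * x) + rng.normal(0, 0.2, n)

# IRLS for a Gaussian GLM with log link: weights
# w = (dmu/deta)^2 / V(mu) = mu^2 (since V(mu) = 1), working response
# z = eta + (y - mu) / (dmu/deta).
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)  # rough starting values
for _ in range(50):
    eta = X @ beta
    mu = np.exp(eta)                # inverse link, applied inside the fit
    w = mu ** 2
    z = eta + (y - mu) / mu
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, X.T @ (w * z))

fitted = np.exp(X @ beta)           # already on the original scale of Y
```

No separate back-transformation step is needed afterwards, which is the sense in which "before", fitting, and "after" are fused in a GLM.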
All this refers only to transformations of the response. Using a generalised linear model can still mean transforming predictor variables if that is appropriate.
I've found Lane's paper to be very helpful as a fairly informal but trustworthy discussion.
Lane, P.W. 2002. Generalized linear models in soil science. European Journal of Soil Science 53: 241–251. doi: 10.1046/j.1365-2389.2002.00440.x
The documentation clears this up. About halfway through, it describes the object `glm` returns, which has the component `fitted.values`:

> the fitted mean values, obtained by transforming the linear predictors by the inverse of the link function.

When you do `n1$fit`, R does partial name matching and gives you `n1$fitted.values`.
This has been answered on the R help list by Adelchi Azzalini: the important point is that the dispersion parameter (which is what distinguishes an exponential distribution from the more general Gamma distribution) does not affect the parameter estimates in a generalized linear model, only the standard errors of the parameters (and hence the confidence intervals, p-values, etc.). In R, an estimate of the dispersion parameter is automatically reported, but, as Azzalini comments, `summary.glm` allows the user to specify the dispersion parameter. So, as stated by Azzalini: [Azzalini had `family=Gamma`, i.e. using the default inverse link; I changed it to specify the log link as in your question.]
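A small simulation may make the point vivid (my own invented data, implementing the standard Gamma/log-link IRLS by hand rather than calling R's `glm`): the dispersion parameter never appears in the coefficient updates, only in the covariance, so it can rescale standard errors without moving the estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
shape = 5.0                               # Gamma shape; dispersion = 1/shape
mu_true = np.exp(0.3 + 0.7 * x)
y = rng.gamma(shape, mu_true / shape)     # mean mu_true, Var = mu^2 / shape

# IRLS for a Gamma GLM with log link. With V(mu) = mu^2 and dmu/deta = mu,
# the weights are exactly 1, and nowhere does the dispersion enter the
# update, so the coefficient estimates cannot depend on it.
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
for _ in range(50):
    eta = X @ beta
    mu = np.exp(eta)
    z = eta + (y - mu) / mu               # working response
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)

# The dispersion phi enters only through the covariance phi * (X'X)^{-1}:
# changing phi rescales the standard errors but not beta itself, which is
# why summary.glm can let the user plug in a different dispersion.
phi_hat = np.sum(((y - mu) / mu) ** 2) / (n - 2)   # Pearson estimate
se = np.sqrt(np.diag(phi_hat * np.linalg.inv(X.T @ X)))
```

Fixing the dispersion at 1 (the exponential special case) would change `se` but leave `beta` untouched, which is exactly Azzalini's point.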