Solved – Complementary log-log regression

generalized linear model, r

Can someone explain the general approach to conducting complementary log-log regression in R? I can hardly find any information on it. I was able to run a complementary log-log model, but I don't know how to interpret the results. R gives me z-values. Is the next step to calculate probabilities? Is it comparable to logistic regression?

I would really like to know more about the complementary log-log procedure, and examples would be the cherry on top of the cake.

Best Answer

R's binomial() family already includes the complementary log-log link, so you can fit the model directly with glm() by setting link = "cloglog". Here is an example:

library(tidyverse)   # loaded here for ggplot2 plotting below

# simulate some binary data
x   = seq(-2, 2, 0.01)
eta = 0.2*x - 0.8           # linear predictor
p   = 1/(1 + exp(-eta))     # true probabilities (note: eta, not x)
y   = rbinom(length(p), 1, p)

# the complementary log-log link is log(-log(1 - p));
# R already provides it, so pass link = "cloglog" to binomial()
loglog = glm(y ~ x, family = binomial(link = "cloglog"))
logit  = glm(y ~ x, family = binomial())
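
Regarding the z-values and probabilities mentioned in the question: the z-values reported by summary() are the usual Wald tests of the coefficients, exactly as in logistic regression, and fitted probabilities come from predict(). A brief usage sketch (not part of the original answer):

summary(loglog)                      # coefficient table with estimates, standard errors and z-values
predict(loglog, type = "response")   # fitted probabilities for the observed x
predict(loglog, newdata = data.frame(x = 1), type = "response")   # predicted probability at x = 1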

Here is what the two fitted models look like when plotted over these data:

[Figure: fitted cloglog and logit probability curves over the simulated 0/1 data]
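
The plotting code isn't shown in the original answer, but a minimal ggplot2 sketch using the objects defined above might look like this:

# overlay the fitted probability curves from both models on the simulated outcomes
fits = tibble(
  x       = x,
  cloglog = predict(loglog, type = "response"),
  logit   = predict(logit,  type = "response")
) %>%
  pivot_longer(-x, names_to = "model", values_to = "fitted_p")

ggplot() +
  geom_point(aes(x = x, y = y), alpha = 0.2) +
  geom_line(data = fits, aes(x = x, y = fitted_p, colour = model)) +
  labs(y = "P(y = 1)")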

As far as interpreting the results goes, the difference a one-unit increase in $x$ makes on the link scale is

$$\log(-\log(1-p(x+1) ) ) - \log(- \log(1-p(x))) = \beta_1 $$

which through some algebra is

$$\log\left( \dfrac{\log(1-p(x+1))}{\log(1-p(x))} \right) = \beta_1 $$

(or at least I think it is), which isn't super intelligible to me on its own.
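
One standard way to make this more interpretable (a well-known property of the cloglog link, not something stated in the original answer) is to exponentiate both sides: since $-\log(1-p(x)) = \exp(\beta_0 + \beta_1 x)$, a one-unit increase in $x$ multiplies $-\log(1-p(x))$ by $\exp(\beta_1)$, so

$$1 - p(x+1) = \bigl(1 - p(x)\bigr)^{\exp(\beta_1)} $$

In grouped survival terms, $1-p$ is a survival probability and $\exp(\beta_1)$ plays the role of a hazard ratio, which is why the cloglog link is popular for interval-censored or grouped survival data.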