Solved – Relationship between $\beta_1$ and odds in simple logistic regression

Tags: interpretation, logistic, odds

I am taking a course in logistic regression, and my class is about to finish its discussion of simple logistic regression. My professor said that the following statement is correct:

For one unit increase in x, the odds(event) is increased (decreased) by the factor $\exp(β_1)/(1-\exp(β_1))$ when $β_1$ is positive (negative).

I understand all of this except for the $(1-\exp(\beta_1))$ part. Why wouldn't the $\rm odds(event)$ just decrease by $\exp(\beta_1)$?

Here $\rm odds(event) = P(event)/(1-P(event))$, and $x$ is a continuous predictor.

Best Answer

For inference in logistic regression, it is easier to think in terms of log odds instead of odds. A simple logistic regression model is a generalized linear model of the form $$ \newcommand{\logit}{\rm logit} \newcommand{\odds}{\rm odds} \newcommand{\expit}{\rm expit} \logit(\pi_i) = \beta_0 + \beta_1X_{i1}, $$ where $\logit(\pi_i) = \log\left(\frac{\pi_i}{1-\pi_i}\right) = \log(\odds(\pi_i))$ and $\pi_i = P(Y_i = 1)$ for a Bernoulli-distributed response $Y_i$. Notice that the right-hand side has exactly the same form as linear regression; only the left-hand side is a transformation (the logit) of the success probability. This gives the intuitive interpretation that for every one-unit increase in $X_{i1}$, the expected log odds in favor of $Y_i = 1$ increase by $\beta_1$.
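To make the one-unit interpretation explicit, write $\pi(x)$ for the success probability at predictor value $x$ (notation introduced here only for this short derivation):
$$
\logit\bigl(\pi(x+1)\bigr) - \logit\bigl(\pi(x)\bigr) = \bigl[\beta_0 + \beta_1(x+1)\bigr] - \bigl[\beta_0 + \beta_1 x\bigr] = \beta_1 .
$$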

If you want to interpret the model in terms of odds, you just exponentiate both sides of the logit equation, which is what you initially assumed. This gives $\odds(\pi_i) = \exp(\beta_0 + \beta_1X_{i1})$, and not $\odds(\pi_i) = \frac{\exp(\beta_0 + \beta_1X_{i1})}{1 + \exp(\beta_0 + \beta_1X_{i1})}$. The interpretation in this case is that for every one-unit increase in $X_{i1}$, the expected odds in favor of $Y_i = 1$ are multiplied by the factor $\exp(\beta_1)$. I suspect either you or your professor mixed up odds and probability in that statement.
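Concretely, the ratio of the odds at $x+1$ to the odds at $x$ is
$$
\frac{\odds\bigl(\pi(x+1)\bigr)}{\odds\bigl(\pi(x)\bigr)} = \frac{\exp\bigl(\beta_0 + \beta_1(x+1)\bigr)}{\exp\bigl(\beta_0 + \beta_1 x\bigr)} = \exp(\beta_1),
$$
so the odds change multiplicatively by $\exp(\beta_1)$; the factor $\exp(\beta_1)/(1-\exp(\beta_1))$ from the quoted statement does not arise anywhere in this calculation.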

If you want to draw inference on probabilities, you need to invert the $\logit$ transformation by taking the $\expit$ of both sides, where $\expit(x) = \frac{\exp(x)}{1 + \exp(x)}$. In that case, we have $\pi_i = \frac{\exp(\beta_0 + \beta_1X_{i1})}{1 + \exp(\beta_0 + \beta_1X_{i1})}$ as our expected probability that $Y_i=1$. In this case, it is difficult to draw inference in terms of probabilities, which is why we transform back to probabilities by taking the expit of our expected log odds.
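If it helps to see the odds-ratio interpretation numerically, here is a minimal Python sketch (assuming `numpy` and `statsmodels` are available; the simulated coefficients and variable names are made up purely for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
beta0, beta1 = -1.0, 0.7

# true probabilities via the expit (inverse logit)
p = 1 / (1 + np.exp(-(beta0 + beta1 * x)))
y = rng.binomial(1, p)

X = sm.add_constant(x)            # design matrix with intercept column
fit = sm.Logit(y, X).fit(disp=0)  # simple logistic regression
b0_hat, b1_hat = fit.params

# fitted odds at x0 and x0 + 1; their ratio equals exp(b1_hat)
x0 = 0.5
odds = lambda xv: np.exp(b0_hat + b1_hat * xv)
print(odds(x0 + 1) / odds(x0))    # multiplicative change in odds
print(np.exp(b1_hat))             # same number: the estimated odds ratio
```

The two printed values agree for any choice of `x0`, which is exactly the statement that a one-unit increase in $x$ multiplies the odds by $\exp(\beta_1)$, while the change in probability from the expit depends on where $x$ starts.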