An odds ratio of 2 means that the event is twice as probable given a one-unit increase in the predictor.
It means the odds would double, which is not the same as the probability doubling.
In Cox regression, a hazard ratio of 2 means the event will occur twice as often at each time point given a one-unit increase in the predictor.
Aside from a bit of hand-waving, yes: the rate of occurrence doubles. It's like a scaled instantaneous probability.
Are these not practically the same thing?
They're almost the same thing when doubling the odds of the event is almost the same as doubling the hazard of the event. They're not automatically similar, but under some (fairly common) circumstances they may correspond very closely.
You may want to consider the difference between odds and probability more carefully.
See, for example, the first sentence here, which makes it clear that odds are the ratio of a probability to its complement. So, for example, increasing the odds (in favor) from 1 to 2 is the same as the probability increasing from $\frac{1}{2}$ to $\frac{2}{3}$: odds increase faster than probability does. For very small probabilities, odds-in-favor and probability are very similar, while odds-against become increasingly similar to the reciprocal of the probability (in the sense that their ratio goes to 1) as the probability gets small. An odds ratio is simply the ratio of two odds. Increasing the odds ratio while holding the base odds constant corresponds to increasing the other odds, but this may or may not be close to the relative change in probability.
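To make the arithmetic concrete, here is a minimal sketch in Python (the `odds` helper is just for illustration):

```python
# Odds vs. probability: a quick numeric check.
def odds(p):
    return p / (1 - p)

# Doubling the odds from 1 to 2 moves the probability from 1/2 to 2/3, not to 1.
print(odds(1/2), odds(2/3))   # 1.0, ~2.0

# For very small probabilities, odds-in-favor and probability nearly coincide.
for p in (0.1, 0.01, 0.001):
    print(p, odds(p))         # the ratio approaches 1 as p -> 0
```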
You may also want to ponder the difference between hazard and probability (see my earlier mention of hand-waving; now we don't gloss over the difference). For example, a probability of 0.6 cannot be doubled, but an instantaneous hazard of 0.6 can be doubled to 1.2. They're not the same thing, in the same way that probability density is not probability.
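Here is a minimal sketch of that point, assuming a constant hazard (an exponential time-to-event model, which Cox regression does not require, but which makes the distinction concrete):

```python
import math

# Under a constant hazard h, P(T <= t) = 1 - exp(-h * t).
# Doubling the hazard doubles the instantaneous rate, but it does not
# double the probability of the event by time t: probabilities cap at 1.
def event_prob_by(t, hazard):
    return 1 - math.exp(-hazard * t)

t = 1.0
print(event_prob_by(t, 0.6))   # ~0.451
print(event_prob_by(t, 1.2))   # ~0.699, not 2 * 0.451
```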
The misunderstanding is in (1). In fact $\exp(\beta+\beta_1\cdot 1)\neq \frac{\frac{p_1}{1-p_1}}{\frac{p_0}{1-p_0}}$.
You already know that $\log(\frac{p_1}{1-p_1}) = \beta + \beta_1\cdot 1$, so
$\exp(\beta+\beta_1\cdot 1)=\frac{p_1}{1-p_1}$, which is an odds, not an odds ratio.
The $\exp(\beta_1)$ itself, however, is indeed an odds ratio, since $OR=\frac{p_1/(1-p_1)}{p_0/(1-p_0)}=\frac{\exp(\beta+\beta_1\cdot 1)}{\exp(\beta+\beta_1\cdot 0)}=e^{\beta+\beta_1-\beta-0\cdot\beta_1}=\exp(\beta_1)$, assuming you have a binary (dummy) predictor, or a one-unit change when the predictor is continuous.
Also note that $\exp(\beta) = $ odds ratio $= \frac{\frac{p_1}{1-p_1}}{\frac{p_0}{1-p_0}}$ is not correct either: $\exp(\beta)$ is the baseline odds $\frac{p_0}{1-p_0}$, i.e. the odds when the predictor equals $0$.
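If you want to check this numerically, here is a minimal sketch using `statsmodels` with simulated data (the probabilities and sample size are made up for illustration):

```python
import numpy as np
import statsmodels.api as sm

# Simulate a binary predictor with p_0 = 0.25 (x = 0) and p_1 = 0.5 (x = 1),
# so the true baseline odds are 1/3 and the true odds ratio is 3.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=5000)
y = rng.binomial(1, np.where(x == 1, 0.5, 0.25))

fit = sm.Logit(y, sm.add_constant(x.astype(float))).fit(disp=0)
b, b1 = fit.params

odds0 = y[x == 0].mean() / (1 - y[x == 0].mean())
odds1 = y[x == 1].mean() / (1 - y[x == 1].mean())
print(np.exp(b), odds0)           # exp(beta) is the baseline odds, ~1/3
print(np.exp(b1), odds1 / odds0)  # exp(beta_1) is the odds ratio, ~3
```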
Best Answer
For inference in logistic regression, it is easier to think in terms of log odds instead of odds. A simple logistic regression model is a generalized linear model with the form $$ \newcommand{\logit}{\rm logit} \newcommand{\odds}{\rm odds} \newcommand{\expit}{\rm expit} \logit(\pi_i) = \beta_0 + \beta_1X_{i1}, $$ where $\logit(\pi_i) = \log(\frac{\pi_i}{1-\pi_i}) = \log(\odds(\pi_i))$. Notice that the right-hand side has exactly the same form as linear regression; the dependent variable is Bernoulli-distributed, and its mean is modeled on the logit scale. This gives the more intuitive interpretation that for every one-unit increase in $X_{i1}$, the expected log odds in favor of $Y_i = 1$ increase by $\beta_1$.
If you want to interpret the model in terms of odds, you just have to exponentiate the logit, which is what you initially assumed. This gives you $\odds(\pi_i) = \exp(\beta_0 + \beta_1X_{i1})$, and not $\odds(\pi_i) = \frac{\exp(\beta_0 + \beta_1X_{i1})}{1 + \exp(\beta_0 + \beta_1X_{i1})}$. The interpretation in this case is that for every one-unit increase in $X_{i1}$, the expected odds in favor of $Y_i = 1$ are multiplied by $\exp(\beta_1)$, not increased additively. I think either you or your professor got odds and probability mixed up in your professor's explanation.
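As a quick numeric illustration with made-up coefficients, each one-unit step in $X_{i1}$ multiplies the odds by the same factor $\exp(\beta_1)$:

```python
import numpy as np

# Hypothetical coefficients, chosen only for illustration.
beta0, beta1 = -1.0, 0.7
x = np.array([0.0, 1.0, 2.0])
odds = np.exp(beta0 + beta1 * x)

# Consecutive ratios all equal exp(beta_1): the odds scale is multiplicative.
print(odds[1] / odds[0], odds[2] / odds[1], np.exp(beta1))
```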
If you want to draw inference on probabilities, you need to invert the $\logit$ transformation by taking the $\expit$ of both sides, where $\expit(x) = \frac{\exp(x)}{1 + \exp(x)}$. In that case, we have $\pi_i = \frac{\exp(\beta_0 + \beta_1X_{i1})}{1 + \exp(\beta_0 + \beta_1X_{i1})}$ as our expected probability that $Y_i=1$. Inference is difficult to carry out directly on the probability scale, which is why we work with log odds and only transform back to probabilities at the end, by taking the expit of the expected log odds.
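Continuing with the same hypothetical coefficients, the expit makes the last point visible: the log odds change by a constant $\beta_1$ per unit of $X_{i1}$, but the change in probability depends on where you start.

```python
import numpy as np

def expit(z):
    return np.exp(z) / (1 + np.exp(z))

# Hypothetical coefficients as above; each step adds beta_1 = 0.7 to the
# log odds, yet the probability increments below are not constant.
beta0, beta1 = -1.0, 0.7
for x in (0.0, 1.0, 2.0, 3.0):
    print(x, expit(beta0 + beta1 * x))   # 0.269, 0.426, 0.599, 0.750
```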