As you're going through this, keep in mind that the interpretation of a "unit change in a logarithm" as a "percent change" is a local approximation.
1.
You're looking at a percent change in percentage points. Say $x$ measures how full a glass of water is. Some glasses are 25% full, others are 26% full. Un-logged, a 1-unit change in $x$ (i.e. moving from 25% to 26%) is associated with a $b$-unit change in $y$. The fact that the unit is a percentage point is irrelevant here.
Now take the log of $x$ and $y$. A 1-unit change in $\log{x}$ is associated with a $b$-unit change in $\log{y}$. So in the percent-change interpretation, a 1-percent change in $x$ is associated with a $b$-percent change in $y$. That is, moving from a glass that is 25% full to one that is 25.25% full is associated with a $b$% change in $y$.
What if $x$ is already a percent change in something else? Let's say, instead of "glass fullness," $x$ is now how much water has evaporated from a glass over some period of time, measured as a percentage of the original water level. Then a 1% change in $x$, i.e. going from 25% change to 25.25% change, is associated with a $b$% change in $y$.
Is that meaningful? Sure, if it's what you want to model. And chances are good that taking a logarithm to "correct skew" is unnecessary for the independent variable in a regression.
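As a quick sanity check on that local approximation, here is a minimal R sketch; the simulated data and the true elasticity of 0.7 are illustrative assumptions, not from the question:

```r
# Simulated log-log data: true elasticity is 0.7 (illustrative assumption)
set.seed(1)
x <- runif(200, 10, 100)
y <- 5 * x^0.7 * exp(rnorm(200, sd = 0.1))

b <- unname(coef(lm(log(y) ~ log(x)))[2])   # fitted elasticity, close to 0.7

# Effect of a 1% increase in x on the fitted y:
exact_pct  <- (1.01^b - 1) * 100   # exact multiplicative effect
approx_pct <- b                    # the usual "b percent" reading
c(exact = exact_pct, approx = approx_pct)   # nearly identical for small changes
```

For larger changes the approximation breaks down: doubling $x$ multiplies $\hat{y}$ by $2^b$, which is not the same as adding $100b$ percent.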
2.
Recall that $\log{u}-\log{v}=\log{u/v}$. So in the "percent change" interpretation, a 1% increase in the ratio of $x_t$ and $x_{t-1}$ is associated with a $b_1$% increase in the ratio of $y_t$ and $y_{t-1}$. This is a slightly messier case than before, but it's still a percent change in percentage points as above. Let's say $x_t=1$ and $x_{t-1}=2$. Then their ratio is $0.5$. Moving from $\log{0.5}$ to $\log{0.5}+1$ is the same thing as moving that ratio from $0.5$ to $0.5e^{1}=0.5e$, since $\log{e^1}=1$. By the same reasoning, this is associated with moving the $y$ ratio from $r$ to $re^{b_1}$.
This, of course, is completely different from taking the logarithm of the first differences.
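The ratio arithmetic above can be checked directly in R (the numbers are the ones from the paragraph):

```r
# log(u) - log(v) = log(u/v), with the numbers from the text
x_t  <- 1
x_t1 <- 2
r <- x_t / x_t1                                  # ratio = 0.5
stopifnot(all.equal(log(x_t) - log(x_t1), log(r)))

# A 1-unit increase on the log scale multiplies the ratio by e
exp(log(r) + 1)        # = 0.5 * exp(1), i.e. the ratio moves from 0.5 to 0.5e
```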
3.
There's no "bias" to correct for. I'm going to assume you mean to ask whether the `predict` functions automatically transform the data back to the original scale. They don't.
R's built-in `lm` function doesn't (and in some sense can't, and probably shouldn't) keep track of any transformations you apply to your variables, so `predict` will just plug whatever data you feed it into the fitted line. If you fit `l = lm(log(y) ~ log(x))`, then `predict(l, newdata)` will give you $\widehat{\log{y}}$, not $\hat{y}$: the `log(x)` in the formula is applied to `newdata` for you, but nothing transforms the prediction back from the log scale (and if you pre-compute the logged variables yourself, as in `lm(logy ~ logx)`, `predict` will assume the data you feed it is already on the log scale). That doesn't mean you can't write a wrapper function for `lm` that keeps track of such transformations and a corresponding `predict` method that undoes them, but that's one for StackOverflow.
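Here is a minimal sketch of doing the back-transformation by hand; the data and variable names are made up for illustration:

```r
# Illustrative data (not from the question)
set.seed(42)
d <- data.frame(x = runif(50, 1, 10))
d$y <- 2 * d$x^1.5 * exp(rnorm(50, sd = 0.1))

l <- lm(log(y) ~ log(x), data = d)

new <- data.frame(x = 5)                 # x on the ORIGINAL scale: the log(x)
                                         # in the formula is applied for us
log_yhat <- predict(l, newdata = new)    # prediction is on the log(y) scale
yhat <- exp(log_yhat)                    # naive back-transform
```

Note that simply exponentiating $\widehat{\log{y}}$ gives something like a conditional median rather than a conditional mean of $y$; correcting for that is the "retransformation" problem, which is a separate question.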
The same is even more true in Stata, where a command like `reg log(y) log(x)` is downright invalid. You have to first do something like `gen logx = log(x)` and `gen logy = log(y)`, and finally `reg logy logx`. So `predict yhat` will, as in R, return a log-scale variable and assume you are feeding it a log-scale variable.
An odds ratio is the exponential of the corresponding regression coefficient:
$$\text{odds ratio} = e^{\hat\beta}$$
For example, if the logistic regression coefficient is $\hat\beta=0.25$ the odds ratio is $e^{0.25} = 1.28$.
The odds ratio is the multiplier that shows how the odds change for a one-unit increase in the value of $X$: the odds are multiplied by a factor of 1.28. So if the initial odds were, say, $0.25$, the odds after a one-unit increase in the covariate become $0.25 \times 1.28 = 0.32$.
Another way to interpret the odds ratio is to look at its fractional part and read it as a percentage change. For example, an odds ratio of 1.28 corresponds to a 28% increase in the odds for a 1-unit increase in the corresponding $X$.
In case we are dealing with a decreasing effect (OR < 1), for example an odds ratio of 0.94, there is a 6% decrease in the odds for a 1-unit increase in the corresponding $X$.
The formula is:
$$ \text{Percent Change in the Odds} = \left( \text{Odds Ratio} - 1 \right) \times 100 $$
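Putting the odds-ratio arithmetic above into a few lines of R (the coefficients are the example values from the text):

```r
beta <- 0.25
or <- exp(beta)                 # odds ratio, about 1.28

# Odds after a one-unit increase in X, starting from odds of 0.25
odds_before <- 0.25
odds_after  <- odds_before * or # about 0.32

# Percent change in the odds: (OR - 1) * 100
(or - 1) * 100                  # about +28%
(0.94 - 1) * 100                # OR = 0.94 -> -6%, a 6% decrease
```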
Best Answer
Let's look at some particular toy data in R which also gives a coefficient of about $0.71$. What this is saying is that the maximum likelihood logistic curve has predicted log-odds of $$\widehat{\log(\text{odds})} = 0.4251 + 0.7102\,x$$
So when $x=0$ you would get predicted log-odds of $0.4251$, odds of $1.5297$ and a probability of $0.6047$; when $x=1$ you would get predicted log-odds of $1.1352$ ($0.7102$ more), odds of $3.1118$ (multiplying by $e^{0.7102}=2.0343$) and a probability of $0.7568$.
A $1$ percentage point increase in $x$ increases the predicted log-odds by $0.01 \times 0.7102 = 0.007102$ and multiplies the predicted odds by $2.0343^{0.01}=1.00713$. You cannot make such a simple statement about the predicted probabilities.
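All of the quoted numbers can be reproduced directly from the two fitted coefficients:

```r
b0 <- 0.4251                 # intercept: log-odds at x = 0
b1 <- 0.7102                 # slope: change in log-odds per unit of x

exp(b0)                      # odds at x = 0, about 1.5297
plogis(b0)                   # probability at x = 0, about 0.6047
exp(b1)                      # odds multiplier per unit of x, about 2.034
plogis(b0 + b1)              # probability at x = 1, about 0.7568

0.01 * b1                    # log-odds change per 1 percentage point of x
exp(0.01 * b1)               # odds multiplier, about 1.00713
```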