You know that in a logit:
$$Pr[y = 1 \vert x,z] = p = \frac{\exp (\alpha + \beta \cdot \ln x + \gamma z)}{1+\exp (\alpha + \beta \cdot \ln x + \gamma z )}. $$
After some tedious calculus and simplification, the partial of that with respect to $x$ becomes:
$$ \frac{\partial Pr[y=1 \vert x,z]}{\partial x} = \frac{\beta}{x} \cdot p \cdot (1-p). $$
This is (sort of) equivalent to
$$\frac{\Delta p}{\Delta x}=\frac{\beta}{x} \cdot p \cdot (1-p),$$
which can be re-written as
$$\frac{\Delta p}{100 \cdot \frac{ \Delta x}{x}}= \frac{\beta \cdot p \cdot (1-p)}{100}.$$
This is the definition of semi-elasticity, and can be interpreted as the change in probability for a 1% change in $x$.
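As a quick numerical sanity check of that formula, you can compare $\frac{\beta}{100} \cdot p \cdot (1-p)$ against the probability change from directly bumping $x$ by 1%. The coefficients and covariate values below are made up for illustration only:

```python
import math

# Hypothetical logit coefficients and covariate values (illustration only)
alpha, beta, gamma = -41.3, 6.85, -0.006
x, z = 6000.0, 3000.0

def p_of_x(x):
    """Pr[y=1 | x, z] from a logit with ln(x) as a regressor."""
    xb = alpha + beta * math.log(x) + gamma * z
    return math.exp(xb) / (1 + math.exp(xb))

p = p_of_x(x)

# Semi-elasticity: change in probability for a 1% change in x
semi = beta * p * (1 - p) / 100

# Direct check: bump x by 1% and look at the change in probability
dp = p_of_x(x * 1.01) - p
print(semi, dp)  # the two should be very close
```

The agreement is only approximate because the derivative is evaluated at a point while the 1% bump is a finite change.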
Here's an example in Stata.* Note that I am using margins instead of the out-of-date mfx to get the average marginal effect of $x$, $\frac{1}{N}\sum_{i=1}^N\frac{\beta \cdot p_i \cdot (1-p_i)}{100}$:
. sysuse auto, clear
(1978 Automobile Data)
. gen ln_price = ln(price)
. logit foreign ln_price mpg weight, nolog
Logistic regression Number of obs = 74
LR chi2(3) = 57.69
Prob > chi2 = 0.0000
Log likelihood = -16.185932 Pseudo R2 = 0.6406
------------------------------------------------------------------------------
foreign | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ln_price | 6.851215 2.11763 3.24 0.001 2.700737 11.00169
mpg | -.0880842 .1031317 -0.85 0.393 -.2902186 .1140503
weight | -.0062268 .0017269 -3.61 0.000 -.0096115 -.0028422
_cons | -41.32383 16.24003 -2.54 0.011 -73.15371 -9.493947
------------------------------------------------------------------------------
. margins, expression(_b[ln_price]*predict()*(1-predict())/100)
Predictive margins Number of obs = 74
Model VCE : OIM
Expression : _b[ln_price]*predict()*(1-predict())/100
------------------------------------------------------------------------------
| Delta-method
| Margin Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
_cons | .0046371 .0007965 5.82 0.000 .003076 .0061982
------------------------------------------------------------------------------
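The expression() option above is just averaging $\frac{\beta \cdot p_i \cdot (1-p_i)}{100}$ over the estimation sample. A minimal Python sketch of that average, using hypothetical fitted probabilities rather than the actual predictions from the model:

```python
# Average semi-elasticity effect, computed by hand.
# beta stands in for _b[ln_price]; the p_i are hypothetical fitted
# probabilities, not the predictions from the regression above.
beta = 6.85
p = [0.05, 0.10, 0.30, 0.55, 0.80, 0.95]

# AME = (1/N) * sum_i of beta * p_i * (1 - p_i) / 100
ame = sum(beta * pi * (1 - pi) / 100 for pi in p) / len(p)
print(ame)
```

Because $p_i(1-p_i)$ is largest near $p_i = 0.5$, observations with middling probabilities contribute the most to the average.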
This means that for a 1% increase in price, the probability that a car is foreign increases by roughly 0.005 on a [0,1] scale, so a 10% increase in price gives you about a 0.05 increase. In these data, about 0.3 of the cars are foreign, so these effects are economically and statistically significant.
Edit:
A good way to do this in Stata 10 is to install the user-written command margeff:
. margeff, dydx(ln_price) replace
Average partial effects after margeff
y = Pr(foreign)
------------------------------------------------------------------------------
variable | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ln_price | .4637103 .0796514 5.82 0.000 .3075964 .6198241
mpg | -.0059616 .006781 -0.88 0.379 -.0192522 .007329
weight | -.0004214 .0000417 -10.11 0.000 -.0005031 -.0003398
------------------------------------------------------------------------------
. lincom _b[ln_price]/100
( 1) .01*ln_price = 0
------------------------------------------------------------------------------
variable | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
(1) | .0046371 .0007965 5.82 0.000 .003076 .0061982
------------------------------------------------------------------------------
*This is actually not a great empirical example since the relationship in the data has an inverted-U shape.
Best Answer
In a probit model,
$$p(y=1 \vert d,\ln x)=\Phi(\alpha + \beta \cdot \ln x + \gamma \cdot d),$$
where $\Phi()$ is the standard normal CDF. Taking the derivative gets you
$$\frac{\partial p}{\partial x}=\varphi(\alpha + \beta \cdot \ln x + \gamma \cdot d) \cdot \beta \cdot\frac{1}{x},$$
where $\varphi()$ is the standard normal pdf. This derivative can be re-arranged as $$\frac{\partial p}{\partial x} \cdot \frac{x}{100}=\varphi(\alpha + \beta \cdot \ln x + \gamma \cdot d) \cdot \frac{\beta}{100}.$$
The right-hand side is a semi-elasticity: it gives you the change in the probability of success for a 1% change in $x$. You can see that a bit more clearly if you rewrite it as:
$$\frac{\partial p}{\partial x} \cdot \frac{x}{100}=\frac{\Delta p}{100 \cdot \Delta x /x}.$$
If you want the full elasticity, you need to divide by $p$ instead (the 100 goes away since you now have a percentage change in both the numerator and the denominator):
$$\frac{\partial p}{\partial x} \cdot \frac{x}{p}=\frac{\Delta p/p}{ \Delta x /x}=\frac{\varphi(\alpha + \beta \cdot \ln x + \gamma \cdot d) \cdot \beta}{\Phi(\alpha + \beta \cdot \ln x + \gamma \cdot d)}.$$
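These probit formulas can be checked numerically. The sketch below uses made-up coefficients; $\Phi$ and $\varphi$ are built from the standard library's math.erf, so no extra packages are needed:

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical probit coefficients and covariate values (illustration only)
alpha, beta, gamma = -2.0, 0.5, 1.0
x, d = 10.0, 1

xb = alpha + beta * math.log(x) + gamma * d
semi = phi(xb) * beta / 100         # change in p for a 1% change in x
elasticity = phi(xb) * beta / Phi(xb)  # % change in p for a 1% change in x

# Cross-check the semi-elasticity against a direct 1% bump in x
p0 = Phi(xb)
p1 = Phi(alpha + beta * math.log(x * 1.01) + gamma * d)
print(semi, p1 - p0)  # should nearly agree
```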
For binary variables, you can calculate the marginal effect as $$p(y=1 \vert d=1,\ln x)-p(y=1 \vert d=0,\ln x)=\Phi(\alpha + \beta \cdot \ln x + \gamma \cdot 1)-\Phi(\alpha + \beta \cdot \ln x + \gamma \cdot 0).$$
This is arguably better since you are considering a discrete change in $d$ from 0 to 1 rather than an infinitesimal change, so the finite difference is superior to the derivative.
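That discrete change can be sketched the same way, again with hypothetical coefficients:

```python
import math

def Phi(z):
    """Standard normal cdf, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical probit coefficients (illustration only)
alpha, beta, gamma = -2.0, 0.5, 1.0
x = 10.0

xb0 = alpha + beta * math.log(x)           # index with d = 0
me_discrete = Phi(xb0 + gamma) - Phi(xb0)  # effect of d going from 0 to 1
print(me_discrete)
```

Note that the size of this effect depends on where $\alpha + \beta \cdot \ln x$ puts you on the cdf, so it varies across observations and is typically averaged over the sample.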
In Stata, you can calculate averages of these quantities over the estimation sample with margins; the older mfx command has been superseded by margins.