Here's an explanation of what contrasting margins means.
Let's fit a toy Tobit model (you could also use intreg), where we interact the foreign dummy with weight:
sysuse auto, clear
generate wgt=weight/1000
tobit mpg i.foreign##c.wgt c.headroom, ll(17) ul(30)
This yields:
Tobit regression Number of obs = 74
LR chi2(4) = 91.39
Prob > chi2 = 0.0000
Log likelihood = -138.22086 Pseudo R2 = 0.2484
-------------------------------------------------------------------------------
mpg | Coef. Std. Err. t P>|t| [95% Conf. Interval]
--------------+----------------------------------------------------------------
foreign |
Foreign | 10.16688 5.332589 1.91 0.061 -.468634 20.80239
wgt | -6.120729 .8351949 -7.33 0.000 -7.786473 -4.454986
|
foreign#c.wgt |
Foreign | -5.356987 2.229552 -2.40 0.019 -9.803689 -.9102848
|
headroom | -.5758296 .503259 -1.14 0.256 -1.579548 .4278888
_cons | 41.58485 2.453002 16.95 0.000 36.69249 46.47721
--------------+----------------------------------------------------------------
/sigma | 2.945599 .3107564 2.325815 3.565383
-------------------------------------------------------------------------------
18 left-censored observations at mpg <= 17
49 uncensored observations
7 right-censored observations at mpg >= 30
Now we will take the derivative of mpg with respect to wgt as if all cars were foreign, and subtract from that the derivative of mpg with respect to wgt as if all cars were domestic (the r. operator means "relative to the base level of foreign"):
. margins r.foreign, dydx(wgt) predict(ystar(17,30))
Contrasts of average marginal effects
Model VCE : OIM
Expression : E(mpg*|17<mpg<30), predict(ystar(17,30))
dy/dx w.r.t. : wgt
------------------------------------------------
| df chi2 P>chi2
-------------+----------------------------------
wgt |
foreign | 1 0.43 0.5134
------------------------------------------------
------------------------------------------------------------------------
| Contrast Delta-method
| dy/dx Std. Err. [95% Conf. Interval]
-----------------------+------------------------------------------------
wgt |
foreign |
(Foreign vs Domestic) | -.3044572 .4658221 -1.217452 .6085373
------------------------------------------------------------------------
This tells you that the difference in the censored mpg-wgt slope between foreign cars and domestic cars is -.30: a 1,000 lb increase in weight is associated with an additional .30 mpg reduction in efficiency for foreign cars compared to domestic ones, but that gap is not statistically different from zero. Notice how different that is from the raw interaction coefficient on the latent outcome (-5.36 in the tobit table above).
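For intuition, here is the quantity margins is averaging when you ask for predict(ystar(17,30)): for a Tobit model censored below at $a$ and above at $b$, the slope of the censored expectation with respect to a covariate is the latent coefficient scaled by the probability mass in the uncensored region. A minimal Python sketch (ignoring the interaction for simplicity, and with hypothetical numbers rather than the estimates above):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def dystar_dx(beta_k, xb, sigma, ll, ul):
    """Slope of the censored expectation E[y*] with respect to x_k for a
    Tobit model censored at [ll, ul]: the latent slope beta_k scaled by
    the probability of falling in the uncensored region."""
    scale = Phi((ul - xb) / sigma) - Phi((ll - xb) / sigma)
    return beta_k * scale

# Hypothetical values (not the fitted auto-data estimates): latent slope
# -6 mpg per 1,000 lbs, linear prediction 22 mpg, sigma 3, limits 17/30.
slope = dystar_dx(-6.0, 22.0, 3.0, 17.0, 30.0)
```

Because the scaling factor is a probability, the censored slope is always attenuated toward zero relative to the latent slope.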
We can also do things by hand in two steps (first get the two derivatives, and then take their difference):
. margins, dydx(wgt) at(foreign=(0 1)) predict(ystar(17,30)) post
Average marginal effects Number of obs = 74
Model VCE : OIM
Expression : E(mpg*|17<mpg<30), predict(ystar(17,30))
dy/dx w.r.t. : wgt
1._at : foreign = 0
2._at : foreign = 1
------------------------------------------------------------------------------
| Delta-method
| dy/dx Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
wgt |
_at |
1 | -4.237398 .3787365 -11.19 0.000 -4.979708 -3.495088
2 | -4.541855 .2690925 -16.88 0.000 -5.069267 -4.014443
------------------------------------------------------------------------------
. lincom _b[2._at]-_b[1._at]
( 1) - [wgt]1bn._at + [wgt]2._at = 0
------------------------------------------------------------------------------
| Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
(1) | -.3044572 .4658221 -0.65 0.513 -1.217452 .6085373
------------------------------------------------------------------------------
This gets you the same answer.
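Under the hood, lincom combines the two point estimates with the joint variance-covariance matrix left behind by margins, post: $\text{Var}(d_2 - d_1) = V_{11} + V_{22} - 2V_{12}$. A small Python sketch of that arithmetic; the covariance term is set to zero here for illustration (Stata reads the actual value from e(V), which is why the reported standard error, .4658, differs slightly):

```python
from math import sqrt

def contrast(d1, d2, v11, v22, v12):
    """Difference d2 - d1 and its standard error from the joint
    variance-covariance matrix of the two estimates."""
    return d2 - d1, sqrt(v11 + v22 - 2.0 * v12)

# The two average marginal effects from the margins output above; the
# covariance v12 is set to 0 for illustration only.
diff, se = contrast(-4.237398, -4.541855, 0.3787365**2, 0.2690925**2, 0.0)
```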
margins computes standard errors for nonlinear predictions using the delta method, and, as donlelek points out, it also uses a normal approximation when computing confidence intervals.
Consider a slight variation on donlelek's example.
sysuse auto, clear
logistic foreign mpg
margins, predict(pr) nopvalues
The result from margins is:
Predictive margins Number of obs = 74
Model VCE : OIM
Expression : Pr(foreign), predict()
--------------------------------------------------------------
| Delta-method
| Margin Std. Err. [95% Conf. Interval]
-------------+------------------------------------------------
_cons | .2972973 .0487662 .2017172 .3928773
--------------------------------------------------------------
Let's call this marginal prediction $m$.
Here we verify by hand the confidence limits produced by margins:
. display .2972973 - invnormal(.975)*.0487662
.2017173
. display .2972973 + invnormal(.975)*.0487662
.3928773
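The same check in Python, using the standard-normal quantile from the standard library (this just replays the display commands above):

```python
from statistics import NormalDist

m, se = 0.2972973, 0.0487662      # margin and delta-method SE from the output
z = NormalDist().inv_cdf(0.975)   # same quantile as Stata's invnormal(.975)
low, high = m - z * se, m + z * se
```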
$m$ is the mean of the predicted probabilities. The predicted probability for observation $i$ is
$$
p_i = \frac{1}{1 + \exp(-z_i)}
$$
where $z_i$ is the corresponding linear prediction
$$
z_i = b_0 + b_1 x_i
$$
$x_i$ is the corresponding value of mpg
, and $b$ is the vector of regression coefficients. Thus $m$ is
$$
m = \frac{1}{N}\sum_{i=1}^N p_i
$$
$m$ is not a simple transformation of the marginal linear prediction. This is the general situation that margins was developed to handle. As such, the only method for computing a confidence interval for $m$ is the normal approximation using standard errors computed via the delta method. margins does not provide an option that will transform the confidence limits of a marginal linear prediction.
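For the curious, here is a sketch of the delta-method calculation for $m$: the gradient of $m$ with respect to the coefficients is $\frac{1}{N}\sum_i p_i(1-p_i)\,(1, x_i)$, and the variance of $m$ is the quadratic form of that gradient with the coefficient VCE. The coefficients, VCE, and covariate values below are hypothetical, not the auto-data fit:

```python
from math import exp

def invlogit(z):
    return 1.0 / (1.0 + exp(-z))

def margin_and_se(b0, b1, xs, V):
    """Delta-method SE for m = mean_i invlogit(b0 + b1*x_i).
    V is the 2x2 VCE of (b0, b1); the gradient of m is
    (1/N) * sum_i p_i*(1 - p_i) * (1, x_i)."""
    n = len(xs)
    ps = [invlogit(b0 + b1 * x) for x in xs]
    m = sum(ps) / n
    g0 = sum(p * (1 - p) for p in ps) / n
    g1 = sum(p * (1 - p) * x for p, x in zip(ps, xs)) / n
    var = g0 * g0 * V[0][0] + 2 * g0 * g1 * V[0][1] + g1 * g1 * V[1][1]
    return m, var ** 0.5

# Hypothetical coefficients, VCE, and covariate values (not the auto data).
V = [[0.25, -0.005], [-0.005, 0.0004]]
m, se = margin_and_se(-4.0, 0.15, [15.0, 20.0, 25.0, 30.0], V)
```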
However, donlelek's question inspired me to write transform_margins, which transforms the point estimates and confidence limits from margins results with a user-specified expression. Here is donlelek's example using transform_margins:
sysuse auto, clear
logistic foreign mpg
margins, at(mpg=(5(5)40)) predict(xb)
transform_margins invlogit(@)
Here is the output from transform_margins:
----------------------------------------------
| b ll ul
-------------+--------------------------------
_at |
1 | .0271183 .0042517 .153952
2 | .0583461 .0151856 .1993467
3 | .1210596 .0511398 .2603462
4 | .2344013 .144129 .3575909
5 | .4049667 .2710298 .5547246
6 | .6020462 .3688264 .7966118
7 | .7707955 .4519781 .93203
8 | .8820117 .5300384 .9802168
----------------------------------------------
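Conceptually, transform_margins simply applies the supplied expression (here invlogit()) to the point estimate and to each confidence limit of the linear-prediction margin; because invlogit() is monotone, the transformed limits stay ordered. A minimal Python sketch with hypothetical numbers, not the actual xb margins above:

```python
from math import exp

def invlogit(z):
    return 1.0 / (1.0 + exp(-z))

def transform_row(b, ll, ul, f=invlogit):
    """Apply a monotone transformation to a point estimate and its
    confidence limits; monotonicity keeps the limits ordered."""
    return f(b), f(ll), f(ul)

# A hypothetical xb margin with its normal-approximation CI limits.
b, ll, ul = transform_row(-1.2, -2.0, -0.4)
```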
Type the following commands in Stata to install transform_margins:
net from http://www.stata.com/users/jpitblado
net describe transform_margins
net install transform_margins
help transform_margins
Best Answer
Your model is effectively $$E[y \vert x,w]=\hat y =\hat \alpha+\hat \beta \cdot x + \hat \gamma \cdot w.$$ With the eydx() option, margins calculates the average of $$\frac{\partial \hat y}{\partial x}\cdot\frac{1}{\hat y}= \frac{\hat \beta}{\hat y} \approx \frac{\frac{\Delta \hat y}{\hat y}}{\Delta x}$$ in the estimation sample. This means the OLS coefficient is rescaled by the predicted value of the outcome and then averaged. This is a kind of semi-elasticity, and can be interpreted as the percentage/proportionate change in the expected value of $y$ for a one-unit change in $x$.
This is not exactly equivalent to running the logged outcome regression, though it will often yield fairly similar estimates.
margins is a post-estimation command: it relies on previous estimates and performs no estimation of its own. Similarly, eyex() calculates the average of $$\frac{\partial \hat y}{\partial x}\cdot \frac{x}{\hat y}= \hat \beta \cdot \frac{x}{\hat y} \approx \frac{\frac{\Delta \hat y}{\hat y}}{\frac{\Delta x}{x}},$$ which is the percent change in $y$ for a percent change in $x$: the full elasticity.
Here's Stata code showing these claims:
The margins semi-elasticity is a 0.86% decrease in price for an additional mile per gallon, holding weight constant (I find it helpful to multiply $\frac{\Delta \hat y}{\hat y} = 0.0086$ by 100 here). The logged-outcome model's semi-elasticity is a 1% decrease.
The elasticity is a 19.65% reduction in price for a 1% increase in mpg. If you fit the log-log model, the difference between the margins approach and the log-log coefficient will be starker than in the semi-elasticity case.
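The rescaling that eydx() and eyex() perform can be sketched numerically. The slope and fitted values below are hypothetical, not the price-mpg estimates quoted above:

```python
def avg_semi_elasticity(beta, yhats):
    """eydx(): average of beta / yhat over the sample, i.e. the
    proportionate change in E[y] for a one-unit change in x."""
    return sum(beta / y for y in yhats) / len(yhats)

def avg_elasticity(beta, xs, yhats):
    """eyex(): average of beta * x / yhat, i.e. the percent change
    in y for a percent change in x."""
    return sum(beta * x / y for x, y in zip(xs, yhats)) / len(xs)

# Hypothetical OLS slope and fitted values (not the auto-data results).
beta = -50.0
xs = [15.0, 20.0, 25.0]
yhats = [7000.0, 6000.0, 5000.0]
semi = avg_semi_elasticity(beta, yhats)
full = avg_elasticity(beta, xs, yhats)
```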