Solved – Panel data model with two-way fixed effects and individual-specific slopes

fixed-effects-model, panel data, references

I have the following panel-data model:

$$ y_{it} = \alpha_i + \lambda_t + \beta_i X_{it} + \varepsilon_{it}. $$

It contains an individual-specific intercept $\alpha_i$, a time-specific intercept $\lambda_t$ and an individual-specific slope $\beta_i$ (a vector). $X_{it}$ is a vector of exogenous variables.

If I have the panel data terminology right, this would be a fairly standard two-way fixed effects model were it not for the individual-specific slopes.

Questions:

  1. Does this model have a name? If so, what is it called?
  2. Where can I read more about this model and its estimation?
  3. What is a good estimator for the above model if
    • $y_{i,\cdot}$ is integrated of order 1 (I(1)),
    • $X_{i,\cdot}$ are I(1),
    • $y_{i,\cdot}$ are cointegrated across individuals (i.e. across $i$), but
    • there is no cointegration between $y_{i,\cdot}$ and $X_{i,\cdot}$?
  4. Is the model implemented in R? If not, is it implemented in some other software?

I have found something like this model in Stata's panel-data manual, in the entry for the xtxdpd command (see the bottom of page 15), but I did not like that source much.

Edit:

The model in levels does not look appropriate if $y_{i,\cdot}$ is not cointegrated with $X_{i,\cdot}$, because then the regressors diverge from the regressand. So a model in first differences would make more sense.

Best Answer

Here is one way of estimating $\lambda_t$ and $\beta_i$.

Take the original equation (but consider only a single regressor $x_{it}$ in place of the vector $X_{it}$; that will save some space and typesetting later on)

$$ y_{it} = \alpha_i + \lambda_t + \beta_i x_{it} + \varepsilon_{it} $$

and difference it with respect to time to obtain

$$ \Delta y_{it} = \Delta \lambda_t + \beta_i \Delta x_{it} + \Delta \varepsilon_{it}. $$

If $y_{i,\cdot}$ and $x_{i,\cdot}$ are integrated but not cointegrated, we get a relatively nice representation in terms of their stationary transformations.
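Since question 4 asks about R, here is a minimal sketch of the differencing step, assuming a balanced panel stored in a long-format data frame `pd` with columns `id`, `time`, `y` and `x` (all of these names are hypothetical, not from the original post):

```r
# Sketch of the first-differencing step, under the assumptions stated above.
pd <- pd[order(pd$id, pd$time), ]                            # sort within individuals
pd$dy <- ave(pd$y, pd$id, FUN = function(v) c(NA, diff(v)))  # Delta y_it
pd$dx <- ave(pd$x, pd$id, FUN = function(v) c(NA, diff(v)))  # Delta x_it
pd <- pd[!is.na(pd$dy), ]   # drop each individual's first period, whose difference is undefined
```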

Construct a set of dummies corresponding to $\Delta \lambda_t$ and stack the equations to get

$$ \begin{pmatrix} \Delta y_{11} \\ \Delta y_{12} \\ \vdots \\ \Delta y_{1T} \\ \Delta y_{21} \\ \Delta y_{22} \\ \vdots \\ \Delta y_{2T} \\ \vdots \\ \Delta y_{m1} \\ \Delta y_{m2} \\ \vdots \\ \Delta y_{mT} \end{pmatrix} = \begin{pmatrix} 1 & 0 & \dotsb & 0 & \Delta x_{11} & 0 & \dotsb & 0 \\ 0 & 1 & \dotsb & 0 & \Delta x_{12} & 0 & \dotsb & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dotsb & 1 & \Delta x_{1T} & 0 & \dotsb & 0 \\ 1 & 0 & \dotsb & 0 & 0 & \Delta x_{21} & \dotsb & 0 \\ 0 & 1 & \dotsb & 0 & 0 & \Delta x_{22} & \dotsb & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dotsb & 1 & 0 & \Delta x_{2T} & \dotsb & 0 \\ \vdots \\ 1 & 0 & \dotsb & 0 & 0 & 0 & \dotsb & \Delta x_{m1} \\ 0 & 1 & \dotsb & 0 & 0 & 0 & \dotsb & \Delta x_{m2} \\ \vdots & \vdots & \ddots & \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dotsb & 1 & 0 & 0 & \dotsb & \Delta x_{mT} \\ \end{pmatrix} \times \begin{pmatrix} \Delta\lambda_1 \\ \Delta\lambda_2 \\ \vdots \\ \Delta\lambda_T \\ \beta_1 \\ \beta_2 \\ \vdots \\ \beta_m \\ \end{pmatrix} + \begin{pmatrix} \Delta\varepsilon_{11} \\ \Delta\varepsilon_{12} \\ \vdots \\ \Delta\varepsilon_{1T} \\ \Delta\varepsilon_{21} \\ \Delta\varepsilon_{22} \\ \vdots \\ \Delta\varepsilon_{2T} \\ \vdots \\ \Delta\varepsilon_{m1} \\ \Delta\varepsilon_{m2} \\ \vdots \\ \Delta\varepsilon_{mT} \end{pmatrix} $$

(for notational simplicity, I assumed that observations at time $t=0$ are available, so that the first differences are defined for $t=1,\dots,T$).

This has the form of an ordinary linear regression, so the coefficient vector can be estimated by OLS in a straightforward way. I have not given much thought to how good such an estimator is, though.
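To make the estimation concrete, here is a hedged R sketch of the stacked regression, continuing with the hypothetical data frame `pd` from above: the time dummies pick up the $\Delta\lambda_t$, and the `factor(id):dx` interaction gives one slope $\beta_i$ per individual.

```r
# OLS on the stacked, first-differenced equation (a sketch, not the poster's exact code).
# "0 +" removes the global intercept, so factor(time) contributes exactly T dummy
# columns (one per Delta lambda_t) and factor(id):dx contributes m columns
# (one Delta x column per individual, i.e. one beta_i each).
fit <- lm(dy ~ 0 + factor(time) + factor(id):dx, data = pd)

coef(fit)                 # first T coefficients: Delta lambda_t; last m: beta_i
head(model.matrix(fit))   # the design matrix has the block structure displayed above
```

Note that differencing removes $\alpha_i$ entirely and identifies $\lambda_t$ only through its changes $\Delta\lambda_t$, i.e. up to an additive constant.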
