Seemingly Unrelated Regression (SUR) for logistic regression

Tags: hypothesis-testing, logistic, statistical-significance

I would like to perform a linear hypothesis test of whether $\beta_{11}=\beta_{21}$ in the logistic regressions
\begin{align*}
\text{logit}[P(Y_1=1|X)]=(\beta_{10}+\beta_{11}X)\\
\text{logit}[P(Y_2=1|X)]=(\beta_{20}+\beta_{21}X)\\
\end{align*}
The models are not nested within each other since the outcomes $Y_1$ and $Y_2$ are not the same (but the covariate $X$ is the same). I looked into SUR but realized that logistic regressions cannot be estimated via OLS. Is there a way to obtain the variance-covariance matrix for the four model parameters above, or to perform the said hypothesis test some other way? (I understand there are multinomial regression methods, but I would like to keep the above model structure.)
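For concreteness, here is a minimal sketch (in Python with statsmodels, on simulated data; the coefficients and sample size are purely illustrative) of what I can do now: fit the two models separately and form a Wald statistic for $\beta_{11}-\beta_{21}$, but only by assuming the two estimates are independent, because separate fits give me no estimate of their covariance.

```python
# Minimal sketch on simulated data. The true coefficients, the sample size,
# and the use of statsmodels are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y1 = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 0.8 * x))))
y2 = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.5 * x))))
df = pd.DataFrame({"x": x, "y1": y1, "y2": y2})

# Two separate logistic regressions, one per outcome
fit1 = smf.logit("y1 ~ x", data=df).fit(disp=0)
fit2 = smf.logit("y2 ~ x", data=df).fit(disp=0)

b11, b21 = fit1.params["x"], fit2.params["x"]
se11, se21 = fit1.bse["x"], fit2.bse["x"]

# Wald test of H0: beta_11 = beta_21, *assuming* Cov(b11, b21) = 0 --
# exactly the cross-model covariance that separate fits cannot provide.
z = (b11 - b21) / np.sqrt(se11**2 + se21**2)
p = 2 * stats.norm.sf(abs(z))
print(f"z = {z:.3f}, p = {p:.3f}")
```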

Best Answer

It seems to me that the literature around this baffling process--Seemingly Unrelated Regression (SUR) models--concerns ordinary least squares and not logistic regression. The only reason SUR is of any interest is that the error terms from several regression models (using the same exposure variable(s)) are correlated, and estimating the equations jointly exploits that correlation.
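For reference, the classical two-equation SUR setup is
\begin{align*}
y_{1i} &= \mathbf{x}_{1i}^\top\boldsymbol\beta_1 + \varepsilon_{1i}\\
y_{2i} &= \mathbf{x}_{2i}^\top\boldsymbol\beta_2 + \varepsilon_{2i}
\end{align*}
with $\operatorname{Cov}(\varepsilon_{1i},\varepsilon_{2i}) = \sigma_{12}$ allowed to be nonzero. It is this cross-equation error covariance, exploited by (feasible) generalized least squares, that makes joint estimation more efficient than fitting each equation separately by OLS.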

Logistic regression has no error term at all, because of the mean-variance relationship built into the GLM. So these models (as you've written them) are more than just seemingly unrelated. They're completely unrelated.
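To make that concrete: in a Bernoulli GLM the conditional variance is fully determined by the mean,
\begin{align*}
\operatorname{Var}(Y_j \mid X) = p_j(X)\,\{1 - p_j(X)\}, \qquad p_j(X) = \frac{1}{1 + e^{-(\beta_{j0} + \beta_{j1}X)}}, \quad j = 1, 2,
\end{align*}
so there is no free error variance--and in particular no cross-equation error covariance--left over to estimate jointly.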

Multinomial regression models are not necessarily appropriate for analyzing multiple binary variables. In a multinomial model, each observation experiences exactly one of the outcome categories, so each row of the $Y$ indicator matrix contains a single 1. If the outcomes have the structure of, say, an educational attainment variable--college grad / HS grad / no HS diploma--then the categories are mutually exclusive (and hence dependent), and a multinomial model makes sense. However, if $Y_1$ is Democratic voting preference and $Y_2$ is marital status (married vs. not married), then you might observe $Y_1 = Y_2 = 0$, $Y_1 = Y_2 = 1$, or any other combination, so the outcomes are not mutually exclusive.

So, at face value, I can see no reason why you would want to join these models; you should probably describe your data and what you're trying to do.

If you are interested in regression models that can borrow specific information across the two outcomes in this case--not just error terms, but the odds ratios themselves--you should consider log-linear models as an alternative.
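To sketch what that could look like (purely illustrative: I'm assuming a binary $X$ so the data collapse to a $2 \times 2 \times 2$ table, and simulating the counts with statsmodels in Python), a log-linear model fits the joint cell counts of $(Y_1, Y_2, X)$ with a Poisson GLM. In the hierarchical model without the three-way term, the $Y_1{:}X$ and $Y_2{:}X$ coefficients are the conditional log odds ratios playing the role of $\beta_{11}$ and $\beta_{21}$, while the $Y_1{:}Y_2$ term absorbs the association between the two outcomes that separate logistic fits ignore.

```python
# Illustrative log-linear sketch; the binary X, the simulated counts, and the
# use of statsmodels are assumptions, not part of the original question.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
x = rng.binomial(1, 0.5, size=n)
y1 = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 0.8 * x))))
y2 = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.5 * x))))

# Collapse to a 2 x 2 x 2 contingency table of cell counts
cells = (
    pd.DataFrame({"y1": y1, "y2": y2, "x": x})
    .value_counts()
    .rename("count")
    .reset_index()
)

# Hierarchical log-linear (Poisson) model without the three-way term:
# y1:x and y2:x are the conditional log odds ratios of each outcome with x,
# and y1:y2 captures the association between the two outcomes.
loglin = smf.glm(
    "count ~ y1 + y2 + x + y1:y2 + y1:x + y2:x",
    data=cells,
    family=sm.families.Poisson(),
).fit()
print(loglin.summary())
```

Because the two odds-ratio parameters now sit in a single fitted model, their joint variance-covariance matrix is available, and a Wald-type comparison of the $Y_1{:}X$ and $Y_2{:}X$ coefficients is at least well defined, unlike with two separate logistic fits.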
