Solved – Linear Regression with measurement error

linear, measurement-error, regression

I'm searching for the analytical expression for the slope and its associated error in a simple regression problem. However, the independent variables $x_i$ have been obtained with a known measurement error $\sigma_i$.

Here I do not mean the inherent variability of the data, but the error in measuring it. For example, take three points $(x, y)$: $(1,1), (2,2), (3,3)$. If you calculate the slope and its associated error, that error would be equal to zero, since the points lie on a perfect line. However, if each point had its own measurement error, then intuitively there would still be some uncertainty in the estimate of the slope.
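To make the intuition concrete, here is a minimal sketch (the $\sigma_i = 0.1$ values are purely illustrative, not part of my data): the residual-based standard error of the ordinary least-squares slope is exactly zero for these three collinear points, yet propagating the assumed measurement error on each $x_i$ by Monte Carlo still gives a nonzero spread for the slope.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])
sigma_x = np.array([0.1, 0.1, 0.1])  # assumed (illustrative) measurement errors on x

# Ordinary least-squares slope and its residual-based standard error
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
dof = len(x) - 2
s2 = resid @ resid / dof                       # residual variance (0 here)
se_slope = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))
print(slope, se_slope)                         # 1.0, 0.0

# Propagate the measurement error on x by refitting with perturbed x values
rng = np.random.default_rng(0)
slopes = [np.polyfit(x + rng.normal(0, sigma_x), y, 1)[0] for _ in range(10_000)]
print(np.std(slopes))                          # nonzero spread of the slope
```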

This question might be a duplicate of this

Best Answer

Given that you clarified that you have errors (with known variance) in your independent variables that you want to account for, I think what you're looking for is a Deming Regression (for a simple linear regression) or Total Least Squares (for multiple linear regression). Both model the degree of error in your independent variables (in addition to the error in your dependent variables), with the former being a special case of the latter.
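As a rough illustration, here is a minimal sketch of Deming regression using the standard closed-form slope. It assumes the ratio $\delta = \sigma_\varepsilon^2 / \sigma_\eta^2$ of the error variance in $y$ to the error variance in $x$ is known ($\delta = 1$ gives orthogonal regression); the data and parameter values below are made up for the example, not taken from your question.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Closed-form Deming regression slope and intercept.

    delta is the (known) ratio var(error in y) / var(error in x).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    term = syy - delta * sxx
    slope = (term + np.sqrt(term ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Illustrative data: true line y = 2x + 1, with noise on both x and y
rng = np.random.default_rng(1)
x_true = np.linspace(0, 10, 50)
y_obs = 2.0 * x_true + 1.0 + rng.normal(0, 0.5, x_true.size)
x_obs = x_true + rng.normal(0, 0.5, x_true.size)   # measurement error on x

print(deming_fit(x_obs, y_obs, delta=1.0))
```

For an uncertainty on the Deming slope you could, for instance, bootstrap the $(x_i, y_i)$ pairs and take the standard deviation of the refitted slopes, since the estimator above has no simple residual-based standard error like OLS.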