[Math] Simple Regression ~ House Price Prediction


I am stuck with this question.

"You have a data set consisting of the sales prices of houses in your neighborhood, with each sale time-stamped by the month and year in which the house sold. You want to predict the average value of houses in your neighborhood over time, so you fit a simple regression model with average house price as the output and the time index (in months) as the input. Based on $10$ months of data, the estimated intercept is $\$4569$ and the estimated slope is $143$ [Dollar/month]. If you extrapolate this trend forward in time, at which time index (in months) do you predict that your neighborhood's value will have doubled relative to the value at month $10$? (Round to the nearest month)."

What I tried: Since the house price increases according to the slope, I reasoned that the price should double after $4569/143 \approx 32$ months. But this does not seem to be the right answer.

Any hint for solving this question will be greatly appreciated.

Thanks.

Best Answer

Actually, this is more than a hint. First, write down the linear regression model you are given, $$Y=4569+143X,$$ where $Y$ denotes the price (in \$) and $X$ the time (in months). Now substitute $X=10$, find $Y_{10}$, double it, and find the $X'$ that corresponds to $2\cdot Y_{10}$. If this is not clear, see the (spoiler) below:


For $X=10$ months, your linear regression model returns a price of

$$Y_{10}=4569+143\cdot10=4569+1430=\$5999$$

So, you need to find $X'$ such that

$$2\cdot5999=4569+143X'\implies X'=\frac{11998-4569}{143}=51.95$$

or $52$ months, rounded to the nearest integer.
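
If you want to double-check the arithmetic, here is a minimal Python sketch (the variable names are mine, not from the original problem) that solves for the doubling month from the fitted line:

```python
# Fitted simple regression: price = intercept + slope * month
intercept = 4569.0   # estimated intercept, in dollars
slope = 143.0        # estimated slope, in dollars per month

price_at_10 = intercept + slope * 10            # predicted price at month 10
target = 2 * price_at_10                        # doubled value relative to month 10
doubling_month = (target - intercept) / slope   # solve target = intercept + slope * X'

print(price_at_10)            # 5999.0
print(doubling_month)         # 51.95...
print(round(doubling_month))  # 52
```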
