MATLAB: How to set slope in linear regression

Tags: linear model, linear regression, MATLAB, slope

I want to run a linear regression where I set the slope of the regression line to 1 and the intercept to 0. In other words, I want to see how well a set of points fits along the line y = x, and I am not concerned with finding the line of best fit. I know how to use fitlm to remove the intercept term, but I don't yet know how to set the slope. This is what I have so far:
vec1 = (0:10)+rand(1,11);
vec2 = (0:10)+rand(1,11);
scatter(vec1, vec2)
hold on
plot(0:11, 0:11)
mdl = fitlm(vec1, vec2, 'Intercept', false);
Are there any other arguments I can pass to this function to set the slope, or is it necessary to calculate statistics like R^2 and the p-value manually?

Best Answer

Determining the statistic you want to use to assess how well the fixed line explains the points is not difficult. It is relatively straightforward to calculate the coefficient of determination, R^2 = 1 - SSres/SStot, where the residuals are taken from the line y = x rather than from a fitted line.
vec1 = (0:10)+rand(1,11);
vec2 = (0:10)+rand(1,11);
scatter(vec1, vec2)
hold on
plot(0:10, 0:10)
hold off
SStot = sum((vec2-mean(vec2)).^2);
SSres = sum((vec2-vec1).^2);
Rsq = 1-SSres/SStot;
resd = vec2 - vec1;       % residuals from the fixed line y = x
[~, pval] = ttest(resd);  % second output is the p-value for H0: mean residual = 0
text(min(xlim)+0.2*diff(xlim), min(ylim)+0.7*diff(ylim), sprintf('R^2 = %.3f\n\\itp\\rm = %.3f', Rsq,pval))
A one-sample t-test on the residuals is probably adequate to provide a p-value.
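For readers working outside MATLAB, the same computation can be sketched in Python with NumPy and SciPy (this is just a mirror of the SStot/SSres approach above, not part of the original answer; the function name fit_to_identity is made up for illustration):

```python
import numpy as np
from scipy import stats

def fit_to_identity(x, y):
    """R^2 and t-test p-value for how well points fit the fixed line y = x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    ss_tot = np.sum((y - y.mean()) ** 2)      # total sum of squares
    ss_res = np.sum((y - x) ** 2)             # residuals from y = x, not a fitted line
    rsq = 1.0 - ss_res / ss_tot               # coefficient of determination
    resid = y - x
    _, pval = stats.ttest_1samp(resid, 0.0)   # one-sample t-test: H0 mean residual = 0
    return rsq, pval

# Example mirroring the MATLAB vectors (uniform noise added to 0:10)
rng = np.random.default_rng(0)
x = np.arange(11) + rng.random(11)
y = np.arange(11) + rng.random(11)
rsq, pval = fit_to_identity(x, y)
```

Note that the t-test is degenerate (zero-variance residuals give a NaN p-value) when the points lie exactly on y = x, so it is only informative for noisy data.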