MATLAB: How to get the statistics of the best fitted parameter in quadratic equation

Tags: MATLAB, student t-test

I have fitted a quadratic equation (y = a + b*x + c*x^2) to the given experimental data using the Curve Fitting Tool in MATLAB. Is it possible to get statistics (mean, standard deviation, variance, covariance, and Student's t-test value) for the fitted parameters?
These are the data:
x=[0,40,80,100,120,150,170,200],
y=[1865,2855,3608,4057,4343,4389,4415,4478]

Best Answer

You have the curve fitting toolbox, and wish to obtain statistics about the ESTIMATED parameters. (I think KSSV has missed the point of your question.)
x=[0,40,80,100,120,150,170,200],
y=[1865,2855,3608,4057,4343,4389,4415,4478]
mdl = fit(x',y','poly2')
mdl =
Linear model Poly2:
mdl(x) = p1*x^2 + p2*x + p3
Coefficients (with 95% confidence bounds):

p1 = -0.08513 (-0.1061, -0.06418)
p2 = 30.08 (25.69, 34.47)
p3 = 1835 (1631, 2039)
So fit tells you confidence intervals around each parameter, but it does not give an explicit standard deviation or variance. That may have been by choice, since one typically uses the standard deviation of a parameter to build a confidence interval, or even a t-test. In turn, one typically uses the t-test to decide whether a parameter is distinguishable from zero and might be dropped from the model. So if the confidence interval already tells you whether it includes zero, why do you need the rest of that crap? ;-) Essentially, the target audience of the CFT is not the person who wants complete statistics on estimated parameters, so it looks like they did not provide that type of information. Anyway, given that the CFT includes many model classes, not all of those statistics are even appropriate for every model (a spline, for example).
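That said, you can work backwards from what fit does give you. Here is a sketch (assuming the Statistics and Machine Learning Toolbox for tinv): since a two-sided 95% interval is coefficient ± tinv(0.975,dfe)*se, the half-width of each confidence bound recovers the standard error.

```matlab
% Sketch: recover standard errors (and variances) of the parameters
% from the 95% confidence bounds that fit/confint report.
ci  = confint(mdl, 0.95);   % 2-by-3 array: row 1 lower bounds, row 2 upper bounds
dfe = 5;                    % residual degrees of freedom: 8 points - 3 parameters
se  = diff(ci) / (2*tinv(0.975, dfe));  % standard error of each parameter
paramVar = se.^2;           % parameter variances
```

The se values here should agree with the ParameterStd field from polyfitn shown below, up to roundoff.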
A quick perusal of what fit returns does not show what you are looking for, either among the outputs from fit or among the methods that fit provides. But if I look more deeply, then I find:
[mdl,u,v] = fit(x',y','poly2')
mdl =
Linear model Poly2:
mdl(x) = p1*x^2 + p2*x + p3
Coefficients (with 95% confidence bounds):
p1 = -0.08513 (-0.1061, -0.06418)
p2 = 30.08 (25.69, 34.47)
p3 = 1835 (1631, 2039)
u =
struct with fields:
sse: 38889
rsquare: 0.99373
dfe: 5
adjrsquare: 0.99122
rmse: 88.192
v =
struct with fields:
numobs: 8
numparam: 3
residuals: [8×1 double]
Jacobian: [8×3 double]
exitflag: 1
algorithm: 'QR factorization and solve'
iterations: 1
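That third output is actually enough to reconstruct the full parameter covariance matrix yourself. A sketch, using the Jacobian from v and the residual statistics from u (standard linear least squares theory: cov = mse * inv(J'*J)):

```matlab
% Sketch: full parameter covariance matrix from the fit outputs above.
J   = v.Jacobian;           % 8-by-3 Jacobian of the model at the solution
mse = u.sse / u.dfe;        % residual mean square error
C   = mse * ((J'*J) \ eye(size(J,2)));  % 3-by-3 parameter covariance matrix
se  = sqrt(diag(C))'        % parameter standard errors, for comparison
```

Forming inv(J'*J) directly is the textbook route; as noted below, going through a QR decomposition of the design matrix is numerically safer when the problem is poorly scaled.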
So, while it looks like some of that would come in handy to get what you asked for, there are other tools that would serve your goals better. To be honest, it really would be easier to use another tool for this purpose. For example, my polyfitn (found on the File Exchange) gives you much of that. Or, you could use regress from the Statistics Toolbox. For example, I might try this using my own polyfitn:
P = polyfitn(x',y',2)
P =
struct with fields:
ModelTerms: [3×1 double]
Coefficients: [-0.085129 30.081 1834.9]
ParameterVar: [6.6403e-05 2.9157 6286.9]
ParameterStd: [0.0081488 1.7075 79.29]
DoF: 5
p: [0.00013857 1.081e-05 2.8031e-06]
R2: 0.99373
AdjustedR2: 0.99122
RMSE: 69.722
VarNames: {'X1'}
That tells you most of what was requested, though not a complete covariance matrix of the parameters, or a t-test value itself. Instead, it looks like I used that t-test to create the field P.p there, which as I recall came from an inverse t distribution. In fact, I notice that polyfitn does not even return sufficient information to reconstruct a complete parameter covariance matrix.
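The t-test values themselves are easy to back out from the polyfitn result. A sketch (assuming tcdf from the Statistics and Machine Learning Toolbox): the t statistic for each coefficient is the estimate divided by its standard error, and the two-sided p-value follows from the t distribution with DoF degrees of freedom.

```matlab
% Sketch: t statistics and two-sided p-values for the null hypothesis
% that each coefficient is zero, computed from the polyfitn output P.
tstat = P.Coefficients ./ P.ParameterStd;  % t statistic per coefficient
pval  = 2 * (1 - tcdf(abs(tstat), P.DoF)); % should reproduce the field P.p
```

A large |tstat| (small pval) says that coefficient is clearly non-zero and belongs in the model.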
So, I can help you to find what you want. However, if you insist on starting from the output of the Curve Fitting Toolbox, we would need to redo essentially the entire fit anyway. That is, in order to get the complete covariance matrix of the parameters, you arguably want the results from a QR decomposition. (A Cholesky factorization would suffice in theory too, but QR is the more stable choice.) But if you are willing to skip the covariance matrix, then the output from polyfitn gives you almost everything you wanted, and I could show you how to get the t-test value for each parameter.
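If you do want to go that from-scratch route, here is a sketch of what it looks like for this quadratic: build the design matrix, solve by QR, then form the covariance from R.

```matlab
% Sketch: quadratic least squares via QR, with full parameter covariance.
x = [0,40,80,100,120,150,170,200]';
y = [1865,2855,3608,4057,4343,4389,4415,4478]';
A = [x.^2, x, ones(size(x))];   % design matrix; columns match [p1 p2 p3] from fit
[Q,R] = qr(A, 0);               % economy-size QR, for numerical stability
coef  = R \ (Q'*y);             % least squares estimates of [p1; p2; p3]
res   = y - A*coef;             % residuals
dfe   = numel(y) - numel(coef); % residual degrees of freedom (here 5)
mse   = res'*res / dfe;         % residual mean square error
Rinv  = R \ eye(size(R));
C     = mse * (Rinv*Rinv');     % full 3-by-3 parameter covariance matrix
```

The diagonal of C gives the parameter variances; the off-diagonal terms are the covariances that neither fit nor polyfitn hand you directly.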
So what are you really looking for? And what tools are you willing to use? If you insist on starting from the results of fit, AND you need a covariance matrix of the parameters, then we need to rebuild a fair amount, starting almost from scratch. And, yes, some of the time, people really do want that complete covariance matrix, though with only 8 data points, those covariances might not be worth a whole lot. Worse yet, there are lots of issues around covariances, because then we would arguably want to be thinking about lack of fit, and whether a quadratic is truly the correct model. Parameter variances have some issues. Sigh. In order to help you, I need to know more.