1. Using "nlinfit"
The "nlinfit" function expects a response vector "Y" and a function of unknown parameters. Simply encapsulate the implicit model in a function of the form:
y = model(x,y,beta)
The response vector to be passed to "nlinfit" becomes the zero vector
Y = zeros(size(y));
The new predictor array now contains both the old predictor variables and the old responses, packed together in one array:
X = [x y];
and the new model is specified by the anonymous function
modelfun = @(beta,X) ( X(:,2) - model(X(:,1),X(:,2),beta) );
where "model" is the implicit model: it takes the inputs "x" and "y", returns "y", and is parameterized by the (unknown) parameter vector "beta".
Finally, use "nlinfit" with an initial guess "beta0" as follows:
beta = nlinfit(X,Y,modelfun,beta0)
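As a concrete sketch of the whole workflow, here is a hypothetical implicit model of the form y = b1*exp(-b2*x*y); the model, the synthetic data, and the starting point are all invented purely for illustration:

```matlab
% Hypothetical implicit model in the form y = model(x,y,beta);
% here model(x,y,beta) = beta(1)*exp(-beta(2)*x*y).
model = @(x,y,beta) beta(1) .* exp(-beta(2) .* x .* y);

% Synthetic data satisfying the implicit relation, generated by a
% damped fixed-point iteration (in practice, x and y are measurements).
betaTrue = [2; 0.5];
x = linspace(0, 3, 50)';
y = ones(size(x));
for k = 1:100
    y = 0.5 * (y + model(x, y, betaTrue));
end

% Pack predictors and responses together, use a zero response vector,
% and drive the residual y - model(x,y,beta) to zero with nlinfit.
X = [x y];
Y = zeros(size(y));
modelfun = @(beta,X) ( X(:,2) - model(X(:,1),X(:,2),beta) );

beta0 = [1; 1];                          % initial guess
beta = nlinfit(X, Y, modelfun, beta0)    % should approach betaTrue
```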
2. Using "lsqnonlin"
Alternatively, you could achieve the same results with the "lsqnonlin" function of the Optimization Toolbox.
The "lsqnonlin" function solves nonlinear least-squares problems of the form
min_x { f_1(x)^2 + f_2(x)^2 + ... + f_n(x)^2 }
for an unknown vector "x".
In the case of fitting an implicit model, the vector "x" would be the vector of unknown parameters "beta" of the implicit model and f_i(x) would be:
f_i(beta) = y_i - model(x_i,y_i,beta)
where "X=[x_i]" are the predictors and "Y=[y_i]" are the responses used to fit the model.
However, "lsqnonlin" takes as its input argument a handle to a function of a single argument (the parameters). Extra data (in this case, the predictors and responses) must therefore be passed in through an anonymous function. Assume your implicit model is defined in a function file called "model". Then "modelfun" is a handle to an anonymous function defined as:
modelfun = @(beta) ( Y - model(X,Y,beta) );
where "Y" is the vector of responses, "X" is the vector of predictors, and "beta" is the vector of unknown parameters.
With an initial guess "beta0", call "lsqnonlin" as follows:
beta = lsqnonlin(modelfun,beta0)
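For comparison, here is the same kind of fit with "lsqnonlin", again using an invented implicit model and made-up sample data purely to make the call concrete:

```matlab
% Hypothetical implicit model y = model(x,y,beta), as above.
model = @(x,y,beta) beta(1) .* exp(-beta(2) .* x .* y);

% Measured predictors X and responses Y (made-up sample values).
X = [0.5; 1.0; 1.5; 2.0];
Y = [1.6; 1.3; 1.1; 0.9];

% Residual as a function of the parameters only; the data X and Y
% are captured by the anonymous function.
modelfun = @(beta) ( Y - model(X, Y, beta) );

beta0 = [1; 1];                    % initial guess
beta = lsqnonlin(modelfun, beta0)
```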