Hi! I have the following equation: y=c*u+d*u^2
Known variables are y and u for a given time series; c and d are unknown constants. Note that there is no constant (intercept) term in the equation.
I'm fairly sure one solution lies in least squares, but I've more or less given up without assistance. Is there another, less complex way to solve this, perhaps?
Can anyone help me make progress with this problem?
Thanks!
Best Answer
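Since the model y = c*u + d*u^2 is linear in the unknown constants c and d, ordinary least squares applies directly: build a design matrix whose columns are u and u² (no intercept column, matching the equation) and solve for the coefficient vector. A minimal sketch, assuming NumPy and using synthetic data with made-up constants for illustration:

```python
import numpy as np

# Synthetic illustration data: assume true constants c = 2.0, d = -0.5
u = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = 2.0 * u - 0.5 * u**2

# The model y = c*u + d*u^2 is linear in c and d, so stack the
# regressors u and u^2 as columns; no column of ones, since the
# equation has no constant term.
A = np.column_stack([u, u**2])

# Solve the least-squares problem min ||A @ [c, d] - y||^2
(c, d), residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)

print(c, d)  # recovers c ≈ 2.0, d ≈ -0.5 on this noise-free data
```

With real, noisy measurements the same two lines (`column_stack` plus `lstsq`) still give the best-fit c and d in the least-squares sense; there is no simpler closed-form route, since two unknowns generally require solving the 2×2 normal equations anyway, which is exactly what `lstsq` does robustly.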