Some options in R would be:
t=seq(0,10,0.01)
y=sin(t)+rnorm(length(t))
plot(t,y,cex=0.1)
# puts a loess smoother through the points
lines(loess.smooth(t,y),col=2)
library(mgcv)
# puts a spline through the points - see Simon Wood's work on mgcv to learn more
g<-gam(y~s(t),data=data.frame(t=t,y=y))
lines(t,g$fitted.values,col=3)
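Roughly the same smoothing demo can be sketched in Python; note this is my own analog (a Savitzky-Golay filter as a crude stand-in for `loess.smooth`, not an actual loess fit):

```python
# Hedged Python analog of the R smoothing demo above: noisy sin(t)
# samples smoothed with a Savitzky-Golay filter (an assumption,
# standing in for R's loess.smooth).
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.arange(0, 10.01, 0.01)
y = np.sin(t) + rng.standard_normal(t.size)

# local quadratic fits over a ~2-unit window, like a fixed-span smoother
smoothed = savgol_filter(y, window_length=201, polyorder=2)
```

The smoothed curve can then be overlaid on a scatter of `(t, y)` exactly as the R `lines()` call does.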
And for your recently added data:
t=1:400
y=rep(NA,length(t))
y[10]<-30
y[111]<-100
y[171]<-128
y[181]<-86
y[201]<-42
y[211]<-44
y[281]<-39
y[321]<-59
y[341]<-20
y[351]<-4
library(mgcv)
g<-gam(y~s(t),data=data.frame(t=t,y=y))
pred<-predict.gam(g,data.frame(t=t,y=y))  # avoid naming this 'lm', which masks the lm() function
plot(t,y,cex=0.5,pch=16,ylim=c(min(pred),max(pred)))
lines(t,pred,col='red')
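For readers working outside R, roughly the same sparse-data smooth can be sketched in Python; this is an assumption on my part (scipy's `UnivariateSpline` in place of mgcv's penalized spline), not the answerer's method:

```python
# Hedged Python analog of the sparse-data gam fit above: a smoothing
# spline through the ten observed (t, y) points, evaluated on a
# dense grid (scipy's UnivariateSpline standing in for mgcv).
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.array([10, 111, 171, 181, 201, 211, 281, 321, 341, 351], dtype=float)
y = np.array([30, 100, 128, 86, 42, 44, 39, 59, 20, 4], dtype=float)

# k=3 gives a cubic spline; s bounds the sum of squared residuals,
# trading smoothness against fidelity (chosen by eye here).
spl = UnivariateSpline(t, y, k=3, s=len(t) * 200.0)
grid = np.arange(1, 401, dtype=float)
fit = spl(grid)
```

Note that, like the gam fit, the spline extrapolates beyond the first and last observed points, so the ends of the curve should not be over-interpreted.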
What follows is a Mathematica script that, I hope, expresses the needed formulation $z = f(x,y)$ with sufficient accuracy:
data1 = {{5, -48.72}, {10, -65.25}, {15, -77.55}, {20, -90.01}, {25,-100.7}};
data2 = {{5, 1.007}, {10, 1.001}, {15, 0.9209}, {20, 0.8385}, {25, 0.7528}};
data3 = {{5, -0.000163}, {10, -0.000287}, {15, -0.000163}, {20, 0.000043}, {25, 0.000226}};
fc1 = c10 + c11 w + c12 w^2 + c13 w^3 + c14 w^4;
fc2 = c20 + c21 w + c22 w^2 + c23 w^3 + c24 w^4;
fc3 = c30 + c31 w + c32 w^2 + c33 w^3 + c34 w^4;
c0 = NonlinearModelFit[data1, fc1, {c10, c11, c12, c13, c14}, w];
c1 = NonlinearModelFit[data2, fc2, {c20, c21, c22, c23, c24}, w];
c2 = NonlinearModelFit[data3, fc3, {c30, c31, c32, c33, c34}, w];
f[Temp_, w_] := -(c0[w] + c1[w] Temp + c2[w] Temp^2)
ContourPlot[f[Temp, w], {w, 5, 25}, {Temp, 0, 100}, Contours -> 15, ContourShading -> None]
Plot[{f[Temp, 15], f[Temp, 17], f[Temp, 20]}, {Temp, 0, 100}]
Added a Python script to define $z = f(x,y)$:
from scipy.optimize import curve_fit
def coefs(w, c0, c1, c2, c3, c4):
    return c0 + c1*w + c2*w**2 + c3*w**3 + c4*w**4
x_values = [5., 10., 15., 20., 25.]
c0_values = [48.72, 65.25, 77.55, 90.01, 100.7]
c1_values = [-1.007, -1.001, -0.9209, -0.8385, -0.7528]
c2_values = [0.000163, 0.000287, 0.000163, -0.000043, -0.000226]
coefsc0, _ = curve_fit(coefs, x_values, c0_values)
coefsc1, _ = curve_fit(coefs, x_values, c1_values)
coefsc2, _ = curve_fit(coefs, x_values, c2_values)
def C(p, x):
    val = 0
    for i, pp in enumerate(p):
        val += pp * x**i
    return val

def f(t, w):
    return C(coefsc0, w) + (C(coefsc1, w) + C(coefsc2, w)*t)*t
print(f(0,15))
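Because each quartic has five free coefficients and is fitted to exactly five points, `curve_fit` returns the interpolating polynomial, so the fitted curves must reproduce the tabulated values at the sample abscissae. A self-contained sanity check of that property (repeating the relevant definitions from the script above):

```python
# Self-contained sanity check: a degree-4 polynomial fitted to 5 points
# interpolates them, so C(coefsc0, w) must reproduce c0_values at
# w = 5, 10, ..., 25 (up to numerical conditioning of the fit).
from scipy.optimize import curve_fit

def coefs(w, c0, c1, c2, c3, c4):
    return c0 + c1*w + c2*w**2 + c3*w**3 + c4*w**4

x_values = [5., 10., 15., 20., 25.]
c0_values = [48.72, 65.25, 77.55, 90.01, 100.7]
coefsc0, _ = curve_fit(coefs, x_values, c0_values)

def C(p, x):
    return sum(pp * x**i for i, pp in enumerate(p))

for w, c0v in zip(x_values, c0_values):
    assert abs(C(coefsc0, w) - c0v) < 1e-2
```

In particular `f(0, 15)` should come out very close to the tabulated `c0` value at `w = 15`, i.e. about 77.55.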
Best Answer
Taking a look at your data, I guessed that a function of the form $f(x)=a/x^p+b$ might work. If we then choose the parameters $a$, $b$, and $p$ to minimize $$\sum_{(x,y)\in \text{data}} (f(x)-y)^2,$$ we find $f(x)=\frac{2106.91}{x^{2.11436}} + 0.463718$. The sum of squared errors is $0.000480789$, which seems pretty good. The graph of $f$ together with the data looks like
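The same least-squares fit can be sketched with scipy; since the original data points are not reproduced above, the points below are synthetic stand-ins generated from the quoted fit, used only to show the mechanics:

```python
# Hedged sketch of fitting f(x) = a/x^p + b by least squares.
# The data are synthetic stand-ins generated from the quoted fit
# (a=2106.91, b=0.463718, p=2.11436), not the asker's original points.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, p):
    return a / x**p + b

a0, b0, p0 = 2106.91, 0.463718, 2.11436
x = np.array([10., 20., 40., 80., 160.])
y = model(x, a0, b0, p0)            # noiseless synthetic "data"

# curve_fit minimizes the sum of squared residuals over (a, b, p)
(a, b, p), _ = curve_fit(model, x, y, p0=[2000., 0.5, 2.1])
```

On real, noisy data the recovered parameters would of course differ, and a reasonable starting guess for `p0` matters for convergence of the nonlinear solver.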