Solved – Curve fitting vs function approximation

Tags: curve-fitting, machine-learning, neural-networks, terminology

According to Wikipedia, curve fitting is defined as:

the process of constructing a curve, or mathematical function, that
has the best fit to a series of data points

And function approximation is defined as:

select a function among a well-defined class that closely matches
("approximates") a target function in a task-specific way

Specifically, I am asking in the context of deep neural networks, which are often described as function approximators. But according to the above definitions, I do not see the difference between describing a network as a curve fitter and as a function approximator. Both attempt to learn a function that matches the observed data as closely as possible. What is the key difference?

Best Answer

They are very similar, and different people in different communities may use different definitions. The following answer is based on my understanding.

  • In the numerical-analysis framework, "curve fitting" is often used to describe interpolation, where the ultimate goal is to minimize the "training loss", i.e., the loss over all seen data points. There is no notion of "over-fitting": if the model passes through every data point exactly, the model is perfect.

  • On the other hand, "function approximation" is used more in the machine-learning community. As in any other learning problem, there are samples from the function (training data) and a "ground truth function" or held-out data for validation, so we need to consider over-fitting: perfectly predicting the seen data may not be good enough.
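The contrast in the two bullets above can be sketched with polynomial fitting. This is a hypothetical illustration, assuming noisy samples of sin(x): a degree-9 polynomial through 10 points takes the interpolation view (drive the training loss to essentially zero), while a held-out validation set takes the function-approximation view (judge the fit on unseen points, where the interpolant may over-fit the noise).

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of the "target function" sin(x): the training data.
x_train = np.linspace(0, 3, 10)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(10)

# Held-out points from the same function: the validation data.
x_val = np.linspace(0.1, 2.9, 50)
y_val = np.sin(x_val)

results = {}
for degree in (3, 9):
    # Least-squares polynomial fit; degree 9 on 10 points interpolates.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    val_mse = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
    results[degree] = (train_mse, val_mse)
    print(f"degree {degree}: train MSE {train_mse:.2e}, val MSE {val_mse:.2e}")
```

Under the curve-fitting view the degree-9 model "wins" (lower training loss); under the function-approximation view its validation loss exceeds its near-zero training loss, which is exactly the over-fitting concern the second bullet raises.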
