How to optimize an experimentally derived objective function

approximation, interpolation, MATLAB, optimization

I have to optimize the result of a process that depends on a large number of variables, e.g. a laser engraving system where the engraving depth depends on the laser speed, distance, power, and so on.

The objective function of such a system will be a nonlinear scalar function of the form $f(x_1, x_2, x_3, \dots, x_n)$.
This function is unknown.

My goal is to find the set of variable values that maximizes this objective function.
The variables $(x_1, x_2, x_3, \dots, x_n)$ will also have to be constrained within specific bounds.

Before tackling the optimization problem, I have to find some expression for the objective function. Regarding this, I have some questions:

  1. What is the best way to interpolate an n-variable scalar function (see the sketch after this list)?
  2. Since I will be performing the measurements myself, how should I choose which points to measure in order to get a reasonable result?
  3. Can I 'scan' a single $x_i$, keeping the others constant, measure the resulting single-variable objective function, and then build the n-variable objective function from these 'projections'?
  4. Is there a MATLAB or Python toolbox or function that can help me achieve what I described above?
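For questions 1, 2, and 4, here is a minimal Python sketch of one common approach, assuming SciPy ≥ 1.7; `measure_depth` and the bounds are hypothetical stand-ins for the actual experiment:

```python
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator

# Hypothetical stand-in for the real experiment (e.g. engraving depth).
def measure_depth(x):
    speed, power, distance = x
    return power / speed - 0.01 * distance**2

# Choose measurement points with Latin hypercube sampling, which
# spreads a fixed experimental budget evenly over the whole domain.
bounds_lo = [10.0, 1.0, 0.5]    # lower bounds for (speed, power, distance)
bounds_hi = [100.0, 20.0, 5.0]  # upper bounds
sampler = qmc.LatinHypercube(d=3, seed=0)
points = qmc.scale(sampler.random(n=40), bounds_lo, bounds_hi)

# Run the (simulated) experiment at each sampled point.
values = np.array([measure_depth(p) for p in points])

# Interpolate the scattered n-variable data with radial basis functions.
surrogate = RBFInterpolator(points, values, smoothing=0.0)

# The surrogate can now be evaluated anywhere in the domain.
print(surrogate([[55.0, 10.0, 2.0]]))
```

Latin hypercube sampling addresses question 2 (it avoids wasting measurements on a regular grid), and radial basis functions are one standard choice for scattered n-dimensional interpolation, addressing question 1.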

Once I have an approximation for the objective function:

  1. Which MATLAB toolbox or function can I use to find the final, optimized set of parameters that maximizes the objective function (a Python sketch follows below)?
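For completeness, a minimal sketch of this step in Python rather than MATLAB, continuing from the `surrogate` and bounds built in the earlier sketch; `scipy.optimize.differential_evolution` is one global optimizer that respects bound constraints (the MATLAB Global Optimization Toolbox offers comparable functions):

```python
from scipy.optimize import differential_evolution

# SciPy optimizers minimize, so negate the surrogate to maximize it.
def negative_surrogate(x):
    return -surrogate([x])[0]

bounds = list(zip(bounds_lo, bounds_hi))  # per-variable (low, high) pairs
result = differential_evolution(negative_surrogate, bounds, seed=0)

print("best parameters:", result.x)
print("predicted maximum:", -result.fun)
```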

Sorry if the questions seem vague, but this is my first time tackling an optimization problem.
Thank you very much,
Riccardo

Best Answer

You are describing an "online black-box optimization" problem. I suspect that in your case online Bayesian optimization would be a good tool. Here is an algorithm/paper that you may find useful:

McIntire, M., Ratner, D., & Ermon, S. (2016). Sparse Gaussian Processes for Bayesian Optimization. In Proceedings of UAI 2016. https://www.auai.org/uai2016/proceedings/papers/269.pdf

And here's their GitHub repo:

https://github.com/ermongroup/bayes-opt

In case the above is not useful to you, I'll throw out some other ideas and keywords that may help with your search.

There are many ways to tackle this type of problem, but the right tool depends on a few factors. If you are dealing with a nonconvex function that has unsatisfactory local minima, your problem becomes a global optimization problem, which I assume is your case. If your function is high-dimensional (think hundreds or thousands of variables), some methods may require an excessive number of function evaluations to find a good solution.

Given that you are manually tuning an experimental procedure, I'll assume the number of parameters is around 10-ish, and that it takes significant time to evaluate $f(x)$ for a given $x$. For these low-dimensional but expensive continuous problems, Bayesian optimization is often a good solution; a minimal sketch of the online loop follows below. If you have a mix of discrete and continuous variables, you could write a function that maps a continuous domain to your discrete variables. If your domain is purely discrete, you may need to investigate combinatorial optimizers.
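As a concrete illustration of the online loop, here is a minimal sketch using the scikit-optimize package (not the paper's code from the repo above); `run_experiment` is a hypothetical stand-in for one physical measurement, and the bounds are made up:

```python
from skopt import Optimizer

# Hypothetical stand-in for one physical measurement of the objective.
def run_experiment(x):
    speed, power, distance = x
    return power / speed - 0.01 * distance**2

# One (low, high) bound pair per parameter; values are made up.
dimensions = [(10.0, 100.0), (1.0, 20.0), (0.5, 5.0)]

# Gaussian-process surrogate with the expected-improvement acquisition.
opt = Optimizer(dimensions, base_estimator="GP", acq_func="EI", random_state=0)

for _ in range(25):
    x = opt.ask()              # the optimizer proposes the next setting
    y = run_experiment(x)      # run the experiment at that setting
    opt.tell(x, -y)            # negate: skopt minimizes, we maximize

result = opt.get_result()
print("best parameters:", result.x)
print("best measured value:", -result.fun)
```

The ask/tell interface fits the experimental setting well: the optimizer proposes one setting at a time, you run the physical experiment, and you feed the measured value back in before the next proposal.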
