
What's the best method to maximize a slow multivariate function? Here is what I know about the function:

  1. The number of parameters is ~10.
  2. Computing the function's value for a single parameter set takes a considerable amount of time, say 30 minutes.
  3. The function appears to be concave in each individual parameter.
  4. I can run computations in parallel.

All my code is in Java, and I'm interested in coding a crude version of the optimization algorithm myself rather than using an industry solver.

  • Here's a related question I posted a while back, but most of the answers were about existing packages: http://scicomp.stackexchange.com/questions/10068/global-maximization-of-expensive-objective-function – AJK Aug 09 '15 at 03:41
  • If you can do computations in parallel then scan parameters and generate a look-up table of values; then use the table and interpolation to solve the optimization problem. [first sketch below] – Maxim Umansky Aug 09 '15 at 05:18
  • What does "slow multivariate function" mean? – nicoguaro Aug 09 '15 at 06:35
  • 1
    slow multivariate function - f(x1, x2, x3, ...) that takes a long time to compute, e.g. 30 minutes. To contrast here is fast multivariate function: f(x, y, z) = x + y * z^2 – ak. Aug 09 '15 at 07:15
  • Any clue why the function is this slow? I.e., are you doing shape optimization in a large-scale 3D PDE problem or is the code just lousy? – Bill Barth Aug 09 '15 at 13:03
  • You say that the function you're maximizing appears to be convex, which is the hard case. Did you actually mean that the function is concave? – Brian Borchers Aug 09 '15 at 15:41
  • @BrianBorchers: you are right, I meant concave. – ak. Aug 09 '15 at 18:23
  • @BillBarth: The function is the result of a backtest-like simulation over temporal data. Think stock prices. – ak. Aug 09 '15 at 18:24
  • Are your objective function values computed precisely, or is there some randomness inherent in the evaluation of the function by simulation? – Brian Borchers Aug 09 '15 at 20:06
  • @BrianBorchers: The objective function is precise, i.e. there is no error term present. – ak. Aug 09 '15 at 20:51
  • Do you know whether your objective function is smooth (differentiable), whether or not you have any way to compute the derivatives? – Brian Borchers Aug 09 '15 at 21:02
  • If you look at it really closely, it will probably look like a step function. But the steps are quite narrow, and within the parameter ranges of interest it looks quite smooth; there is no way to calculate derivatives, though. – ak. Aug 10 '15 at 02:12
  • 2
    If the underlying function is reasonably smooth, then finite difference derivative approximations can be effective. If the underlying function is inherently non-smooth, then you don't want to use any optimization method that assumes smoothness. It sounds as though you could get away with finite difference approximations to the gradient (with 10 parameters, you can get a finite difference gradient with 11 function evaluations in paralllel.) – Brian Borchers Aug 11 '15 at 03:51
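
Following Maxim Umansky's comment, here is a minimal Java sketch of a parallel parameter scan: a batch of candidate parameter vectors is evaluated concurrently on a thread pool and the best one is kept. The objective below is a toy concave stand-in for the real 30-minute simulation, and the [0, 1] parameter range, the grid levels, and the candidate count are all assumptions; the interpolation step is left out. Note that a full grid over 10 parameters is infeasible at 30 minutes per evaluation (even 3 levels per parameter gives 3^10 = 59049 points), so this only scans a random sample of grid points.

    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.Random;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelScan {
        // Toy concave stand-in; replace with the real 30-minute simulation.
        static double objective(double[] x) {
            double s = 0;
            for (double v : x) s -= (v - 0.5) * (v - 0.5);
            return s;
        }

        public static void main(String[] args) throws Exception {
            int dim = 10;
            double[] levels = {0.0, 0.25, 0.5, 0.75, 1.0}; // assumed range [0, 1]
            int nCandidates = 20;                          // assumed evaluation budget

            // Draw a random sample of grid points (a full grid is infeasible).
            Random rng = new Random(42);
            List<double[]> candidates = new ArrayList<>();
            for (int k = 0; k < nCandidates; k++) {
                double[] x = new double[dim];
                for (int i = 0; i < dim; i++) x[i] = levels[rng.nextInt(levels.length)];
                candidates.add(x);
            }

            // Evaluate all candidates in parallel.
            ExecutorService pool = Executors.newFixedThreadPool(8);
            List<Future<Double>> futures = new ArrayList<>();
            for (double[] x : candidates) futures.add(pool.submit(() -> objective(x)));

            // Keep the best value seen; the table of results could also feed interpolation.
            double best = Double.NEGATIVE_INFINITY;
            double[] bestX = null;
            for (int k = 0; k < candidates.size(); k++) {
                double v = futures.get(k).get();
                if (v > best) { best = v; bestX = candidates.get(k); }
            }
            pool.shutdown();
            System.out.println("best " + best + " at " + Arrays.toString(bestX));
        }
    }

The best point found this way can seed a local refinement such as the finite-difference ascent sketched next.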
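And here is a sketch of the finite-difference approach from Brian Borchers's last comment: each iteration submits f(x) and the ten forward-difference points f(x + h*e_i) to a thread pool (11 evaluations in parallel), forms an approximate gradient, and takes a fixed ascent step. The step sizes h and alpha, the iteration budget, and the starting point are placeholder assumptions to tune against the real problem; in particular, h should be wider than the narrow "steps" the asker describes.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class FdGradientAscent {
        // Toy concave stand-in; replace with the real 30-minute simulation.
        static double objective(double[] x) {
            double s = 0;
            for (double v : x) s -= (v - 0.3) * (v - 0.3);
            return s;
        }

        public static void main(String[] args) throws Exception {
            int dim = 10;
            double h = 1e-2;      // FD step (assumed; keep wider than the function's "steps")
            double alpha = 0.5;   // ascent step size (assumed; tune for the real problem)
            double[] x = new double[dim];  // starting point (assumed)

            ExecutorService pool = Executors.newFixedThreadPool(dim + 1);
            for (int iter = 0; iter < 20; iter++) {
                // Submit f(x) and f(x + h*e_i), i = 1..10: 11 evaluations in parallel.
                final double[] base = x.clone();
                List<Future<Double>> futures = new ArrayList<>();
                futures.add(pool.submit(() -> objective(base)));
                for (int i = 0; i < dim; i++) {
                    final int j = i;
                    futures.add(pool.submit(() -> {
                        double[] xp = base.clone();
                        xp[j] += h;
                        return objective(xp);
                    }));
                }

                // Forward-difference gradient from the 11 results.
                double f0 = futures.get(0).get();
                double[] grad = new double[dim];
                double norm2 = 0;
                for (int i = 0; i < dim; i++) {
                    grad[i] = (futures.get(i + 1).get() - f0) / h;
                    norm2 += grad[i] * grad[i];
                }
                if (Math.sqrt(norm2) < 1e-6) break;  // approximately stationary

                for (int i = 0; i < dim; i++) x[i] += alpha * grad[i];  // ascend
                System.out.printf("iter %d: f = %.6f%n", iter, f0);
            }
            pool.shutdown();
        }
    }

With the 11 evaluations running truly in parallel, each iteration costs about one 30-minute evaluation of wall-clock time, so the fixed step alpha matters; a crude backtracking line search (also parallelizable) would make the ascent more robust.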

0 Answers