[Numpy-discussion] hairy optimization problem

Sebastian Walter sebastian.walter at gmail.com
Thu May 7 04:06:41 EDT 2009


Hi Mathew,

1) What does it mean if a value is None? I.e., which is larger: None or 3?
The first thing I would do is convert each None to a number.
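
A minimal sketch of that conversion, assuming (this is my assumption, not stated in the thread) that None marks an invalid evaluation in a maximization problem, so mapping it to -inf means it can never win a comparison:

```python
import numpy as np

def to_floats(values):
    """Replace None with -inf so invalid entries never win a max."""
    return np.array([-np.inf if v is None else float(v) for v in values])

vals = to_floats([1.0, None, 3.0])
best = int(np.argmax(vals))  # index of the largest valid value
```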

2) Are your arrays integer arrays or double arrays?
It's much easier if they are doubles, because then you can use
standard methods for NLP (nonlinear programming) problems, for example
Newton's method as David suggests below.
But by the sound of it you might be able to enumerate all feasible
solutions. The problem might also be formulated as a mixed integer
linear program. That is a hard problem class in general, but such
programs can usually be solved quite fast nowadays.
In the worst case, you might have to fall back on genetic algorithms
or a stochastic search such as simulated annealing, which often do
not give good results.
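
For reference, current SciPy releases ship a simulated-annealing-style global optimizer. A toy sketch (the objective and bounds here are made up for illustration); since the solvers minimize, we negate f to maximize it:

```python
from scipy.optimize import dual_annealing

# Toy objective: f(x, y) = -(x - 1)^2 - (y + 2)^2 has its maximum at (1, -2).
# dual_annealing minimizes, so we hand it -f.
def neg_f(p):
    x, y = p
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

result = dual_annealing(neg_f, bounds=[(-5, 5), (-5, 5)])
# result.x should land near (1, -2)
```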

3)
It is not clear to me what exactly you are trying to maximize. As far
as I understand, you actually have N optimization problems. This is
very unusual! Typically the problem at hand can be formulated as *one*
optimization problem.
Could you tell us what exactly your problem is and why you want to
solve it? I am pretty sure that there is a much better approach than
solving N optimization problems.
It is good practice to first identify the category of the optimization
problem. There are quite a lot of them: linear programs, nonlinear
programs, mixed integer linear programs, and so on.
They can be further distinguished by the number of constraints, the
type of constraints, whether the objective function is convex, etc.
Once you have identified all that for your given problem, you can
start looking for a standard solver that can handle it.
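
Once the category is known, calling a standard solver is usually only a few lines. A sketch for the smooth unconstrained case (the quadratic objective here is made up for illustration):

```python
from scipy.optimize import minimize

# Smooth unconstrained NLP: a quasi-Newton method like BFGS applies directly.
def objective(p):
    x, y = p
    return (x - 2.0) ** 2 + (y - 3.0) ** 2

res = minimize(objective, x0=[0.0, 0.0], method="BFGS")
# res.x converges to the minimizer (2, 3)
```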


On Thu, May 7, 2009 at 5:59 AM, David Huard <david.huard at gmail.com> wrote:
> Hi Mathew,
>
> You could use Newton's method to optimize for each vi sequentially. If you
> have an expression for the jacobian, it's even better.
>
> What I'd do is write a class with a method f(self, x, y) that records the
> result of f(x, y) each time it is called. I would then sample the x,y space
> very coarsely where I guess my solutions are. You can then select the x,y
> where v1 is maximum as your initial point for Newton's method and iterate
> until you converge to the solution for v1. Since your class stores the
> computed points during the search, your initial guess for v2 should be a
> bit better than it was for v1, which should speed up convergence to the
> solution for v2, etc.
>
> If you have multiple processors available, you can scatter function
> evaluation among them using ipython. It's easier than it looks.
>
> Hope someone comes up with a nicer solution,
>
> David
>
> On Wed, May 6, 2009 at 3:16 PM, Mathew Yeates <myeates at jpl.nasa.gov> wrote:
>>
>> I have a function f(x,y) which produces N values [v1,v2,v3 .... vN],
>> where some of the values are None (this is only found after evaluation).
>>
>> Each evaluation of "f" is expensive and N is large.
>> I want the N x,y pairs which produce the optimal value in each column.
>>
>> A brute force approach would be to generate
>> [v11,v12,v13,v14 ....]
>> [v21,v22,v23 .......    ]
>> etc.
>>
>> and then locate the maximum of each column.
>> This is far too slow ... any other ideas?
>>
>>
>>
>> _______________________________________________
>> Numpy-discussion mailing list
>> Numpy-discussion at scipy.org
>> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
>
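
A minimal sketch of the caching-plus-refinement approach David describes above. The class name, the toy stand-in for the expensive f, and the grid bounds are my own assumptions; BFGS stands in for the Newton step (it needs no explicit Jacobian):

```python
import numpy as np
from scipy.optimize import minimize

class CachingObjective:
    """Record every (x, y) -> f(x, y) evaluation, as David suggests."""
    def __init__(self, f):
        self.f = f
        self.cache = {}

    def __call__(self, p):
        key = (round(float(p[0]), 12), round(float(p[1]), 12))
        if key not in self.cache:
            self.cache[key] = self.f(key[0], key[1])
        return self.cache[key]

def f(x, y):
    # Toy stand-in for the expensive f; maximum at (1, -0.5).
    return -((x - 1.0) ** 2 + (y + 0.5) ** 2)

obj = CachingObjective(f)

# 1) Sample a coarse grid and pick the best point as the starting guess.
grid = [(x, y) for x in np.linspace(-3, 3, 7) for y in np.linspace(-3, 3, 7)]
start = max(grid, key=obj)

# 2) Refine with a local quasi-Newton method; minimize -f to maximize f.
res = minimize(lambda p: -obj(p), x0=np.array(start), method="BFGS")
```

For the next column (v2, v3, ...), `obj.cache` already holds every point evaluated so far, so the coarse-grid step can reuse those values instead of re-evaluating f.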



More information about the NumPy-Discussion mailing list