[SciPy-User] [ANN] Guaranteed solution of nonlinear equation(s)

Yury V. Zaytsev yury at shurup.com
Wed May 25 08:55:41 EDT 2011


Hi Dmitrey,

On Wed, 2011-05-25 at 10:36 +0300, Dmitrey wrote:

>         1) Are there any scientific publications detailing the algorithm that
>         can be referenced from a paper?
>         
> not yet.

Please keep us posted when it happens!

>         2) Sorry, I am not sure if I've got it right from the examples. 
>         
>         Is it true that your implementation of interalg is impossible to use
>         outside of OpenOpt the way one would, for example, use fmin from SciPy?
> 
> From all your questions, it seems you haven't read interalg webpage.

Yes, sorry, it's a shame, I was distracted by the benchmarks page :-/

It would also be helpful if the page mentioned the size of the problems
that interalg is suitable for. I have more than 1000 variables and a
scalar non-linear objective function (which is proven to be concave,
though).

Also, are the box-bound constraints necessary for the algorithm to work,
or could support for infinite bounds be added later?

> As it is mentioned in http://openopt.org/interalg , only FuncDesigner
> models can be handled (because interval analysis is required).

Ok, now it's clear. I have read the FuncDesigner documentation page
again, and I think I now understand it a little better.

Would it be correct to summarize that, in order to use interalg:

the objective function has to be constructed with FuncDesigner as an
oofun, where every intermediate quantity that depends on the
optimization variables (oovars) has to itself be an oofun, built from
other oofuns or oovars using only numbers, the FuncDesigner mathematical
functions (which you name below) and Python for loops / ifThenElse
expressions

etc.?
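
If so, then just to make sure I have read the examples correctly, I
imagine a minimal interalg model would look roughly like this (the GLP
class, the box bounds written as constraints like a > -2, the solver
name 'interalg' and r.xf are purely my reading of the examples on
http://openopt.org/interalg; I have not actually run this):

from FuncDesigner import *
from openopt import GLP

a, b = oovars('a', 'b')

# objective built only from oovars, numbers and FuncDesigner functions
f = sin(a) + a * cos(b) + (a - 0.1)**2

startPoint = {a: 0.5, b: 0.5}

# finite box bounds, which the interval analysis seems to require
box = [a > -2, a < 2, b > -2, b < 2]

p = GLP(f, startPoint, constraints=box)
r = p.solve('interalg')
print(r.xf)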

Can you estimate the memory overhead of using an oovar / oofun as
opposed to a NumPy array? I.e., I think I would need something like
3 x 5 000 000 oovars in size, plus a few hundred individual oovars, for
my function.

When I use NumPy float arrays for storage and Cython to implement my
for loops in C, it only needs a few hundred megabytes of RAM and runs
quite fast (about 100 times faster than pure Python + vectorized NumPy
operations).

Therefore, I imagine that without a JIT this is hopeless? Does OpenOpt
work with PyPy yet?

> However, you could easily compare scipy optimize fmin and other
> openopt-connected solvers with interalg.

... which is, of course, very nice, except that for this you first have
to implement your model in FuncDesigner :-(
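
Just to check the workflow: once the model is in FuncDesigner, I suppose
the comparison boils down to calling solve() with different solver names
on the same objective, roughly as below (reusing f, startPoint and box
from my sketch above; 'scipy_fmin' being the OpenOpt name for
scipy.optimize.fmin, and NLP accepting the same FuncDesigner objective,
are both assumptions on my part):

from openopt import NLP, GLP

p_local = NLP(f, startPoint)                # unconstrained local search
r_local = p_local.solve('scipy_fmin')

p_global = GLP(f, startPoint, constraints=box)
r_global = p_global.solve('interalg')       # box-bounded global search

Is that right?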

> Currently only these funcs are supported:
> +, -, *, /, pow (**), sin, cos, arcsin, arccos, arctan, sinh, cosh,
> exp, sqrt, abs, log, log2, log10, floor, ceil
> Future plans: 1-D splines , min, max
> Also, any monotone func R->R (or with known "critical points", where
> order of monotonity changes) can be easily connected. 

Ok, I see; right now I need expi (as in SciPy) in addition to those. Is
there any documentation on how to connect my own functions to OpenOpt,
so that they work like the supplied ones with regard to the interval
analysis and automatic differentiation that interalg needs?
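
For the automatic differentiation part, I would naively expect something
along the lines of the sketch below, wrapping SciPy's expi as a custom
oofun with its analytic derivative d/dx Ei(x) = exp(x)/x. The oofun
constructor arguments (input=, d=) are only my guess from the
FuncDesigner docs, and how to also declare the monotonicity / critical
points that the interval analysis needs is exactly the part I cannot
find documented:

from numpy import exp
from scipy.special import expi
from FuncDesigner import oovar, oofun

x = oovar('x')

# hypothetical wrapper: SciPy's expi connected as a custom oofun with
# its analytic derivative supplied for automatic differentiation; the
# interval analysis hooks that interalg needs are still missing here
Ei = oofun(lambda z: expi(z), input=x, d=lambda z: exp(z) / z)

obj = Ei + x**2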

Thanks!

-- 
Sincerely yours,
Yury V. Zaytsev




