Hi all,
can anyone shed some light on the different behavior of
scipy.optimize and OpenOpt w.r.t. l_bfgs_b? Calling fmin_l_bfgs_b
directly converges, while the same problem run through OpenOpt's
scipy_lbfgsb solver fails with a ValueError (output below).
Nils
>>> scipy.__version__
'0.11.0.dev-491f9db'
>>> openopt.__version__
'0.37'
python -i test_lbfgsb.py
Optimal solution by L_BFGS_B
(array([ 2000.        ,  1100.        ,  1614.65585984,  1050.        ,
          900.        ,  1300.        ,  1100.        ,  1200.        ,
         1400.        ,  1000.        ,    28.        ,    28.        ,
           73.        ,    54.        ,    30.        ]),
 22.685511313514301,
 {'warnflag': 0,
  'task': 'CONVERGENCE: NORM OF PROJECTED GRADIENT <= PGTOL',
  'grad': array([  8.38440428e-04,   9.89075488e-04,   5.32907052e-06,
                   3.17612603e-04,   4.14956958e-04,   2.43041143e-03,
                   1.26121336e-03,   1.87938554e-04,   2.43041143e-03,
                   1.10297549e-02,  -2.24627428e-02,  -2.86387802e-02,
                  -5.51217738e-02,  -1.89885441e-02,  -5.40191181e-01]),
  'funcalls': 36})
------------------------- OpenOpt 0.37 -------------------------
solver: scipy_lbfgsb problem: unnamed type: NLP
goal: minimum
iter objFunVal
0 2.355e+01
RUNNING THE L-BFGS-B CODE
* * *
Machine precision = 2.220D-16
N = 15 M = 10
At X0 0 variables are exactly at the bounds
Traceback (most recent call last):
  File "test_lbfgsb.py", line 68, in <module>
    r = p.solve('scipy_lbfgsb')
  File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/kernel/baseProblem.py", line 235, in solve
    return runProbSolver(self, *args, **kwargs)
  File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/kernel/runProbSolver.py", line 246, in runProbSolver
    solver(p)
  File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/solvers/scipy_optim/scipy_lbfgsb_oo.py", line 36, in __solver__
    iprint=p.iprint, maxfun=p.maxFunEvals)
  File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 157, in fmin_l_bfgs_b
    options=opts, full_output=True)
  File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 270, in _minimize_lbfgsb
    isave, dsave)
ValueError: failed to initialize intent(inout) array -- input not fortran contiguous
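
For what it's worth, the ValueError comes from f2py's intent(inout) check, which suggests the x0 array that reaches the Fortran routine is not contiguous. A minimal sketch of the suspected failure mode and a workaround — the objective here is a stand-in I made up, not the one in test_lbfgsb.py:

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

# Hypothetical quadratic objective (NOT the one from test_lbfgsb.py).
def f(x):
    return np.sum((x - 1.0) ** 2)

def fprime(x):
    return 2.0 * (x - 1.0)

# A strided view, e.g. a column slice of a C-ordered 2-D array, is
# neither C- nor Fortran-contiguous; passing such an x0 down to the
# Fortran L-BFGS-B code can trigger the intent(inout) error.
x0_bad = np.zeros((15, 2))[:, 0]
print(x0_bad.flags['F_CONTIGUOUS'])   # False

# Copying into a fresh contiguous buffer (for 1-D arrays, C- and
# Fortran-contiguous coincide) gives the Fortran code what it expects.
x0 = np.ascontiguousarray(x0_bad, dtype=np.float64)
print(x0.flags['F_CONTIGUOUS'])       # True

x, fval, info = fmin_l_bfgs_b(f, x0, fprime=fprime)
print(info['warnflag'])               # 0 on convergence
```

If OpenOpt hands scipy a strided or otherwise non-contiguous x0 (while your direct fmin_l_bfgs_b call happens to pass a plain contiguous array), that would explain seeing the error only through the scipy_lbfgsb wrapper — but that is a guess without seeing what OpenOpt passes in.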