parameter bounds using leastsq
Is there any way to enforce upper and/or lower bounds on parameters (x0) optimized by leastsq? If not, can anyone tell me where I might look to hack this? Thanks, -mike
Just to remind people, there is a very full-featured non-linear least-squares fitter, written in pure Python (but based on MINPACK), available here: http://cars9.uchicago.edu/software/python/mpfit.html It allows for constraints on parameters and gives error estimates (computed from the covariance matrix) on the fitted parameters in a convenient format. Scott
_______________________________________________ SciPy-user mailing list SciPy-user@scipy.net http://www.scipy.net/mailman/listinfo/scipy-user
--
Scott M. Ransom
Address: NRAO, 520 Edgemont Rd., Charlottesville, VA 22903 USA
Phone: (434) 296-0320   Email: sransom@nrao.edu
GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989
Thanks Scott. How do you think it compares in terms of speed with the SciPy implementation that wraps C code? Speed is a big concern for me. -mike
Speed is its one downfall. It is definitely much slower than the SciPy leastsq (since it is written entirely in Python). The hit probably wouldn't be so large if you had a complex function to optimize that dominated the run time, but if that is not the case, figuring out a workaround for leastsq will probably be your best bet. Scott
On Sat, 5 Nov 2005, Scott Ransom apparently wrote:
Just to remind people, there is a very full featured non-linear least squares fitter that is written in pure python (but based on MINPACK) available here: http://cars9.uchicago.edu/software/python/mpfit.html
Here's the actual download link: http://cars.uchicago.edu/software/pub/python_epics.tar As far as I can tell, you have to download the whole collection together. Cheers, Alan Isaac
If you just need the optimal value and not some (dubious) estimate of the uncertainty, then you can use one of the constrained minimizers. You simply have to make an appropriate misfit function:

    def f(beta, x):
        # compute values y given parameters beta at points x

    def misfit(beta, x, y):
        diff = y - f(beta, x)
        return scipy.sum(diff*diff)

    beta_opt = scipy.optimize.fmin_cobyla(f, beta0, constraints, (x, y))

--
Robert Kern
rkern@ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
  -- Richard Harter
Robert, could you flesh out this answer some more? I've had to do this in the past and was sure I was missing something. In the simplest case I have a set of equations

    y_1 = b0 + b1*x1_1 + b2*x2_1
    ...
    y_n = b0 + b1*x1_n + b2*x2_n

In Fortran I would pass x as a 2-d matrix to the function misfit. What do you do in scipy if x is a vector at each point y? Eric
You would write your function f accordingly. Because scipy is vector-oriented, a simple

    y = b0 + b1*x[0] + b2*x[1]

would work, assuming x is a scipy/Numeric array of shape (2, n). By the way, I humbly suggest there was a typo in Robert's original post:

    beta_opt = scipy.optimize.fmin_cobyla(f, beta0, constraints, (x, y))

I think the "f" argument in the call to fmin_cobyla needs to be "misfit", or else you're not fitting to the data.
Steve Walton
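[Editor's sketch] Putting Steve's correction and Eric's two-regressor case together, here is a minimal, self-contained example of the fmin_cobyla approach. The model, data, and bounds are invented for illustration; note that misfit (not f) is passed as the objective, and that consargs=() stops fmin_cobyla from forwarding the data arguments to the constraint functions:

```python
import numpy as np
from scipy.optimize import fmin_cobyla

def f(beta, x):
    # x has shape (2, n): two regressors per observation
    return beta[0] + beta[1] * x[0] + beta[2] * x[1]

def misfit(beta, x, y):
    diff = y - f(beta, x)
    return np.sum(diff * diff)

# hypothetical data generated from b = (1.0, 2.0, -0.5)
rng = np.random.default_rng(42)
x = rng.uniform(0.0, 1.0, size=(2, 200))
y = 1.0 + 2.0 * x[0] - 0.5 * x[1]

# COBYLA constraints are functions that must be >= 0 at the solution;
# here we require b1 >= 0 and b2 >= -1 (both inactive at the optimum).
constraints = [lambda b: b[1], lambda b: b[2] + 1.0]

beta0 = np.zeros(3)
beta_opt = fmin_cobyla(misfit, beta0, constraints, args=(x, y),
                       consargs=(), rhoend=1e-10, maxfun=10000)
print(beta_opt)  # close to (1.0, 2.0, -0.5)
```

COBYLA only sees the scalar misfit value, so it never needs derivatives, but it converges more slowly than leastsq for the same reason.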
One way that seems to work quite well is to cheat the optimizer by externally mapping +-inf to the upper/lower bounds of the parameters. If a is the parameter with lower/upper bounds amin/amax, then newa is the value you should use to evaluate your function:

    newa = (amin + amax)/2.0 + (amax - amin)/2.0 * a/(1.0 + abs(a))

This way, when the optimizer pushes the parameter toward +-inf, newa in fact slowly approaches amax or amin. I'm not sure whether this works with any function, but so far I have not had any problems. By the way, I successfully use this method in a least-squares curve-fitting application called peak-o-mat (http://lorentz.sf.net). Regards, Christian
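[Editor's sketch] Christian's transform drops straight into a leastsq residual function, so you keep the fast MINPACK code path. A minimal example; the exponential model, data, and the (0, 5) bound on the amplitude are invented for illustration:

```python
import numpy as np
from scipy.optimize import leastsq

def bounded(a, amin, amax):
    # map the unconstrained internal parameter a onto the open interval
    # (amin, amax); a/(1 + |a|) maps the real line onto (-1, 1)
    return (amin + amax) / 2.0 + (amax - amin) / 2.0 * a / (1.0 + abs(a))

# hypothetical model: y = amp * exp(-x / tau), with amp constrained to (0, 5)
def residuals(p, x, y):
    amp = bounded(p[0], 0.0, 5.0)
    tau = p[1]
    return y - amp * np.exp(-x / tau)

x = np.linspace(0.0, 4.0, 100)
y = 3.0 * np.exp(-x / 1.5)  # noiseless data with amp=3.0, tau=1.5

p_opt, ier = leastsq(residuals, [0.1, 1.0], args=(x, y))
amp_opt = bounded(p_opt[0], 0.0, 5.0)
print(amp_opt)  # close to 3.0, and guaranteed to lie inside (0, 5)
```

The fitted internal parameter p_opt[0] is meaningless by itself; always map it through bounded() to recover the physical value.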
participants (7)
- Alan G Isaac
- Christian Kristukat
- Eric Zollars
- mike cantor
- Robert Kern
- Scott Ransom
- Stephen Walton