Bounded Linear Least-Squares
Hi all,

Does scipy have a function analogous to Matlab's lsqlin? I need to solve two problems of the form Ax = b: one subject to the constraint 0 <= x, and one subject to 0 <= x <= 1. The first case is handled by scipy.optimize.nnls, but it doesn't support the second. I know that scipy.optimize includes several constrained optimization routines, but AFAICT they're all aimed at minimizing arbitrary functions, and as such I'd expect them to be far slower than an actual linear solver. Is there such a constrained linear solver in scipy (or numpy, or scikits.*, etc.)?

Even better would be a constrained matrix factorization routine, i.e. one that solves AX = B for X, with A, X, and B all being matrices, subject to 0 <= X <= 1; but obviously you can construct the latter from the former, so the former would suffice.

Thanks,
Dan Lepage
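A minimal sketch of that column-by-column reduction, for reference; solve_bounded is a hypothetical stand-in for whatever bounded vector solver is actually available:

import numpy as np

def solve_columnwise(A, B, solve_bounded):
    """Solve AX = B in the least-squares sense, one column of B at a time.

    solve_bounded(A, b) is any routine that returns the solution of the
    bounded vector problem min ||Ax - b|| subject to the box constraints.
    """
    X = np.empty((A.shape[1], B.shape[1]))
    for j in range(B.shape[1]):
        X[:, j] = solve_bounded(A, B[:, j])
    return X

# For the nonnegative case, for example:
#   from scipy.optimize import nnls
#   X = solve_columnwise(A, B, lambda A, b: nnls(A, b)[0])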
On Wed, Apr 20, 2011 at 10:39 AM, Daniel Lepage <dplepage@gmail.com> wrote:
[...]
I don't know anything that would solve this directly, but I think that scipy.optimize.fmin_slsqp should work well in this case.

Josef
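For reference, a minimal sketch of that approach for the 0 <= x <= 1 case; A and b here are just random placeholder data:

import numpy as np
from scipy.optimize import fmin_slsqp

# Placeholder data standing in for the real problem.
rng = np.random.RandomState(0)
A = rng.rand(20, 5)
b = rng.rand(20)

def objective(x):
    # least-squares objective ||Ax - b||^2
    r = A.dot(x) - b
    return r.dot(r)

x0 = 0.5 * np.ones(A.shape[1])      # start in the middle of the box
bounds = [(0.0, 1.0)] * A.shape[1]  # 0 <= x_i <= 1 for every component
x = fmin_slsqp(objective, x0, bounds=bounds, iprint=0)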
On Wed, Apr 20, 2011 at 10:54 AM, <josef.pktd@gmail.com> wrote:
[...]
It will work, but more slowly than a linear solver, because it doesn't assume the function to be linear. To even hope to get comparable speeds, I'd need to enter the derivatives of my system by hand, which is possible but a pain, and even then I expect it will take longer.

Fortunately, I figured out how to restructure my particular problem so that I only need nonnegativity, so I'll just use nnls.

Thanks,
Dan Lepage
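For completeness, the nonnegative-only case is then essentially a one-liner (again with placeholder data):

import numpy as np
from scipy.optimize import nnls

rng = np.random.RandomState(1)
A = rng.rand(20, 5)
b = rng.rand(20)

# nnls solves min ||Ax - b||_2 subject to x >= 0
x, residual_norm = nnls(A, b)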
On Wed, Apr 20, 2011 at 2:22 PM, Daniel Lepage <dplepage@gmail.com> wrote:
[...]
Since it's just a linear error function (quadratic objective function), the derivatives should be easy. I was just hoping you'd figure out a recipe example for this, since I also have applications for this case and haven't looked at it yet. :)

Josef
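A sketch of such a recipe, with the gradient written out by hand (A and b again stand in for real data; the gradient of ||Ax - b||^2 is 2 A^T (Ax - b)):

import numpy as np
from scipy.optimize import fmin_slsqp

rng = np.random.RandomState(2)
A = rng.rand(30, 4)
b = rng.rand(30)

def objective(x):
    # least-squares objective ||Ax - b||^2
    r = A.dot(x) - b
    return r.dot(r)

def gradient(x):
    # analytic derivative of ||Ax - b||^2 with respect to x
    return 2.0 * A.T.dot(A.dot(x) - b)

x0 = 0.5 * np.ones(A.shape[1])
bounds = [(0.0, 1.0)] * A.shape[1]
x = fmin_slsqp(objective, x0, fprime=gradient, bounds=bounds, iprint=0)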
Daniel Lepage <dplepage@gmail.com> writes:
[...]
OpenOpt has a BVLS wrapper, which is a bounded-variable least-squares solver. You could also wrap bvls.f with fwrap. Is there any interest in including bvls in scipy.optimize? I remember seeing a constrained matrix factorization routine for Python, but I forget where :(.

Till Stensitzki
participants (4)
- Daniel Lepage
- josef.pktd@gmail.com
- Skipper Seabold
- Till Stensitzki