# Constrained least-squares fitting with Python

SciPy contains a good least-squares fitting routine, leastsq(), which implements a modified Levenberg-Marquardt algorithm.  I just learned that it also has a constrained least-squares routine called fmin_slsqp().   I am using simple upper and lower bound constraints, but it's also possible to specify more complex functional constraints.
What I did not realize at first is that fmin_slsqp requires a different type of objective function than leastsq.   leastsq requires a function that returns a vector of residuals, which it squares and sums automatically.  fmin_slsqp is actually more flexible, in that it accepts any objective function that returns a single scalar value.  To implement least-squares curve fitting with it, your objective function needs to compute the residual at each data point, square the values, and sum them up.  Hopefully this tip will save you some time.
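To make the difference concrete, here is a minimal sketch fitting a straight line both ways. The data, starting guesses, and bounds are made up for illustration; the point is only the shape of the two objective functions:

```python
import numpy as np
from scipy.optimize import leastsq, fmin_slsqp

# Synthetic, noiseless data for y = 2*x + 1 (illustrative values only)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0

def residuals(p):
    """Vector of residuals -- the form leastsq expects.
    leastsq squares and sums these internally."""
    m, c = p
    return y - (m * x + c)

p_lsq, _ = leastsq(residuals, [1.0, 0.0])

def sse(p):
    """Scalar sum of squared residuals -- the form fmin_slsqp expects.
    We must do the squaring and summing ourselves."""
    r = residuals(p)
    return np.sum(r ** 2)

# bounds is a list of (lower, upper) pairs, one per parameter
p_slsqp = fmin_slsqp(sse, [1.0, 0.0],
                     bounds=[(0.0, 5.0), (-10.0, 10.0)],
                     iprint=0)
```

Both calls should recover roughly the same slope and intercept here; the only real change is whether the routine or your own code performs the square-and-sum step.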
Check out the scipy optimization tutorial for more examples.  Here is the original paper by Dieter Kraft which introduces the algorithm used by fmin_slsqp.

### 7 thoughts on “Constrained least-squares fitting with Python”

1. I am also facing a similar problem. In my case the matrix equation looks like [Y]_{n×1} = [X]_{n×m} [P]_{m×1}, where Y and P are vectors and X is a matrix. Further, there is an equality constraint on P: Sum(P(i)) = 0.0. How do I proceed to solve this? Which Python function is suitable? I saw some discussion of the scipy.optimize.fmin_slsqp() function, but I am new to SciPy. Please help me out in this regard.
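One way the setup described in this comment could be handled with fmin_slsqp is via its f_eqcons argument, which takes a function whose outputs are driven to zero. The data below (dimensions, values, random seed) is entirely hypothetical, just to make the sketch runnable:

```python
import numpy as np
from scipy.optimize import fmin_slsqp

# Hypothetical example: Y (n x 1) = X (n x m) @ P (m x 1),
# subject to the equality constraint sum(P) = 0.
rng = np.random.default_rng(0)
n, m = 20, 4
X = rng.normal(size=(n, m))
P_true = np.array([1.0, -2.0, 0.5, 0.5])  # chosen so it sums to zero
Y = X @ P_true

def sse(P):
    # Scalar objective: sum of squared residuals of Y - X @ P
    r = Y - X @ P
    return np.sum(r ** 2)

def eq_constraint(P):
    # fmin_slsqp forces each entry of this array to zero
    return np.array([np.sum(P)])

P_fit = fmin_slsqp(sse, np.zeros(m), f_eqcons=eq_constraint, iprint=0)
```

Because the objective is a convex quadratic and the constraint is linear, the solver should land on the constrained least-squares solution.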
