SciPy contains a good least-squares fitting routine, leastsq(), which implements a modified Levenberg-Marquardt algorithm. I just learned that it also has a constrained minimization routine, fmin_slsqp(), which implements sequential least squares programming (SLSQP). I am using simple upper and lower bound constraints, but it's also possible to specify more complex functional constraints.
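Here's a minimal sketch of the unconstrained case with leastsq, fitting a line to some made-up data points (the data and parameter names are just for illustration):

```python
# Sketch: fitting a line y = a*x + b with leastsq. The residuals
# function returns a vector; leastsq squares and sums it internally.
import numpy as np
from scipy.optimize import leastsq

# Made-up data, roughly on the line y = x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.2, 2.8, 4.1])

def residuals(params, x, y):
    a, b = params
    return y - (a * x + b)  # vector of residuals, one per data point

params, ier = leastsq(residuals, x0=[1.0, 0.0], args=(x, y))
print(params)  # approximately [0.99, 0.04] for this data
```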
What I did not realize at first is that fmin_slsqp requires a different type of objective function than leastsq. leastsq requires you to write a function that returns a vector of residuals, and it automatically squares and sums them. fmin_slsqp is actually more flexible: it minimizes any objective function that returns a single scalar value. To do least-squares curve fitting with it, your objective function needs to compute the residual at each data point, square the values, and sum them up. Hopefully this tip will save you some time.
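To make the difference concrete, here's the same line fit sketched with fmin_slsqp: the objective now squares and sums the residuals itself, and the bounds shown (slope in [0, 2], intercept in [-1, 1]) are just illustrative:

```python
# Sketch: constrained least-squares with fmin_slsqp. The objective
# must return a single scalar -- the sum of squared residuals.
import numpy as np
from scipy.optimize import fmin_slsqp

# Same made-up data as before
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.2, 2.8, 4.1])

def sum_sq(params, x, y):
    a, b = params
    r = y - (a * x + b)
    return np.sum(r ** 2)  # square and sum the residuals yourself

# Simple (lower, upper) bounds on each parameter
fit = fmin_slsqp(sum_sq, x0=[1.0, 0.0],
                 bounds=[(0.0, 2.0), (-1.0, 1.0)],
                 args=(x, y), iprint=0)
print(fit)  # approximately [0.99, 0.04], same as the unconstrained fit
```

Since the unconstrained minimum lies inside the bounds here, both routines agree; tighten the bounds and fmin_slsqp will pin the parameters to the feasible region instead.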