Ceres docs advocate using algorithmic differentiation instead of finite differences. Let's make one more IPython notebook example with the Jacobian calculated using algopy: https://pythonhosted.org/algopy/
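Roughly what I have in mind (a minimal sketch, not anything from scipy or the existing examples: the exponential-decay model, the synthetic data, and the `residuals`/`jac` names are made up for illustration; it relies on algopy's documented `UTPM.init_jacobian` / `UTPM.extract_jacobian` forward-mode API, plus the `algopy.zeros(..., dtype=x)` pattern so the same residual function also works on plain ndarrays):

```python
import numpy as np
import algopy
from scipy.optimize import least_squares

def residuals(params, t, y):
    # Hypothetical exponential-decay model: y ~ a * exp(-b * t) + c.
    # algopy.zeros(..., dtype=params) returns a plain ndarray when params
    # is an ndarray, and a UTPM array when params is a UTPM object, so
    # this function works both for evaluation and for differentiation.
    a, b, c = params[0], params[1], params[2]
    out = algopy.zeros(t.shape[0], dtype=params)
    for i in range(t.shape[0]):
        out[i] = a * algopy.exp(-b * t[i]) + c - y[i]
    return out

def jac(params, t, y):
    # Exact dense Jacobian of the residual vector via forward-mode AD.
    p = algopy.UTPM.init_jacobian(params)
    return algopy.UTPM.extract_jacobian(residuals(p, t, y))

# Synthetic data for the made-up model above.
rng = np.random.RandomState(0)
t = np.linspace(0, 4, 40)
y = 2.5 * np.exp(-1.3 * t) + 0.5 + 0.02 * rng.randn(t.size)

res = least_squares(residuals, x0=np.ones(3), jac=jac, args=(t, y),
                    bounds=(0.0, np.inf))  # bounds work as usual
print(res.x)  # should recover roughly [2.5, 1.3, 0.5]
```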
It seems more like an example for algopy than for scipy.optimize; I mean, what would it explain about the usage of least_squares? But if you think it's interesting and important, the simplest idea might be to extend the bundle adjustment example and compare the results.
Unfortunately algopy doesn't support sparse Jacobians. Its list of potential improvements says: "support for sparse Jacobian and sparse Hessian computations using graph coloring as explained in http://portal.acm.org/citation.cfm?id=1096222"
So please suggest some problem where it would make sense to use automatic differentiation (preferably with bounds).
Hmmm, yeah, that's too bad. It'd be a good enhancement there. An ideal example would be one where the errors from finite differences actually play a role. I'll try to think of something.
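One cheap way to gauge those errors would be to compare a naive finite-difference Jacobian against the AD one from the sketch above (again purely illustrative; `fd_jacobian` is a made-up helper, and it reuses `residuals`, `jac`, `t`, `y` from that sketch):

```python
import numpy as np

def fd_jacobian(params, t, y, eps=1e-8):
    # Naive forward-difference Jacobian, one column per parameter.
    f0 = np.asarray(residuals(params, t, y))
    J = np.empty((f0.size, params.size))
    for j in range(params.size):
        step = np.zeros_like(params)
        step[j] = eps
        J[:, j] = (np.asarray(residuals(params + step, t, y)) - f0) / eps
    return J

p0 = np.array([2.5, 1.3, 0.5])
# AD is exact up to roundoff, so the max entrywise gap is essentially
# the truncation/roundoff error of the finite-difference scheme.
print(np.abs(jac(p0, t, y) - fd_jacobian(p0, t, y)).max())
```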