
an example with algorithmic differentiation #7

Open
ev-br opened this issue Aug 6, 2015 · 2 comments

@ev-br
Collaborator

ev-br commented Aug 6, 2015

The Ceres docs advocate using algorithmic differentiation instead of finite differences. Let's make one more IPython notebook example with the Jacobian computed using algopy, https://pythonhosted.org/algopy/
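For reference, a minimal sketch of what such a notebook cell could look like, assuming algopy's UTPM forward-mode driver (`UTPM.init_jacobian` / `UTPM.extract_jacobian`). The exponential-decay model, the data, and the names `residuals` and `jac` are made up purely for illustration and are not part of any existing example:

```python
import numpy as np
import algopy
from scipy.optimize import least_squares


def residuals(params, t, y):
    # Hypothetical toy model: y = a * exp(-b * t).
    # Written with algopy.zeros / algopy.exp so the same code accepts both
    # plain numpy arrays and algopy.UTPM instances.
    a, b = params[0], params[1]
    r = algopy.zeros(t.shape[0], dtype=params)
    for i in range(t.shape[0]):
        r[i] = a * algopy.exp(-b * t[i]) - y[i]
    return r


def jac(params, t, y):
    # Forward-mode algorithmic differentiation via algopy's UTPM arithmetic.
    p = algopy.UTPM.init_jacobian(params)
    return algopy.UTPM.extract_jacobian(residuals(p, t, y))


t = np.linspace(0.0, 5.0, 50)
y_obs = 2.0 * np.exp(-1.3 * t)  # synthetic data for the toy model
res = least_squares(residuals, x0=[1.0, 1.0], jac=jac, args=(t, y_obs))
print(res.x)
```

The point of writing `residuals` generically (via `algopy.zeros(..., dtype=params)`) is that the same function is reused both for the residual evaluation and inside the algopy-based `jac` passed to `least_squares`.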

@nmayorov
Owner

It seems more like an example for algopy than for scipy.optimize. I mean, what would it explain about the usage of least_squares? But if you think it's interesting and important, the simplest idea might be to extend the bundle adjustment example and compare the results.

Unfortunately, it doesn't support sparse Jacobians, according to its list of potential improvements: "support for sparse Jacobian and sparse Hessian computations using graph coloring as explained in http://portal.acm.org/citation.cfm?id=1096222".

So please suggest a problem where it would make sense to use automatic differentiation (preferably with bounds).

@ev-br
Collaborator Author

ev-br commented Aug 11, 2015

Hmmm, yeah, that's too bad. It would be a good enhancement there. An ideal example would be one where the errors from finite differences actually play a role. I'll try to think of one.
