Initial implicit/truncated backward modes #29

Merged
Merged 14 commits into main on Jan 19, 2022

Conversation


@bamos bamos commented Dec 12, 2021

Migrating this from the private repo (where we have some more context). I've updated the example/test to use the quadratic cost fitting example, as I prefer its simplicity to the older example we were using with the cost weights. Here's what the current backward pass mode example output looks like:

--- backward_mode=FULL
[ 0.09670612 -0.7900081   0.08778726  0.13497931  0.21781093 -0.20041543
  0.11018689 -1.5431342   0.13462512 -0.23546126]

--- backward_mode=IMPLICIT
[ 0.04835452 -0.3950162   0.04389496  0.0674917   0.10890876 -0.10021076
  0.05509511 -0.77159065  0.0673146  -0.11773425]

--- backward_mode=TRUNCATED, backward_num_iterations=5
[ 0.09368691 -0.7653437   0.08504649  0.1307652   0.21101077 -0.19415839
  0.10674679 -1.4949567   0.13042209 -0.22811009]

--- Numeric derivative
[ 0.09670693 -0.7900125   0.08778608  0.13497717  0.21781142 -0.20041636
  0.11019139 -1.54313637  0.13462317 -0.2354647 ]

=== Runtimes
Forward: 1.60e-02 s +/- 1.57e-03 s
Backward (FULL): 4.61e-03 s +/- 3.73e-04 s
Backward (IMPLICIT): 5.18e-04 s +/- 7.88e-05 s
Backward (TRUNCATED, 5 steps): 1.59e-03 s +/- 8.15e-05 s

This is almost ready to merge but still WIP as I'm still debugging some scaling issues in the implicit mode, which should be easier to debug with this simpler example.
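To make the three modes concrete, here is a minimal hand-rolled sketch on a 1-D quadratic cost, not the Theseus implementation; all function names below are hypothetical, and the solver is plain gradient descent rather than the nonlinear optimizers used in the PR.

```python
# Toy comparison of FULL, TRUNCATED, and IMPLICIT backward modes on the
# cost f(x; theta) = 0.5 * a * (x - theta)**2, minimized by gradient descent.
# The true sensitivity of the solution is dx*/dtheta = 1.

def solve(theta, a=2.0, lr=0.4, iters=50):
    """Gradient descent on f; converges to the minimizer x* = theta."""
    x = 0.0
    for _ in range(iters):
        x = x - lr * a * (x - theta)  # grad_x f = a * (x - theta)
    return x

def grad_full(a=2.0, lr=0.4, iters=50):
    """FULL mode: differentiate through every solver iteration.
    Each update is x_{t+1} = (1 - lr*a) * x_t + lr*a * theta, so
    dx_{t+1}/dtheta = (1 - lr*a) * dx_t/dtheta + lr*a."""
    g = 0.0
    for _ in range(iters):
        g = (1 - lr * a) * g + lr * a
    return g

def grad_truncated(k, a=2.0, lr=0.4):
    """TRUNCATED mode: backpropagate through only the last k iterations,
    treating the earlier iterate as a constant."""
    g = 0.0
    for _ in range(k):
        g = (1 - lr * a) * g + lr * a
    return g

def grad_implicit(a=2.0):
    """IMPLICIT mode: apply the implicit function theorem at the fixed
    point, where grad_x f(x*; theta) = 0:
    dx*/dtheta = -(d2f/dx dtheta) / (d2f/dx2) = a / a = 1."""
    return 1.0
```

In this toy setting, FULL recovers the exact derivative after enough iterations, TRUNCATED approaches it as k grows (mirroring how the 5-step truncated numbers above nearly match FULL), and IMPLICIT gets it exactly without unrolling the solver at all, which matches the runtime ordering reported above.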

@bamos bamos requested review from luisenp and mhmukadam December 12, 2021 04:18
theseus/core/cost_function.py (review thread, outdated, resolved)
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Dec 12, 2021
@mhmukadam mhmukadam added this to the 0.1.0-b.2 milestone Dec 17, 2021

@mhmukadam mhmukadam left a comment


Looks great! Left some minor comments, after which we should merge this and try to debug the implicit grad offset issue in a separate PR (unless you already fixed it).

Something failed in the GPU test. It might be a CI issue; I don't think it was related to any commits here.

@mhmukadam mhmukadam marked this pull request as ready for review December 17, 2021 23:37
@mhmukadam mhmukadam linked an issue Dec 20, 2021 that may be closed by this pull request
@bamos bamos changed the title from "Initial WIP commit of implicit/truncated backward modes" to "Initial implicit/truncated backward modes" Jan 11, 2022

bamos commented Jan 11, 2022

Hi @mhmukadam and @luisenp, I just went through and addressed all the inline comments. Let me know if there's anything else! Otherwise this is ready to merge from my perspective, and I'll keep debugging some of the gradient scaling issues separately for #39.


@luisenp luisenp left a comment


LGTM, some minor comments and a question!

theseus/optimizer/nonlinear/nonlinear_optimizer.py (review thread, outdated, resolved)
theseus/optimizer/nonlinear/tests/test_backwards.py (review thread, outdated, resolved)

@mhmukadam mhmukadam left a comment


Looks good!


bamos commented Jan 19, 2022

Thanks for the pass @luisenp! I just addressed your comments and pushed the latest changes, which fix the scaling issues and pass the backward tests with atol=1e-4. Let me know if there's anything else!
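For readers unfamiliar with this kind of check, here is a sketch of what a numeric-derivative comparison at atol=1e-4 typically looks like; these are hypothetical helpers, not the actual code in test_backwards.py.

```python
# Compare an analytic gradient against a central finite difference,
# accepting it if the absolute error is within atol.

def numeric_grad(f, x, eps=1e-6):
    """Central-difference approximation of df/dx at x."""
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

def check_grad(f, analytic_grad, x, atol=1e-4):
    """True if the analytic gradient matches the numeric one within atol."""
    return abs(analytic_grad(x) - numeric_grad(f, x)) <= atol
```

A correct gradient passes at this tolerance while a wrong one fails, which is the property the backward tests mentioned above rely on.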

@bamos bamos merged commit 6d89db7 into main Jan 19, 2022
@bamos bamos deleted the diff branch January 19, 2022 15:52
suddhu pushed a commit to suddhu/theseus that referenced this pull request Jan 21, 2023
* Initial WIP commit of implicit/truncated backward modes

* spacing

* add numdifftools requirement

* fix mypy and GPU issues

* import BackwardMode as part of the main theseus module

* add ValueError messages

* add comments to backward_modes and add it to examples/README

* Remove error_increase_induces

* move converged_indices from the info back into the optimization loop

* fix gradient scaling for facebookresearch#39

* update backward tests

* add type hints/remove unused track_best_solution

* remove erroneous update
Labels: CLA Signed
Successfully merging this pull request may close these issues: Implicit differentiation support
4 participants