
DLM gradients #161

Merged
merged 23 commits into main from rtqichen.dlm_gradients
Jun 7, 2022

Conversation

@rtqichen (Contributor) commented Apr 13, 2022

Motivation and Context

Implements support for "direct loss minimization" (DLM) gradient computation in theseus.
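The DLM estimate can be pictured on a toy problem. Below is a minimal sketch, assuming the standard finite-difference form of the DLM gradient (re-solve the inner problem with a small perturbation aligned with the outer-loss gradient, then difference the theta-gradients of the inner objective). The function names, the quadratic inner problem, and the closed-form solves are all illustrative, not the theseus implementation.

```python
# Toy DLM gradient sketch (illustrative names, not theseus code).
# Inner problem: x*(theta) = argmin_x 0.5 * (x - theta)**2  ->  x* = theta
# Outer loss:    L(x) = 0.5 * x**2, so dL/dtheta = theta exactly.

def inner_solve(theta, loss_grad=0.0, eps=0.0):
    """argmin_x 0.5*(x - theta)**2 + eps * loss_grad * x, in closed form."""
    return theta - eps * loss_grad

def dlm_gradient(theta, eps=1e-3):
    x_star = inner_solve(theta)                       # unperturbed solution
    g = x_star                                        # dL/dx at x_star for L(x)=0.5*x**2
    x_eps = inner_solve(theta, loss_grad=g, eps=eps)  # perturbed re-solve
    # d/dtheta of the inner objective f(x, theta) = 0.5*(x - theta)**2 is -(x - theta);
    # the DLM estimate differences this at the two solutions and rescales by 1/eps.
    def dtheta_f(x):
        return -(x - theta)
    return (dtheta_f(x_eps) - dtheta_f(x_star)) / eps

print(dlm_gradient(2.0))  # matches the exact gradient, theta = 2.0
```

For this quadratic with a linearized perturbation the estimator is exact for any eps, which makes it a convenient sanity check; in general the estimate carries an O(eps) bias.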

How Has This Been Tested

Tested in theseus/optimizer/nonlinear/tests/test_backwards.py; also ran the following examples:

  • examples/backward_mode.py
  • examples/tactile_pose_estimation.py

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Checklist

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have read the CONTRIBUTING document.
  • I have completed my CLA (see CONTRIBUTING).
  • I have added tests to cover my changes.
  • All new and existing tests passed.

@rtqichen rtqichen requested review from bamos, luisenp and mhmukadam April 13, 2022 19:46
@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Apr 13, 2022
@mhmukadam mhmukadam added enhancement New feature or request experiments Experiments using latest features labels Apr 13, 2022
theseus/theseus_layer.py — 4 review threads (outdated, resolved)
@mhmukadam (Contributor) left a comment:

Looks good so far. After adding the changes we discussed and unit tests, you can try the tactile example, which we are using for the backward experiments.

@mhmukadam (Contributor)

Also, CI fails at the linting step. Make sure you are using the git pre-commit hooks.

@luisenp (Contributor) left a comment:

This is great! Left some comments/questions.

theseus/theseus_layer.py — 6 review threads (resolved)
@mhmukadam (Contributor) left a comment:

Looks great! Thanks @rtqichen for adding DLM.

@mhmukadam mhmukadam merged commit 9e11ced into main Jun 7, 2022
@mhmukadam mhmukadam deleted the rtqichen.dlm_gradients branch June 7, 2022 20:09
suddhu pushed a commit to suddhu/theseus that referenced this pull request Jan 21, 2023
* DLM gradients hacky example

* implement DLM using autograd.Function

* make soln a bit more accurate

* minor; removed unnecessary code

* lower case for dlm_epsilon

* backward test for DLM

* fix imports

* rename and make linter happy

* filter for tensors that require grad

* Construct the bwd objective only once

* minor

* remove print statements

* Fix DLM when using gpu; cost function shape; and handle case when no differentiable tensor

* Fix memory leak by removing dict input_data from input arguments

* preserve ordering

* Expand batch dim if possible

* undo

* use lower case

* reduce a bit of python overhead

* explicit one step
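The "implement DLM using autograd.Function" commit above can be pictured with a tiny custom Function on the same kind of 1-D toy problem. This is a hypothetical sketch only (the class name, the fixed epsilon, and the closed-form inner solves are assumptions, not the actual theseus code); it shows where the perturbed re-solve slots into backward().

```python
import torch

class DLMToy(torch.autograd.Function):
    """Sketch of wiring a DLM estimate into autograd (illustrative only).

    Inner problem: x*(theta) = argmin_x 0.5*(x - theta)**2, solved in
    closed form; the real implementation re-runs a nonlinear optimizer.
    """

    eps = 1e-3  # DLM perturbation size (arbitrary choice for the sketch)

    @staticmethod
    def forward(ctx, theta):
        # Closed-form inner solve: x* = theta (detached from the graph).
        x_star = theta.detach().clone()
        ctx.save_for_backward(theta, x_star)
        return x_star

    @staticmethod
    def backward(ctx, grad_output):
        theta, x_star = ctx.saved_tensors
        eps = DLMToy.eps
        # Perturbed solve: argmin_x 0.5*(x - theta)**2 + eps*<grad_output, x>
        x_eps = theta.detach() - eps * grad_output
        # d/dtheta of the inner objective is -(x - theta); the DLM gradient
        # differences it at the two solutions and rescales by 1/eps.
        return (-(x_eps - theta.detach()) - (-(x_star - theta.detach()))) / eps

theta = torch.tensor([2.0], requires_grad=True)
x = DLMToy.apply(theta)
loss = 0.5 * (x ** 2).sum()
loss.backward()
print(theta.grad)  # close to 2.0, the exact gradient for this toy problem
```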
5 participants