update readme and bump version (#309)
mhmukadam authored Sep 28, 2022
1 parent b1ebc8e commit 0fa526b
Showing 5 changed files with 12 additions and 10 deletions.
6 changes: 3 additions & 3 deletions CITATION.cff
Original file line number Diff line number Diff line change
@@ -12,7 +12,7 @@ authors:
- family-names: "Sodhi"
given-names: "Paloma"
- family-names: "Chen"
-   given-names: "Ricky"
+   given-names: "Ricky T. Q."
- family-names: "Ortiz"
given-names: "Joseph"
- family-names: "DeTone"
@@ -31,7 +31,7 @@ title: "Theseus: A Library for Differentiable Nonlinear Optimization"
url: "https://github.com/facebookresearch/theseus"
preferred-citation:
type: article
-   journal: arXiv preprint arXiv:2207.09442
+   journal: Advances in Neural Information Processing Systems
title: "Theseus: A Library for Differentiable Nonlinear Optimization"
url: "https://arxiv.org/abs/2207.09442"
year: 2022
@@ -47,7 +47,7 @@ preferred-citation:
- family-names: "Sodhi"
given-names: "Paloma"
- family-names: "Chen"
-   given-names: "Ricky"
+   given-names: "Ricky T. Q."
- family-names: "Ortiz"
given-names: "Joseph"
- family-names: "DeTone"
12 changes: 7 additions & 5 deletions README.md
@@ -63,7 +63,7 @@ Our implementation provides an easy to use interface to build custom optimizatio
- Gauss-Newton, Levenberg–Marquardt
- [Linear solvers](https://github.com/facebookresearch/theseus/tree/main/theseus/optimizer/linear)
- Dense: Cholesky, LU; Sparse: CHOLMOD, LU
- - [Commonly used costs](https://github.com/facebookresearch/theseus/tree/main/theseus/embodied), [AutoDiffCostFunction](https://github.com/facebookresearch/theseus/blob/main/theseus/core/cost_function.py)
+ - [Commonly used costs](https://github.com/facebookresearch/theseus/tree/main/theseus/embodied), [AutoDiffCostFunction](https://github.com/facebookresearch/theseus/blob/main/theseus/core/cost_function.py), [RobustCostFunction](https://github.com/facebookresearch/theseus/blob/main/theseus/core/robust_cost_function.py)
- [Lie groups](https://github.com/facebookresearch/theseus/tree/main/theseus/geometry)
- [Robot kinematics](https://github.com/facebookresearch/theseus/blob/main/theseus/embodied/kinematics/kinematics_model.py)
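The RobustCostFunction link added in this hunk wraps an ordinary cost with a robust loss. As a rough illustration of the underlying idea only (plain PyTorch, not the Theseus API; the function name and `delta` value are made up for this sketch), a Huber loss keeps small residuals quadratic while capping the influence of outliers:

```python
import torch

# Sketch of a robust (Huber) loss: quadratic for |r| <= delta, linear beyond,
# so outlier residuals contribute bounded gradients. Illustration of the
# concept only, not the Theseus RobustCostFunction implementation.
def huber(residual: torch.Tensor, delta: float = 1.0) -> torch.Tensor:
    abs_r = residual.abs()
    quadratic = 0.5 * residual ** 2
    linear = delta * (abs_r - 0.5 * delta)
    return torch.where(abs_r <= delta, quadratic, linear)

inlier = huber(torch.tensor(0.5))   # quadratic branch: 0.5 * 0.25 = 0.125
outlier = huber(torch.tensor(3.0))  # linear branch: 1.0 * (3.0 - 0.5) = 2.5
```

Note how the outlier's cost grows only linearly, which is what keeps a single bad residual from dominating a nonlinear least-squares solve.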

@@ -72,7 +72,8 @@ We support several features that improve computation times and memory consumptio
- [Sparse linear solvers](https://github.com/facebookresearch/theseus/tree/main/theseus/optimizer/linear)
- Batching and GPU acceleration
- [Automatic vectorization](https://github.com/facebookresearch/theseus/blob/main/theseus/core/vectorizer.py)
- - [Backward modes](https://github.com/facebookresearch/theseus/blob/main/theseus/optimizer/nonlinear/nonlinear_optimizer.py): Implicit, Truncated, Direct Loss Minimization ([DLM](https://github.com/facebookresearch/theseus/blob/main/theseus/theseus_layer.py)), Sampling ([LEO](https://github.com/facebookresearch/theseus/blob/main/examples/state_estimation_2d.py))
+ - [Backward modes](https://github.com/facebookresearch/theseus/blob/main/theseus/optimizer/nonlinear/nonlinear_optimizer.py)
+   - Implicit, Truncated, Direct Loss Minimization ([DLM](https://github.com/facebookresearch/theseus/blob/main/theseus/theseus_layer.py)), Sampling ([LEO](https://github.com/facebookresearch/theseus/blob/main/examples/state_estimation_2d.py))
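The backward modes listed in this hunk determine how gradients flow through the inner optimizer. As a rough sketch of the implicit mode's core idea in plain PyTorch (this is not the Theseus implementation; the toy objective and step counts are made up), the implicit function theorem gives dx*/dφ = -(∂²f/∂x²)⁻¹ ∂²f/∂x∂φ at the inner optimum, with no need to backpropagate through the solver iterations:

```python
import torch

# Toy inner objective f(x, phi) = 0.5*(x - phi)^2 + 0.05*x^2.
# Closed-form minimizer x* = phi / 1.1, so dx*/dphi = 1/1.1 exactly,
# which lets us check the implicit gradient below.
def inner_obj(x, phi):
    return 0.5 * (x - phi) ** 2 + 0.05 * x ** 2

phi = torch.tensor(2.0, requires_grad=True)

# Inner solve by a few Newton steps, detached from phi (no unrolled graph).
x = torch.tensor(0.0)
for _ in range(5):
    xv = x.detach().requires_grad_(True)
    (g,) = torch.autograd.grad(inner_obj(xv, phi.detach()), xv, create_graph=True)
    (h,) = torch.autograd.grad(g, xv)
    x = (xv - g / h).detach()

# Implicit backward: dx*/dphi = -(d2f/dx2)^-1 * (d2f/dx dphi) at the optimum.
xv = x.requires_grad_(True)
(g,) = torch.autograd.grad(inner_obj(xv, phi), xv, create_graph=True)
h, cross = torch.autograd.grad(g, [xv, phi])
dx_dphi = -cross / h
```

The unrolled and truncated modes instead differentiate through (all or some of) the solver iterations, trading memory for not needing the solution to be an exact stationary point.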


## Getting Started
@@ -86,6 +87,7 @@ We support several features that improve computation times and memory consumptio
- `conda install -c conda-forge suitesparse` (Mac).

### Installing

#### **pypi**
```bash
pip install theseus-ai
@@ -137,7 +139,7 @@ objective.add(cost_function)
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

phi = torch.nn.Parameter(x_true + 0.1 * torch.ones_like(x_true))
- outer_optimizer = torch.optim.RMSprop([phi], lr=0.001)
+ outer_optimizer = torch.optim.Adam([phi], lr=0.001)
for epoch in range(10):
solution, info = layer.forward(
input_tensors={"x": phi.clone(), "v": torch.ones(1, 1)},
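This hunk switches the README example's outer optimizer from RMSprop to Adam. Below is a minimal PyTorch-only sketch of the same bilevel pattern, assuming a toy curve-fitting problem like the README's `y - v * exp(x)` residual; the hand-rolled Gauss-Newton loop stands in for the TheseusLayer forward pass, and the data, learning rate, and step counts are illustrative, not values from the library:

```python
import torch

# Toy data: y = exp(x_true * t), mirroring the README's exponential residual.
t = torch.linspace(0.0, 1.0, 20)
x_true = torch.tensor(0.5)
y = torch.exp(x_true * t)

# "Inner solve": fit x by unrolled Gauss-Newton steps (a stand-in for the
# TheseusLayer forward pass; autograd differentiates through the unrolling).
def inner_solve(x0, steps=10):
    x = x0
    for _ in range(steps):
        r = y - torch.exp(x * t)               # residual vector
        J = -t * torch.exp(x * t)              # d(residual)/dx
        x = x - (J * r).sum() / (J * J).sum()  # Gauss-Newton update
    return x

# Outer loop: learn the initialization phi with Adam, as in the updated README.
phi = torch.nn.Parameter(x_true + 0.1)
outer_optimizer = torch.optim.Adam([phi], lr=0.001)
for epoch in range(10):
    solution = inner_solve(phi)
    loss = (solution - x_true) ** 2
    outer_optimizer.zero_grad()
    loss.backward()
    outer_optimizer.step()
```

The structure is the point: an inner nonlinear least-squares solve nested inside an ordinary PyTorch training loop, with the outer optimizer updating parameters that feed the inner problem.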
@@ -157,8 +159,8 @@ If you use Theseus in your work, please cite the [paper](https://arxiv.org/abs/2
```bibtex
@article{pineda2022theseus,
title = {{Theseus: A Library for Differentiable Nonlinear Optimization}},
-   author = {Luis Pineda and Taosha Fan and Maurizio Monge and Shobha Venkataraman and Paloma Sodhi and Ricky Chen and Joseph Ortiz and Daniel DeTone and Austin Wang and Stuart Anderson and Jing Dong and Brandon Amos and Mustafa Mukadam},
-   journal = {arXiv preprint arXiv:2207.09442},
+   author = {Luis Pineda and Taosha Fan and Maurizio Monge and Shobha Venkataraman and Paloma Sodhi and Ricky TQ Chen and Joseph Ortiz and Daniel DeTone and Austin Wang and Stuart Anderson and Jing Dong and Brandon Amos and Mustafa Mukadam},
+   journal = {Advances in Neural Information Processing Systems},
year = {2022}
}
```
Binary file modified docs/source/img/theseuslayer.png
2 changes: 1 addition & 1 deletion examples/simple_example.py
@@ -45,7 +45,7 @@ def error_fn(optim_vars, aux_vars):  # returns y - v * exp(x)
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

phi = torch.nn.Parameter(x_true + 0.1 * torch.ones_like(x_true))
- outer_optimizer = torch.optim.RMSprop([phi], lr=0.001)
+ outer_optimizer = torch.optim.Adam([phi], lr=0.001)
for epoch in range(20):
solution, info = layer.forward(
input_tensors={"x": phi.clone(), "v": torch.ones(1, 1)},
2 changes: 1 addition & 1 deletion theseus/__init__.py
@@ -3,7 +3,7 @@
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.

- __version__ = "0.1.0"
+ __version__ = "0.1.1"

from .core import (
CostFunction,
