README and fixes
methylDragon committed Aug 6, 2018
1 parent 60adb95 commit 1350ade
Showing 10 changed files with 60 additions and 20 deletions.
8 changes: 4 additions & 4 deletions .gitignore
@@ -1,4 +1,4 @@
.swp
.swap
.save
.bak
*.swp
*.swap
*.save
*.bak
@@ -66,7 +66,7 @@ In other words, the odometry data (gathered from sensors), and pose estimates (f

[go to top](#top)

There are two kinds of pose estimates, one for the robot's local position (which is continuous drifts over time), and one of the robot's estimated position globally (which is discontinuous but more accurate in the long run).
There are two kinds of pose estimates, one for the robot's local position (which is continuous and drifts over time), and one for the robot's estimated global position (which is discontinuous but more accurate in the long run).

And these pose estimates affect **different** transforms between the three coordinate frames of the map, odom and base_link frames.
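The way these two transforms chain together can be sketched with plain homogeneous matrices (a toy numpy sketch with made-up numbers, not code from the tutorial): composing the discontinuous map -> odom correction with the continuous odom -> base_link estimate yields the robot's pose in the map frame.

```python
import numpy as np

def make_tf(x, y, yaw):
    """Build a 2D homogeneous transform (3x3) from a translation and a yaw."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

# Hypothetical values: the local (odom) estimate drifts continuously, while
# the global (map) estimate corrects it via the map -> odom transform.
odom_to_base = make_tf(2.0, 1.0, 0.0)   # continuous local pose estimate
map_to_odom  = make_tf(0.5, -0.3, 0.0)  # discontinuous global correction

# Chaining the two transforms gives the robot's pose in the map frame:
map_to_base = map_to_odom @ odom_to_base
print(map_to_base[0, 2], map_to_base[1, 2])  # -> 2.5 0.7
```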

25 changes: 22 additions & 3 deletions 02 - Global Pose Estimate Fusion (Example Implementation).md
@@ -755,7 +755,7 @@ Remember this? Change x, y, z, and yaw to fit your purposes.



### **2.7 Configure AMCL <a name="2.7></a>**
### **2.7 Configure AMCL <a name="2.7"></a>**

[go to top](#top)

@@ -963,7 +963,7 @@ initial_estimate_covariance: [1e-4, 0, 0, 0, 0, 0, 0, 0, 0,


### 2.9 Tune the Covariances <a name="2.9></a>
### 2.9 Tune the Covariances <a name="2.9"></a>

[go to top](#top)

@@ -1047,7 +1047,7 @@ pose_covariance: [0.1404, 0, 0, 0, 0, 0,
### 2.10 Validate the Sensor Fusion <a name="2.10></a>
### 2.10 Validate the Sensor Fusion <a name="2.10"></a>
[go to top](#top)
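As an aside on the format: `pose_covariance` and its relatives are 6x6 matrices (x, y, z, roll, pitch, yaw) flattened row-major into 36 numbers. A quick sketch of assembling one from per-axis variances (illustrative numbers, not tuned values from this tutorial):

```python
import numpy as np

# Assumed per-axis variances for (x, y, z, roll, pitch, yaw) -- placeholders.
variances = [0.1404, 0.1404, 1e-3, 1e-3, 1e-3, 0.05]

# Diagonal 6x6 covariance: off-diagonal zeros mean "no cross-correlation".
pose_covariance = np.diag(variances)

# Flatten row-major into the 36-element list the parameter file expects.
flat = pose_covariance.flatten().tolist()
print(len(flat))  # -> 36
```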
@@ -1082,6 +1082,25 @@ If it works really well, then congratulations! We're good to build **more marvel



#### **What you should see:**

1. When you run `$ rosrun tf view_frames` **you should get the following frame diagram** (or something close to it)
![TF Frames](assets/2_30.png)

2. **The following nodes should have been added** (check using `$ rosrun rqt_graph rqt_graph`)

- hedge_msg_adapter
- hedge_pose
- amcl_pose
- ekf_localization_map


![rqt_graph](assets/2_31.png)

![Detailed view](assets/2_32.png)





CH3EERS!
9 changes: 0 additions & 9 deletions HOW TO USE THIS TUTORIAL

This file was deleted.

34 changes: 32 additions & 2 deletions README.md
@@ -1,2 +1,32 @@
# ros-sensor-fusion-tutorial
An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization!
# Sensor Fusion in ROS
[![Click to watch video!](assets/youtube_thumbnail.png)](https://youtu.be/5vZOvISwT94)

An in-depth step-by-step tutorial for implementing sensor fusion with extended Kalman filter nodes from robot_localization! Basic concepts like covariance and Kalman filters are explained here!

This tutorial is especially useful because there hasn't been a full end-to-end implementation tutorial for sensor fusion with the robot_localization package yet.
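Since the tutorial leans on Kalman filter ideas, here's the core intuition in one toy function (a 1D sketch for illustration only, not the robot_localization EKF): each source is weighted by its variance, so the less certain a source is, the less it moves the fused estimate.

```python
def kalman_update(est, est_var, meas, meas_var):
    """Fuse an estimate with a measurement; lower variance = more trust."""
    k = est_var / (est_var + meas_var)   # Kalman gain
    fused = est + k * (meas - est)       # pull estimate toward measurement
    fused_var = (1 - k) * est_var        # fused uncertainty shrinks
    return fused, fused_var

# Made-up numbers: a confident odometry estimate fused with a noisier fix.
pos, var = kalman_update(est=10.0, est_var=1.0, meas=12.0, meas_var=3.0)
print(pos, var)  # -> 10.5 0.75
```

Note that the fused variance (0.75) is smaller than either input's: fusing sensors reduces uncertainty.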

You can find the implementation in the Example Implementation folder!



### Why fuse sensor data

Individual navigation stack components in a robot application can often fail on their own, but fused together they form a far more robust whole.

One way to do this is with the extended Kalman filter from the [robot_localization](http://wiki.ros.org/robot_localization) package. The package features a relatively simple ROS interface to help you fuse and configure your sensors, so that's what we'll be using!
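In practice, fusing sources with robot_localization comes down to a parameter file for its EKF node. A minimal sketch of one (the parameter names are real robot_localization parameters, but the topic names and values are placeholders, not this tutorial's actual configuration):

```yaml
# Sketch of an ekf_localization_node parameter file -- values illustrative.
frequency: 30
two_d_mode: true

map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom        # use map here for the global (map -> odom) EKF instance

odom0: /odometry/wheel   # assumed topic name -- substitute your own sensor
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
```

The boolean matrix picks which state variables each sensor contributes; the tutorial walks through the real configuration step by step.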



### How to use this tutorial

1. Make sure you're caught up on [ROS](https://github.com/methylDragon/coding-notes/tree/master/Robot%20Operating%20System%20(ROS)/ROS)
2. It'll be good to read the [Marvelmind Indoor 'GPS' beacon tutorial](https://github.com/methylDragon/marvelmind-indoor-gps-tutorial) alongside this if you want to understand the example implementation
3. Likewise for the [Linorobot stack](https://linorobot.org)
4. And [AMCL](http://wiki.ros.org/amcl)
5. Then go ahead and follow the tutorial in order!



------

[![Yeah! Buy the DRAGON a COFFEE!](assets/COFFEE%20BUTTON%20%E3%83%BE(%C2%B0%E2%88%87%C2%B0%5E).png)](https://www.buymeacoffee.com/methylDragon)
Binary file added assets/2_30.png
Binary file added assets/2_31.png
Binary file added assets/2_32.png
Binary file added assets/youtube_thumbnail.png
