
Commit 1350ade: README and fixes
1 parent 60adb95

10 files changed: +60 -20 lines

Diff for: .gitignore

+4-4
@@ -1,4 +1,4 @@
-.swp
-.swap
-.save
-.bak
+*.swp
+*.swap
+*.save
+*.bak

Diff for: 01- ROS and Sensor Fusion Tutorial.md renamed to 01 - ROS and Sensor Fusion Tutorial.md

+1-1
@@ -66,7 +66,7 @@ In other words, the odometry data (gathered from sensors), and pose estimates (f
 [go to top](#top)

-There are two kinds of pose estimates, one for the robot's local position (which is continuous drifts over time), and one of the robot's estimated position globally (which is discontinuous but more accurate in the long run).
+There are two kinds of pose estimates, one for the robot's local position (which is continuous and drifts over time), and one of the robot's estimated position globally (which is discontinuous but more accurate in the long run).

 And these pose estimates affect **different** transforms between the three coordinate frames of the map, odom and base_link frames.
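For context on this change: the local estimate drives the odom -> base_link transform, while the global estimate drives map -> odom. Below is a minimal sketch (not part of this commit) that watches both transforms with the ROS 1 `tf` API; the node name `frame_inspector` is made up for illustration.

```python
#!/usr/bin/env python
# Minimal sketch: print the two transforms described above.
# Assumes a running ROS 1 setup where a local estimator publishes odom -> base_link
# and a global estimator (e.g. an EKF fused with AMCL) publishes map -> odom.
import rospy
import tf

rospy.init_node('frame_inspector')  # hypothetical node name
listener = tf.TransformListener()
rate = rospy.Rate(1.0)

while not rospy.is_shutdown():
    try:
        # Local estimate: odom -> base_link (continuous, but drifts over time)
        local_t, _ = listener.lookupTransform('odom', 'base_link', rospy.Time(0))
        # Global correction: map -> odom (can jump, but stays accurate in the long run)
        corr_t, _ = listener.lookupTransform('map', 'odom', rospy.Time(0))
        rospy.loginfo("odom->base_link: %s | map->odom: %s" % (str(local_t), str(corr_t)))
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass  # one of the transforms is not being published yet
    rate.sleep()
```

If the global estimator is not running, the map -> odom lookup keeps failing, which is a quick way to tell which half of the setup is missing.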

Diff for: 02 - Global Pose Estimate Fusion (Example Implementation).md

+22-3
@@ -755,7 +755,7 @@ Remember this? Change x, y, z, and yaw to fit your purposes.
-### **2.7 Configure AMCL <a name="2.7></a>**
+### **2.7 Configure AMCL <a name="2.7"></a>**

 [go to top](#top)

@@ -963,7 +963,7 @@ initial_estimate_covariance: [1e-4, 0, 0, 0, 0, 0, 0, 0, 0,
-### 2.9 Tune the Covariances <a name="2.9></a>
+### 2.9 Tune the Covariances <a name="2.9"></a>

 [go to top](#top)

@@ -1047,7 +1047,7 @@ pose_covariance: [0.1404, 0, 0, 0, 0, 0,
-### 2.10 Validate the Sensor Fusion <a name="2.10></a>
+### 2.10 Validate the Sensor Fusion <a name="2.10"></a>

 [go to top](#top)
@@ -1082,6 +1082,25 @@ If it works really well, then congratulations! We're good to build **more marvel
+#### **What you should see:**
+
+1. When you run `$ rosrun tf view_frames` **you should get the following frame diagram** (or something close to it)
+
+   ![TF Frames](assets/2_30.png)
+
+2. **The following nodes should have been added** (check using `$rosrun rqt_graph rqt_graph`)
+
+   - hedge_msg_adapter
+   - hedge_pose
+   - amcl_pose
+   - ekf_localization_map
+
+![rqt_graph](assets/2_31.png)
+
+![Detailed view](assets/2_32.png)

CH3EERS!
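
Beyond the frame diagram and node graph checks above, a quick numeric sanity check is to log AMCL's pose next to the fused map-frame output. This sketch is not part of this commit: it assumes amcl's default `amcl_pose` topic and a map-frame EKF output remapped to `odometry/filtered_map` (robot_localization publishes `odometry/filtered` by default), so adjust the topic names to match your launch files.

```python
#!/usr/bin/env python
# Minimal sketch: log AMCL's pose next to the fused map-frame EKF output.
# 'odometry/filtered_map' is an assumed remap of robot_localization's default
# 'odometry/filtered' topic; change both topic names to match your setup.
import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped
from nav_msgs.msg import Odometry

def on_amcl(msg):
    p = msg.pose.pose.position
    rospy.loginfo("amcl_pose:       x=%.2f y=%.2f" % (p.x, p.y))

def on_fused(msg):
    p = msg.pose.pose.position
    rospy.loginfo("fused (map EKF): x=%.2f y=%.2f" % (p.x, p.y))

rospy.init_node('fusion_sanity_check')  # hypothetical node name
rospy.Subscriber('amcl_pose', PoseWithCovarianceStamped, on_amcl)
rospy.Subscriber('odometry/filtered_map', Odometry, on_fused)
rospy.spin()
```

When localization is healthy, the two should track each other closely; a large, persistent offset usually points back at the covariance tuning in section 2.9.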

Diff for: HOW TO USE THIS TUTORIAL

-9
This file was deleted.

Diff for: README.md

+32-2
@@ -1,2 +1,32 @@
-# ros-sensor-fusion-tutorial
-An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization!
+# Sensor Fusion in ROS
+[![Click to watch video!](assets/youtube_thumbnail.png)](https://youtu.be/5vZOvISwT94)
+
+An in-depth step-by-step tutorial for implementing sensor fusion with extended Kalman filter nodes from robot_localization! Basic concepts like covariance and Kalman filters are explained here!
+
+This tutorial is especially useful because there hasn't been a full end-to-end implementation tutorial for sensor fusion with the robot_localization package yet.
+
+You can find the implementation in the Example Implementation folder!
+
+### Why fuse sensor data
+
+Individually, the navigation stack components in a robot application can fail fairly often, but fused together they form a much more robust whole.
+
+One way to do this is with the extended Kalman filter from the [robot_localization](http://wiki.ros.org/robot_localization) package. The package features a relatively simple ROS interface to help you fuse and configure your sensors, so that's what we'll be using!
+
+### How to use this tutorial
+
+1. Make sure you're caught up on [ROS](https://github.com/methylDragon/coding-notes/tree/master/Robot%20Operating%20System%20(ROS)/ROS)
+2. It'll be good to read the [Marvelmind Indoor 'GPS' beacon tutorial](https://github.com/methylDragon/marvelmind-indoor-gps-tutorial) alongside this if you want to understand the example implementation
+3. Likewise for the [Linorobot stack](https://linorobot.org)
+4. And [AMCL](http://wiki.ros.org/amcl)
+5. Then go ahead and follow the tutorial in order!
+
+------
+
+[![Yeah! Buy the DRAGON a COFFEE!](assets/COFFEE%20BUTTON%20%E3%83%BE(%C2%B0%E2%88%87%C2%B0%5E).png)](https://www.buymeacoffee.com/methylDragon)

Diff for: assets/2_30.png

107 KB

Diff for: assets/2_31.png

230 KB

Diff for: assets/2_32.png

206 KB

Diff for: assets/youtube_thumbnail.png

337 KB
