Robotics
Teacher:
Dr. Syed Riaz un Nabi Jafri
Moving Robots
• A robot that has the capability to change its location is known as a moving robot
• A variety of moving robots is available
Moving Robots
Structure
• Different kinds of structures are in use, according to demand
• This presentation covers the Differential Drive Rover, which is the most common structure for indoor/outdoor environments
Structure
[Figure: robot structure labeled with the main body, joints, controller, actuators, processor, and sensors]
Moving Robot Components
• Rover: the main body of the robot
• Tx/Rx: communication modules
• Joints: provide connectivity between different parts
• Actuators: agents responsible for providing the force that moves the different parts
• Sensors: agents responsible for sensing the rate of motion of the robot itself, or nearby objects, in further detail
• End effector: the part at the last joint (hand/gripper)
• Controller: a numerically supported agent responsible for controlling all motions of the robot
• Processor (Brain): a numerically supported agent responsible for determining and estimating the behavior of the robot and its surroundings, and for generating commands to the different parts
• Software: the series of instructions provided to the Processor
• Base station: for data logging and control
Block Diagram
[Block diagram: the Motion Sensors and Perception Sensors feed the Processor; the Processor commands the Controller, which drives the Actuators; the Base Station communicates with the Processor]
Workspace
Could you please draw the workspace?
Movement Principle
Two actuators are connected, one on each wheel. A balancing (caster) wheel can be placed at the front or the back side.
The rotation rate of each wheel can be controlled through variable PWM signals.
Based on the respective PWM values, the robot can move in the forward, right, or left direction.
[Figure: differential drive robot, with the actuators labeled]
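As an illustration (not from the slides), here is a minimal Matlab sketch of the standard differential drive kinematics; the wheel radius r, wheel separation L, and wheel rates w_l and w_r are assumed example values that would in practice be set through the PWM signals:

% Differential drive kinematics (all numeric values assumed)
r   = 0.05;                    % wheel radius, m (assumed)
L   = 0.30;                    % distance between the wheels, m (assumed)
w_l = 2.0;                     % left wheel rate, rad/s (set via PWM)
w_r = 3.0;                     % right wheel rate, rad/s (set via PWM)
v     = r * (w_r + w_l) / 2;   % forward speed of the robot center
omega = r * (w_r - w_l) / L;   % turn rate; equal rates give straight motion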
Movement Principle
One actuator is connected to the front shaft for the two wheels. It is responsible for turning the vehicle right or left.
One actuator is connected to the back shaft for the two wheels. It is responsible for moving the vehicle forward or backward.
The rotation rate of each actuator can be controlled through variable PWM signals.
[Figure: car-like robot (Ackermann steering), with the actuators labeled]
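A hedged sketch of the corresponding car-like (Ackermann) kinematics, using the common bicycle-model approximation; the wheelbase and control values below are assumed, not taken from the slides:

% Bicycle-model approximation of Ackermann steering (values assumed)
v     = 1.0;                   % forward speed from the back actuator, m/s
delta = 10 * pi/180;           % steering angle from the front actuator, rad
L_wb  = 0.5;                   % wheelbase, m (assumed)
omega = v * tan(delta) / L_wb; % resulting turn rate of the vehicle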
Examples
Car-like robot (Ackermann steering)
Differential drive robot
Applications
• Indoor/outdoor cleaning
• Surveillance
• Industrial loading
• Inspection
• Other domestic uses
Degree Of Freedom (DOF)
Could you please determine the DOF?
Actuators
Actuators are referred to as the muscles of a robot
• Electric motors
Servo motors
Stepper motors
• Pneumatic actuators
• Hydraulic actuators
Actuators
[Figures: pneumatic actuator, DC servomotor, hydraulic actuator, stepper motor]
Actuator Selection
• Select the best actuator for the following:
Sensors
• Potentiometers
• Encoders
• LVDT (linear variable differential transformer)
• Resolvers
• GPS (Global positioning system)
• IMU (inertial measurement unit)
• Proximity Sensors
• Range Finders
• Object Identifiers
Sensors
[Figures: potentiometer, encoder, IMU, resolver, GPS]
Sensors
2D Laser range finder
2D ultrasonic range finder
Sensors
IR Sensor
Camera
Kinect (RGB-D)
Sensors Combination
• Position sensing (localization)
Indoor? (encoder, potentiometer, IMU, GPS)
Outdoor?
• Environment perception (mapping)
Indoor? (encoder, laser, IMU, camera)
Outdoor?
Processors (Brain)
A variety of processor solutions is available, depending on the requirement.
• Microprocessor based main boards
• Microcontroller based main boards
• FPGA based main boards
Processors (Brain)
[Figures: ARM board, Raspberry Pi, FPGA board, Tern board]
Reference Frame
Sensor Working Principle
[Figure: sensor frame with X_S and Y_S axes]
A 2D laser sensor consists of a laser LED and a photodiode/phototransistor. The LED emits light, the diode detects its reflection from an object, and the distance is determined using S = V*T.
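A minimal Matlab sketch of the S = V*T relation; the timing value below is assumed, and when T is instead measured as the round-trip (out-and-back) time, the result must be halved:

% Time-of-flight distance (values assumed)
V = 3e8;       % speed of light, m/s
T = 33.3e-9;   % one-way travel time of the pulse, s (assumed)
S = V * T      % ~10 m; for a measured round-trip time, use V*T/2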
Reference Frame
Object points in the Sensor Frame
The number of scan points can be larger, depending on the objects; a conversion sketch follows.
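A minimal Matlab sketch of how such scan points are obtained in the sensor frame from the raw (range, bearing) readings; the ranges and angles below are assumed example values:

% Convert laser readings (range, bearing) to points in the sensor frame
ranges = [2.0 2.1 2.3 2.6];      % measured ranges, m (assumed)
angles = [-0.2 -0.1 0.0 0.1];    % beam angles, rad (assumed)
xs = ranges .* cos(angles);      % X_S coordinates
ys = ranges .* sin(angles);      % Y_S coordinates
points_s = [xs; ys]              % 2xN scan points in the sensor frame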
Object features in the Sensor Frame
Object features in the Robot Frame
Object features in the World Frame
(Transformations)
Real Environment Features
Most of the scan points have been merged into line segments, but some points have not merged. This is a demerit of the feature-based approach.
Scan points conversion into features
(Split & Merge)
Features are easier to handle than raw scan points; a minimal sketch of the split step follows.
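A minimal Matlab sketch of the split step of Split & Merge, assuming the points are ordered along the scan; the merge step (joining collinear neighboring segments) is omitted for brevity. Save it as split_merge.m:

% split_merge.m: recursively split a scan into line segments.
% pts is a 2xN ordered point set; tol is the split distance threshold.
function segs = split_merge(pts, first, last, tol)
    p1 = pts(:, first);
    p2 = pts(:, last);
    % distance of each point to the chord joining the two end points
    d = abs((p2(1)-p1(1))*(p1(2)-pts(2,first:last)) - ...
            (p1(1)-pts(1,first:last))*(p2(2)-p1(2))) / norm(p2 - p1);
    [dmax, k] = max(d);
    if dmax > tol
        k = k + first - 1;                        % split at farthest point
        segs = [split_merge(pts, first, k, tol), ...
                split_merge(pts, k, last, tol)];
    else
        segs = [first, last];                     % keep as one segment
    end
end

% Example call: segs = split_merge(points_s, 1, size(points_s,2), 0.05)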
Transformations
[Figure: translation (dZ = 0) and rotation about the z-axis, shown in the world frame X_W, Y_W]
Transformations
Example: Consider the following wall W1, which has been sensed by the laser scanner. The robot center is placed at (5,5) with an orientation of 40 degrees. The scanner center has a displacement of 10 units from the robot center. Both the scanner and robot frames are aligned with each other.
[Figure: robot at (5,5) with the scanner offset by 10 units, sensing wall W1]
Transformations
The scan values of the wall are given: the wall end points lie at (4,6) and (7,2) in the sensor frame.
[Figure: wall W1 with end points (4,6) and (7,2) in the X_S frame]
Transformations
We need to perform the following steps:
• Translation (from the sensor frame to the robot frame)
• Rotation (robot frame to world frame)
• Translation (robot frame to world frame)
Pxyz = Trans(5,5,0) x Rot(Z,40 degrees) x Trans(10,0,0) x Ps
Transformations
Matlab script:
% Point in the sensor frame, homogeneous coordinates (Ps)
P_s = [4; 6; 0; 1];
% Trans(10,0,0): sensor frame -> robot frame
T2 = [1 0 0 10; 0 1 0 0; 0 0 1 0; 0 0 0 1];
% Rot(Z, 40 degrees)
Rot = [cos(40*pi/180) -sin(40*pi/180) 0 0;
       sin(40*pi/180)  cos(40*pi/180) 0 0;
       0 0 1 0;
       0 0 0 1];
% Trans(5,5,0): robot frame -> world frame
T1 = [1 0 0 5; 0 1 0 5; 0 0 1 0; 0 0 0 1];
% Point in the world frame (Pxyz)
P_xyz = T1 * Rot * T2 * P_s
Transformations
Exercise:
The robot center is placed at (5,0) with an orientation of 0 degrees. The sensor center has a displacement of 5 units from the robot center. Both the sensor and robot frames are aligned with each other. The robot's sensor detects a line end point at (5,0) in the sensor frame. Determine the end point coordinates in the world frame, and verify the result by plotting all the frames manually. A solution sketch follows.
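A possible solution sketch in Matlab, following the same pattern as the worked example above (the expected result is the point (15, 0, 0) in the world frame):

% Exercise solution sketch
P_s = [5; 0; 0; 1];                          % detected end point, sensor frame
T2  = [1 0 0 5; 0 1 0 0; 0 0 1 0; 0 0 0 1]; % Trans(5,0,0): sensor -> robot
Rot = eye(4);                                % Rot(Z, 0 degrees)
T1  = [1 0 0 5; 0 1 0 0; 0 0 1 0; 0 0 0 1]; % Trans(5,0,0): robot -> world
P_xyz = T1 * Rot * T2 * P_s                  % gives (15, 0, 0)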
Transformations
Exercise:
[Figure: exercise setup showing the world frame X_W, robot frame X_R, and sensor frame X_S, with offsets of 3, 3, and 1 units and a point at (1,1)]
Transformations
Line parameters conversion
[Figure: a line with global parameters r_g = 4, ϕ_g = 90 degrees; the robot frame X_R appears to be at (6, 0 degrees) in the world frame X_W]
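A hedged Matlab sketch of the standard conversion of line parameters (r, ϕ) from the world frame to the robot frame; the robot pose below is read off the figure as (6, 0) with 0 degrees orientation, which is an assumption:

% Line parameter conversion, world frame -> robot frame
xr = 6; yr = 0; th = 0*pi/180;   % robot pose in the world frame (assumed)
r_g   = 4;  phi_g = 90*pi/180;   % line parameters in the world frame
phi_r = phi_g - th;              % line normal direction in the robot frame
r_r   = r_g - (xr*cos(phi_g) + yr*sin(phi_g))   % perpendicular distance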
Ultrasonic Sensor Scans
[Figures: original scan and processed scan]
Environment Perception (Feature-based Map)
Features derived from raw scan points
Environment Perception (Grid-based Map)
Grid cells showing three states: occupied, unoccupied, and undetermined; a minimal sketch follows.
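A minimal Matlab sketch of such a grid map using three hard cell states; the grid size and the updated cells are assumed example values:

% Occupancy grid with three states: -1 undetermined, 0 free, 1 occupied
grid_map = -ones(20, 20);    % 20x20 grid, initially all undetermined
grid_map(5, 5:8) = 1;        % cells containing scan end points -> occupied
grid_map(5, 1:4) = 0;        % cells crossed by the laser beam -> free
% A real implementation traces every beam through the grid (for example
% with Bresenham's line algorithm) and updates occupancy probabilities.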
Localization
Finding the location and orientation of the robot within a known environment.
Localization can be local, when we compare two consecutive scans, or global, when we compare the current scan with a global map.
Localization
We compare two consecutive scans to determine the robot's new pose; a sketch of the alignment step follows the figures.
[Figures: first scan; second scan (in blue); world frame view]
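A hedged Matlab sketch of the core alignment step: given matched point pairs from the two scans, the rigid transform between them can be estimated in closed form with an SVD, as done inside ICP-style scan matching. The points below are synthetic, and the reflection check on det(R) is omitted for brevity:

% Rigid transform between two scans from matched point pairs
A  = [0 1 2; 0 0 1];                     % points in the first scan (assumed)
th = 10*pi/180;
Rt = [cos(th) -sin(th); sin(th) cos(th)];
B  = Rt*A + [0.5; 0.2];                  % the same points after the motion
ca = mean(A, 2);  cb = mean(B, 2);       % centroids of both point sets
[U, ~, V] = svd((A - ca) * (B - cb)');   % cross-covariance decomposition
R = V * U';                              % estimated rotation (recovers Rt)
t = cb - R * ca                          % estimated translation (0.5, 0.2)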
Localization
We compare the current scan with a given known map to determine the robot's new pose.
Simultaneous Localization and Mapping
(SLAM)
What is SLAM?
Estimate the pose of the robot and the map of the environment at the same time.
SLAM is hard because:
a map is needed for localization, and
a good pose estimate is needed for mapping.
Localization: inferring the location, given a map
Mapping: inferring a map, given locations
SLAM: learning a map and locating the robot simultaneously
Various algorithms are available, such as EKF SLAM and particle filter SLAM; a minimal sketch of the EKF prediction step follows.
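As a hedged illustration of the EKF flavor, a minimal Matlab sketch of the prediction (motion-update) step for the robot pose alone; all numeric values are assumed, and the correction step that fuses sensed features and updates map landmarks is omitted:

% EKF prediction step for the robot pose [x; y; theta] (values assumed)
x  = [5; 5; 40*pi/180];             % current pose estimate
P  = 0.01 * eye(3);                 % current pose covariance
v  = 1.0;  w = 0.1;  dt = 0.1;      % control inputs: speed, turn rate, step
F  = [1 0 -v*sin(x(3))*dt;          % Jacobian of the motion model
      0 1  v*cos(x(3))*dt;
      0 0  1];
x  = x + [v*cos(x(3))*dt; v*sin(x(3))*dt; w*dt];  % predicted pose
Q  = diag([0.01 0.01 0.005]);       % assumed process noise
P  = F * P * F' + Q                 % predicted covariance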
[Figures: SLAM examples across several slides]
Environment Perception (3D)
Single Camera as Range & Bearing Sensor
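A hedged Matlab sketch of the bearing part under a pinhole camera model; the intrinsics and the detected pixel below are assumed values. Range cannot come from a single image alone; it needs extra information such as a known object size or camera motion between frames:

% Bearing of an image feature from a pinhole camera model (values assumed)
f  = 500;                     % focal length, pixels (assumed)
cx = 320;                     % principal point x, pixels (assumed)
u  = 400;                     % detected feature column, pixels (assumed)
bearing = atan((u - cx) / f)  % angle from the optical axis, rad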
[Figures: 3D environment perception examples]
Useful links
• Books:
Introduction to Robotics: Analysis, Control, Applications (by Saeed B. Niku)
Robotics, Vision and Control (by Peter Corke)
Probabilistic Robotics (by Thrun et al.)
• Journals/Conferences:
IEEE Transactions on Robotics, Advanced Robotics, Journal of Field Robotics, IEEE ICRA, IEEE IROS, IEEE ROBIO
• Online courses:
MIT, Stanford, Coursera, edX, IIT India
• Platforms:
ROS, Webots, V-REP, OpenCV
• Groups:
robotics-worldwide mailing list (search for more specialized groups)