
ROBOTICS AND AUTOMATION

ME-465
Dr. Sara Ali
My Research Interests
 Human-Robot Interaction (HRI)
 Brain-Computer Interface
 Virtual Reality
 AI in healthcare

Wide areas: Mobile Robotics, Human-Robot Interaction, Cybernetics and Bionics, Humanoids.
My Research

Multi-Robot Interactive Therapy for Children with Autism Spectrum Disorder

System setup: measuring joint attention (JA), imitation, and EEG



Interesting videos

 https://www.youtube.com/watch?v=QdQL11uWWcI
 https://www.youtube.com/watch?v=8KRZX5KL4fA
 https://www.youtube.com/watch?v=Kw-gOmJwzuc




Current projects
 PhD: 1 student
 Multi-robot system for children with Autism Spectrum Disorder
 MS: 5 students
 Personality Prediction with Pepper
 Socio-Technical System for effective classroom learning
 An Unreal Engine Based Human Robot Interaction Framework
 Estimation of Cognitive Workload Using Virtual Reality
 Ball Interception in Humanoid Robot Soccer
 Perception of Emotion in Human Robot Interaction
 UG groups: 2
 Felix: Design of Quadruped robot
 Design of Agri robot for variable spray rate (Controller design + UGV)
Research Topics
 Masters: Effects of Font Style and Font Color in News Text on
User Cognitive Load in Intelligent User Interfaces
 UG: Intelligent human following cart.
 UG: Design and fabrication of 6 wheeled Agri-robot
 UG: Design of seed-sowing mechanism for Agri-bot
 UG: Design of auto-spraying assembly for Agri-robot
 Masters + UG: Vision-based yield estimation for orchards
 Masters + UG: Vision-based classification for orchards
 Masters + UG: Pest and weed estimation and spraying
Equipment Available
 Pepper robot
 Nao Robots
 VR headset (Oculus Quest 2)
 Human-following cart
 P3-ATs – mobile robots
 FMS robotic arm
 EEG headsets
 UGV platform
 Quadruped robot
 Prosthetic hand project
 Myo armband, Bumblebee camera, LIDARs, etc.
Administrative

• Brief Introduction
• A minimum attendance of 75% will be enforced.
• Office Location: 2nd floor, Room 317
• Office Hours: email at sarababer@smme.nust.edu.pk

Textbooks

 Introduction to Robotics by John J. Craig, 3rd Edition.


 Robotics, Vision and Control by Peter Corke, 2nd Edition.



Pre-requisites
 Linear Algebra
 Probability
 Dynamics
 Basic Control theory
 Programming in MATLAB and object-oriented concepts
Essential Logistics
 At least MATLAB 2015, preferably MATLAB 2016b.
 Download and install the Robotics Toolbox (RT) by Peter Corke, available at:
http://petercorke.com/wordpress/toolboxes/robotics-toolbox#Downloading_the_Toolbox
 The RT manual can be downloaded at:
http://petercorke.com/wordpress/?ddownload=343
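
As a quick sanity check after installation (a minimal sketch; rot2 is a toolbox function used later in these slides, and 0.3 rad is an arbitrary test angle):

% Verify the Robotics Toolbox is on the MATLAB path:
% rot2 should return a 2x2 rotation matrix for the given angle (in radians).
R = rot2(0.3);
disp(R)          % expect [cos(0.3) -sin(0.3); sin(0.3) cos(0.3)]
disp(det(R))     % determinant of a proper rotation matrix is +1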
Course Content
 Representing Position and Orientation
 Time and Motion
 Robot Arm Kinematics
 Velocity Relationships
 Dynamics and Control
Brief History
 Term coined in a 1921 Czech science-fiction play, "Rossum's Universal Robots", by Karel Capek.
 In Czech, the word "robot" derives from "robota", meaning forced labour or serfdom.
 The first patent for what we would consider a robot was filed in 1954 by George C. Devol and issued in 1961.
 The first robotics company, Unimation, was founded in 1956 by George C. Devol and Joseph Engelberger.
Who is Who?

George C. Devol: inducted into the National Inventors Hall of Fame in 2011.
Joseph F. Engelberger: known as the "Father of Robotics".
Brief History
 First Industrial Robot installed in 1961

(a) A plan view of the machine from Devol's patent; (b) the first Unimation robot working at a General Motors factory. (Courtesy of George C. Devol)
Brief History

Descendants of Unimate:
First-generation robots, sub-categorized as manufacturing robots.

What are first-generation robots?

A modern six-axis robot from ABB for factory automation (image courtesy ABB Robotics).
Brief History

Descendants of Unimate:

Baxter, a two-armed robot with built-in vision capability (image courtesy Rethink Robotics).
Brief History – Mobile Robots

Small autonomous underwater vehicle (Todd Walsh © 2013 MBARI); self-driving car (image courtesy Dept. of Information Engineering, Oxford Univ.); Honda's Asimo humanoid robot (image courtesy Honda Motor Co., Japan); Mars Science Laboratory rover Curiosity, self-portrait taken at "John Klein"; Cheetah legged robot (image courtesy Boston Dynamics); Savioke Relay delivery robot (image courtesy Savioke).
Brief History – Mobile Robots

Global Hawk unmanned aerial vehicle; autonomous underwater vehicle (photo courtesy of NASA).
Alternative Taxonomy
 Based on the robot's function
 Manufacturing Robots
◼ Function: operate in factories; the technological descendants of the first-generation robots.
◼ Applications: factory automation, packaging, electronic assembly, welding, painting, etc.
 Service Robots
◼ Function: supply services to people
◼ Applications: medical rehabilitation, fetching, carrying, cleaning, personal assistance, etc.
 Field Robots
◼ Function: work on outdoor tasks
◼ Applications: environmental monitoring, agriculture, mining, construction, forestry, driverless cars, etc.
Alternative Taxonomy Cont.
 Humanoid Robots
◼ have the physical form of a human being – they are both mobile robots and service robots
 Telerobots
◼ are robot-like machines that are remotely controlled by a human operator, e.g. the da Vinci surgical robot or a Mars rover.

da Vinci Surgical System (courtesy: Intuitive Surgical); Mars Rover (courtesy: NASA)
What is a Robot?
"a goal-oriented machine that can sense, plan and act" (SPA)
 A robot senses its environment and uses that information, together with a goal, to plan some action.
 After acting, the whole sense-plan-act cycle is repeated.

https://www.youtube.com/watch?v=tUs2nyDcls8
Key terms: Sense, plan and act
 Sensing
 Proprioceptive sensors measure the state of the robot itself: the
angle of the joints on a robot arm, the number of wheel
revolutions on a mobile robot or the current drawn by an electric
motor.
 Exteroceptive sensors measure the state of the world with respect
to the robot. The sensor might be a simple bump sensor on a
robot vacuum cleaner to detect collision. It might be a GPS
receiver that measures distances to an orbiting satellite
constellation, or a compass that measures the direction of the
Earth’s magnetic field vector relative to the robot. It might also be
an active sensor that emits acoustic, optical or radio pulses in
order to measure the distance to points in the world based on the
time taken for a reflection to return to the sensor.
 What about cameras?
Ethical Issues

"Robots taking jobs from people"

"Whom to blame if a robotic car crashes into a human or gets into an accident"

"Human vs. robotic surgery"

"Should we use robots to look after our children, and even teach them?"
Representing Position and Orientation

 Pose: combination of position and orientation
 A point in space?
A point has position in space. The only characteristic that distinguishes one point from another is its position.
 Coordinate frame / Cartesian coordinate system?
 Difference/advantage of a point vs. a vector? Think about real-world objects…
A vector has both magnitude and direction, but no fixed position in space.
Difference between a point and a vector

• A vector is a movement that you can make, like "2 steps north, 3 steps east".
• A point is a position; it's a place in the room, like "the south-east corner" or "the center".
 Notice that a vector doesn't say where you start or end, just how you should move.
 Some operations mix the two. You can take a point and add a vector (start in the center of the room, go 2 steps north and 3 steps east). You can take two points and talk about the vector between them (how do you need to walk to get between the points).
Vector and Point
 Vectors don't normally represent a position, rather they represent a
direction and magnitude.
 So if you subtract a point from another point, you get a vector: subtraction gives you a displacement vector.
 If you add a point and a vector, you get another point.
 A point is just a vector with a known origin.
 You can add two vectors and get a vector – a resultant vector.
 Vectors are normally represented with <> and points are represented by ().

https://graphics.cs.wisc.edu/WP/cs559-fall2014/2014/08/28/points-vectors-and-coordinate-systems-why-are-points-and-vectors-different/
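
As a quick illustration of the rules above, a minimal MATLAB sketch (the coordinate values are made up for this example) in which both points and vectors are stored as 2x1 column arrays and only the interpretation differs:

A = [1; 2];       % point A: a location in the plane
B = [4; 3];       % point B: another location
v = B - A;        % point - point = displacement vector from A to B
C = A + v;        % point + vector = a new point (here C equals B)
w = v + [0; 1];   % vector + vector = resultant vector
disp(norm(v))     % a vector has a magnitude; a point has no meaningful "length"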
Common Examples
The difference is precisely that between location and
displacement.
 Points are locations in space.
 Vectors are displacements in space.

An analogy with time works well:
 Times (also called instants or datetimes) are locations in time.
 Durations are displacements in time.
Representing Position and Orientation

 Convention: attach a coordinate frame to an object. It enables us to describe the pose (position and orientation) of the object with respect to a reference coordinate frame. Each frame is given a name.
 Assumption: the object is rigid.
Representing Position and Orientation

 How many dimensions are required to completely describe the pose of an object?
 Notation

The relative pose of frame {B} with respect to frame {A} is written ${}^{A}\xi_{B}$: the subscript names the frame being described and the leading superscript names the reference coordinate frame.

NOTE: If the superscript is missing, we assume the pose is expressed w.r.t. the world coordinate frame, denoted {O}.
Representing Position and Orientation

Point P can be described w.r.t. either frame {A} or frame {B} by the vectors ${}^{A}p$ and ${}^{B}p$ respectively. Formally

${}^{A}p = {}^{A}\xi_{B} \cdot {}^{B}p$

The operator $\cdot$ transforms the vector, resulting in a new vector that describes the same point but w.r.t. another coordinate frame.
Representing Position and Orientation

 An important characteristic of relative poses is that they can be combined/composed together:

${}^{A}\xi_{C} = {}^{A}\xi_{B} \oplus {}^{B}\xi_{C}$

The pose of {C} relative to {A} can be obtained by compounding the relative poses from {A} to {B} and from {B} to {C}. We use the operator ⊕ to indicate composition of relative poses.
Representing Position and Orientation

 An important characteristic of relative poses is that they can be combined/composed together.

For this case the point P can be described by

${}^{A}p = ({}^{A}\xi_{B} \oplus {}^{B}\xi_{C}) \cdot {}^{C}p$

Till now we have seen only 2D coordinate frames. This is applicable to many robotic systems, especially mobile robots that operate in a planar world. However, for many other robots we require 3D coordinate frames. Examples are…
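
In 2D, this composition can be sketched with the Robotics Toolbox functions used later in these slides (transl2, trot2, e2h, h2e); the specific poses and the point below are made-up values chosen only for illustration:

TA_B = transl2(1, 2) * trot2(30, 'deg');    % pose of {B} relative to {A}
TB_C = transl2(2, 0) * trot2(-15, 'deg');   % pose of {C} relative to {B}
TA_C = TA_B * TB_C;                         % composed pose of {C} relative to {A}
pC = [0.5; 1];                              % point P expressed in frame {C}
pA = h2e(TA_C * e2h(pC));                   % the same point expressed in frame {A}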
Representing Position and Orientation

Spatial Relationships
Representing Position and Orientation

 Algebraic rules for combination of relative poses:

Composition is not commutative, $\xi_{1} \oplus \xi_{2} \neq \xi_{2} \oplus \xi_{1}$, with the exception of the case where their composition is zero.
Working in 2D

Describe frame {B} with respect to frame {A}.
Working in 2D – Orthonormal rotation matrix

We introduce another frame {V} whose axes are parallel to those of frame {A} but whose origin coincides with that of frame {B}. Point P can be expressed in terms of frame {V} as:

${}^{V}p = {}^{V}x\,\hat{x}_{V} + {}^{V}y\,\hat{y}_{V}$   (Eq. 1)

Frame {B} can be described using its orthogonal axes (rotated by θ relative to {V}) as:

$\hat{x}_{B} = \cos\theta\,\hat{x}_{V} + \sin\theta\,\hat{y}_{V}, \qquad \hat{y}_{B} = -\sin\theta\,\hat{x}_{V} + \cos\theta\,\hat{y}_{V}$

We can represent point P w.r.t. frame {B} as:

${}^{B}p = {}^{B}x\,\hat{x}_{B} + {}^{B}y\,\hat{y}_{B}$   (Eq. 2)
Working in 2D – Orthonormal rotation matrix

Equating Eq. 1 and Eq. 2 shows how points are transformed when the frame is rotated:

${}^{V}p = {}^{V}R_{B}\,{}^{B}p, \qquad {}^{V}R_{B} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$

Rotation matrix: transforms a point P from one frame to the other. It is denoted ${}^{V}R_{B}$.
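
A minimal numeric check of this result in plain MATLAB (the 90-degree angle is chosen only so the answer is easy to verify by inspection); the toolbox function rot2 on the next slide computes exactly this matrix:

theta = pi/2;
R = [cos(theta) -sin(theta); sin(theta) cos(theta)];   % the rotation matrix derived above
pB = [1; 0];     % point expressed in frame {B}
pV = R * pB;     % same point expressed in frame {V}: gives [0; 1]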
Working in 2D – Orthonormal rotation matrix

 Properties of the rotation matrix:
◼ its columns are orthogonal unit vectors (it is orthonormal)
◼ its inverse equals its transpose: R⁻¹ = Rᵀ
◼ its determinant is +1 (a proper rotation, no reflection)
Rotation Matrix in Toolbox

R = rot2(0.2)      % 2x2 orthonormal rotation matrix for a rotation of 0.2 rad

Symbolic Mathematics

syms theta         % requires the Symbolic Math Toolbox
R = rot2(theta)    % rotation matrix with symbolic angle theta
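
A short sketch verifying the properties listed above with the toolbox (assumed installed); the 0.2 rad angle matches the example on this slide:

R = rot2(0.2);
disp(R' * R)             % orthonormal: R'*R is the 2x2 identity
disp(det(R))             % determinant is +1
disp(norm(inv(R) - R'))  % inverse equals transpose, so this is ~0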
Homogeneous Form of a Vector
Working in 2D – Homogeneous Transformation Matrix

 Translation?
 It is simply vector addition
Working in 2D – Homogeneous Transformation Matrix

 Coordinate vectors of point P can be expressed in homogeneous form (the tilde indicates a homogeneous vector) as:

$\tilde{p} = (x, y, 1)^{T}, \qquad {}^{A}\tilde{p} = {}^{A}T_{B}\,{}^{B}\tilde{p}, \qquad {}^{A}T_{B} = \begin{pmatrix} {}^{A}R_{B} & t \\ 0_{1\times 2} & 1 \end{pmatrix}$

${}^{A}T_{B}$ represents the translation and orientation (relative pose) and is often referred to as a rigid-body motion.
Working in 2D – Homogeneous Transformation Matrix

Transformation matrices can be compounded/combined as:

${}^{A}T_{C} = {}^{A}T_{B}\,{}^{B}T_{C}, \qquad {}^{A}\tilde{p} = {}^{A}T_{B} \cdot {}^{B}\tilde{p}$

The dot here is the standard matrix-vector product.
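
To make the structure of the homogeneous transformation matrix concrete, a small sketch (Robotics Toolbox assumed installed; the angle and translation are made-up values) that builds the same 3x3 matrix by hand from R and t and via transl2/trot2:

theta = 30 * pi/180;                          % rotation angle in radians
t = [1; 2];                                   % translation vector
R = rot2(theta);                              % 2x2 orthonormal rotation matrix
T_manual = [R, t; 0 0 1];                     % homogeneous transform built by hand
T_toolbox = transl2(1, 2) * trot2(theta);     % same pose via the toolbox
disp(max(abs(T_manual(:) - T_toolbox(:))))    % should be ~0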


Transformation in Toolbox

*Note: rot2 returns an orthonormal rotation matrix, whereas trot2 returns a homogeneous transformation matrix.

T1 = transl2(1, 2) * trot2(30, 'deg')   % pose: translate by (1,2), rotate 30 deg
plotvol([0 5 0 5]);                     % set up the plot region
trplot2(T1, 'frame', '1', 'color', 'b')

T2 = transl2(2, 1)                      % pure translation
trplot2(T2, 'frame', '2', 'color', 'r');

T3 = T1*T2                              % compose: {1} then {2}
trplot2(T3, 'frame', '3', 'color', 'g');

T4 = T2*T1;                             % reversed order gives a different pose
trplot2(T4, 'frame', '4', 'color', 'c');

P = [3; 2];                             % a point in the world frame
plot_point(P, 'label', 'P', 'solid', 'ko');
Transformation in Toolbox
 To determine the coordinates of the point P w.r.t. frame {1}:
Earlier we derived…

P1 = inv(T1) * [P; 1]        % append 1 to make P homogeneous, then transform

Conversions between homogeneous and Euclidean space:

h2e( inv(T1) * e2h(P) )      % e2h/h2e convert to/from homogeneous coordinates
