
Unit 0: Robotnik Summit XL platform for research and surveillance

The SUMMIT XL has skid-steering kinematics based on 4 high-power motorwheels. This means that it turns by closely controlling the difference
in rotation speed of each wheel. This removes the need for any steering mechanism, which makes it highly robust.
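To make the idea concrete, here is a minimal sketch of how a desired body velocity maps to different left and right wheel speeds in a skid-steer base. This is not Robotnik's actual controller; the track width and wheel radius are made-up placeholder values, not the real Summit XL parameters.

# Sketch of the skid-steering idea: a body twist (linear x, angular z) is turned
# into different left/right wheel speeds. TRACK_WIDTH and WHEEL_RADIUS are
# placeholder values, not the real Summit XL parameters.
TRACK_WIDTH = 0.5    # distance between left and right wheels [m] (assumed)
WHEEL_RADIUS = 0.12  # wheel radius [m] (assumed)

def skid_steer(linear_x, angular_z):
    """Return (left, right) wheel angular speeds [rad/s] for the requested twist."""
    v_left = linear_x - angular_z * TRACK_WIDTH / 2.0
    v_right = linear_x + angular_z * TRACK_WIDTH / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure rotation: the two sides spin in opposite directions, so the robot turns in place.
print(skid_steer(0.0, 1.0))
# Pure forward motion: both sides spin at the same speed.
print(skid_steer(0.5, 0.0))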

Demo Move Around

Let's move the Summit XL robot around, shall we? This way you will get the hang of it. You can move the Summit XL in three different ways:

1. Publishing into the topic /cmd_vel directly.

This robot needs you to publish the Twist message continuously; otherwise, it will stop moving. This is normal, to avoid safety issues. The rate
will depend on how smooth you want the movement to be. Here is an example of a 1 Hz publish:

In [ ]: rostopic pub -r1 /cmd_vel geometry_msgs/Twist "linear:
  x: 0.0
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 1.0"

Be careful to type this by hand rather than copy-pasting; otherwise, the format might not be correct.

Now, stop it.


In [ ]: rostopic pub -r1 /cmd_vel geometry_msgs/Twist "linear:
  x: 0.0
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0"

2. Launching the keyboard teleop launch file and moving the robot with the keyboard.

Execute in WebShell #1

roslaunch sumit_xl_tools start_teleop.launch

Now, you can use the keys indicated in the WebShell Output in order to move the robot around.

i : Move Forward
, : Move Backward
k : Stop
l : Turn Left
j : Turn Right
u : Turn Left and Forward
o : Turn Right and Forward
m : Turn Left and Backward
. : Turn Right and Backward
q/z : Increase/Decrease Speed

Try it! When you are done, you can press Ctrl+C to stop the execution of the program.
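In case you are curious about what such a teleop node does internally, here is a rough sketch that reads single keys from a Linux terminal and maps a few of the bindings above onto /cmd_vel. It is an illustration only, not the actual node started by start_teleop.launch, and the speed values are arbitrary sample numbers.

#!/usr/bin/env python
# Rough teleop sketch: read one key at a time and publish the matching Twist.
# Illustration only, not the node launched by start_teleop.launch.
import sys, termios, tty
import rospy
from geometry_msgs.msg import Twist

# key -> (linear x [m/s], angular z [rad/s]); speeds are arbitrary sample values
BINDINGS = {
    'i': (0.3, 0.0),   # forward
    ',': (-0.3, 0.0),  # backward
    'l': (0.0, 0.5),   # turn left
    'j': (0.0, -0.5),  # turn right
    'k': (0.0, 0.0),   # stop
}

def get_key():
    """Read a single keypress from the terminal without waiting for Enter."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

rospy.init_node('teleop_sketch')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

while not rospy.is_shutdown():
    key = get_key()
    if key == '\x03':  # Ctrl+C ends the program
        break
    lin, ang = BINDINGS.get(key, (0.0, 0.0))
    twist = Twist()
    twist.linear.x = lin
    twist.angular.z = ang
    pub.publish(twist)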

3. Using RViz Interactive Markers to publish in /cmd_vel


Execute in WebShell #1

roslaunch sumit_xl_tools start_teleop_with_interactivemarkers.launch

This launches RViz, so you will have to open the Graphical Tools by clicking on the icon:

You should get something like this:

As you can see when you experiment with it, you have an interactive marker that publishes linear velocity (red arrows) and angular turn (blue circle).
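If you want to see roughly how such a marker can be wired to /cmd_vel, here is a simplified sketch: dragging the marker produces feedback poses that a callback converts into Twist commands. This is not the actual node from the launch file above; the frame, marker name and scaling factors are placeholders.

#!/usr/bin/env python
# Simplified sketch of an interactive-marker teleop: dragging the marker in RViz
# produces feedback poses that are converted into Twist commands on /cmd_vel.
# NOT the actual node from the launch file above; names and scale factors are placeholders.
import rospy
from geometry_msgs.msg import Twist
from interactive_markers.interactive_marker_server import InteractiveMarkerServer
from visualization_msgs.msg import InteractiveMarker, InteractiveMarkerControl

rospy.init_node('twist_marker_sketch')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

def process_feedback(feedback):
    # Map the marker displacement to velocities (scale factors chosen arbitrarily).
    twist = Twist()
    twist.linear.x = 1.0 * feedback.pose.position.x
    twist.angular.z = 2.0 * feedback.pose.orientation.z
    pub.publish(twist)

server = InteractiveMarkerServer('twist_marker_sketch')

marker = InteractiveMarker()
marker.header.frame_id = 'base_link'   # the marker moves with the robot
marker.name = 'cmd_vel_marker'
marker.scale = 1.0

# Arrows: drag along X to command linear velocity
move_x = InteractiveMarkerControl()
move_x.name = 'move_x'
move_x.orientation.w = 1.0
move_x.orientation.x = 1.0
move_x.interaction_mode = InteractiveMarkerControl.MOVE_AXIS
marker.controls.append(move_x)

# Ring: rotate around Z to command angular velocity
rotate_z = InteractiveMarkerControl()
rotate_z.name = 'rotate_z'
rotate_z.orientation.w = 1.0
rotate_z.orientation.y = 1.0
rotate_z.interaction_mode = InteractiveMarkerControl.ROTATE_AXIS
marker.controls.append(rotate_z)

server.insert(marker, process_feedback)
server.applyChanges()
rospy.spin()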

Demo Navigation

Of course, in this course you will learn how to make the Summit XL navigate autonomously, among other things.
Here you are going to see two examples of navigation: without a map and with a map.

Navigate without Map


To start the navigation without a map, launch the following command:

Execute in WebShell #1

roslaunch sumit_xl_tools start_navigation_without_map.launch

This command will start the navigation with only odometry and lasers, generating local and global costmaps on the fly from the laser readings. These
will be used by the planner to plan the best route based on the obstacles it detects.

This launches RViz, so you will have to open the Graphical Tools by clicking on the icon:
Now you can select the 2D Nav Goal icon to set a goal for the Summit XL. It will try to get there without colliding with the obstacles that it detects
with its laser. These obstacles are drawn in the local and global costmaps, which you can see as blue areas that the robot should not cross. You
should get a result similar to this one:
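Clicking 2D Nav Goal is just a graphical way of sending a goal to the navigation action server. Assuming the launch file above brings up a standard move_base node, a goal could also be sent from code like this sketch (the frame and coordinates are placeholders for your own scene):

#!/usr/bin/env python
# Sketch: send a navigation goal from code instead of clicking 2D Nav Goal in RViz.
# Assumes a standard move_base action server is running; frame and coordinates
# below are placeholders.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_nav_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'odom'   # no map yet, so express the goal in odom (assumed frame)
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0      # 2 m ahead (placeholder)
goal.target_pose.pose.orientation.w = 1.0   # keep the current heading

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo('Navigation finished with state %d', client.get_state())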
Navigate with Map

Now, once the Summit XL has a map, it will navigate better and farther. Execute the following command to start the map navigation and the RViz
configuration for map navigation.

Execute in WebShell #1

roslaunch sumit_xl_tools start_navigation_with_map_v2.launch

This launches RViz, so you will have to open the Graphical Tools by clicking on the icon:
Warning: If your robot is not where it should be, indicate its position and orientation with the 2D Pose Estimate tool. Otherwise, the Summit XL will find
it much more difficult to localize itself.

Now you can select the 2D Nav Goal icon to set a goal for the Summit XL.
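The 2D Pose Estimate tool from the warning above is, likewise, just publishing a PoseWithCovarianceStamped on the /initialpose topic. Assuming an AMCL-style localization node is listening on that topic, the same thing can be done from code; the pose values in this sketch are placeholders.

#!/usr/bin/env python
# Sketch: set the robot's initial pose from code, which is what the
# "2D Pose Estimate" tool does graphically. Assumes an AMCL-style localization
# node listening on /initialpose; the pose values are placeholders.
import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped

rospy.init_node('set_initial_pose')
pub = rospy.Publisher('/initialpose', PoseWithCovarianceStamped, queue_size=1, latch=True)

msg = PoseWithCovarianceStamped()
msg.header.frame_id = 'map'
msg.header.stamp = rospy.Time.now()
msg.pose.pose.position.x = 0.0          # where the robot really is (placeholder)
msg.pose.pose.position.y = 0.0
msg.pose.pose.orientation.w = 1.0       # facing along the map X axis

rospy.sleep(1.0)   # give the latched publisher time to connect to subscribers
pub.publish(msg)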
What you will learn in this course

In this course you will learn everything you need to operate and use the Summit XL robot platform in the real world.
You will learn it through hands-on experience with a simulated version of the real robot, which has exactly the same interface as the real one,
thanks to the ROS infrastructure.

You will learn:

How to set up the navigation stack to make the robot navigate in an indoor environment, generating maps on its own.
How to create a program to navigate in outdoor environments using GPS data.
How to detect people with the Hokuyo laser sensor.
How to detect people with its PTZ RGB camera.
How to set waypoints on a map so that the robot follows that path to patrol.
How to create reactive programs based on all of the above, building a patrolling program that reacts to person detections.

How will you learn all this?



As always, all the courses in RobotIgniteAcademy are based on learning through hands-on experience.
In this case you will work with different simulated models of the Summit XL:

Standard skid wheels.


Omnidirectional wheels, which allow it to move in any direction in the plane.

The simulation will have the same dimensions and all the sensors present in the real robot:
GPS
Hokuyo Laser
PTZ camera

Requirements

It is essential that, before starting this course, you know the following:

ROS Basics: You need to know all the basics of ROS to be able to follow this course. If you don't, please do our ROS Basics in Five Days
course.
It is highly advisable to have done the RobotIgnite courses for ROS Navigation in order to understand everything that is done in this course.
It is not necessary, but having done the ROS Perception course will give you a better background.
Basic Python knowledge.

Special thanks
We have to thank the following individuals and organizations:

First of all, thanks to our partner for this course, Robotnik (http://www.robotnik.eu/), creators of the real Summit XL robot
(http://www.robotnik.eu/mobile-robots/summit-xl/).

Adrian Rosebrock (http://www.pyimagesearch.com/author/adrian/) for his amazing pedestrian detection with OpenCV program
(http://www.pyimagesearch.com/2015/11/09/pedestrian-detection-opencv/).
Brian Bingham (https://github.com/bsb808) for the geonav GPS conversion program (https://github.com/bsb808/geonav_transform).
Daniel Snider (https://github.com/danielsnider) for his great waypoints manager program (https://github.com/danielsnider/follow_waypoints).
The MakeHuman software (http://www.makehuman.org/), used for the creation of the people.
This course wouldn't have been possible without the knowledge and work of the ROS Community (http://www.ros.org/), OSRF
(https://www.osrfoundation.org/) and the Gazebo team (http://gazebosim.org/).
