
SentiBotics Development Kit Trial

Quick Start Tutorial
2016-02-29

Neurotechnology

SentiBotics

1 Introduction

SentiBotics is a robotics development kit with a reference mobile manipulator, intended for robotics researchers and developers who need a "ready-to-go" platform for algorithm development. The kit consists of a mobile robot and accompanying ROS (Robot Operating System)-based software with full algorithm source code. The current version is tested and developed with ROS Indigo and Ubuntu 14.04.

The SentiBotics trial package allows you to try the SentiBotics kit and explore its capabilities. It includes a set of SentiBotics software components (excluding the source code, algorithm descriptions, and the physical robot itself), allowing you to simulate the SentiBotics robot in the popular Gazebo simulator (http://www.gazebosim.org). The trial package license is valid for 30 days and requires a persistent Internet connection. For more information about SentiBotics see the Neurotechnology website.

This document provides step-by-step instructions on how to start and use the SentiBotics trial kit.

2 Configuring computer

We recommend running the SentiBotics trial on a machine with an i7 CPU and at least 8 GB of RAM. Install the Linux Ubuntu 14.04 operating system and the ros-indigo-desktop-full package from the APT repositories (see http://wiki.ros.org/indigo/Installation/Ubuntu).

2.1 Installing SentiBotics trial

The provided set of scripts installs additional software dependencies, installs the SentiBotics trial packages and associated data, and sets up environment variables:

./setup.sh

After completing the installation we recommend watching the video tutorials (see Sec. 3.4), which show how to run the demos and test the robot's capabilities.

3 Simulation environment and robot control

This section describes how to control the robot with a control pad or with text commands.

3.1 Run simulator

Start the Gazebo simulator and the Rviz visualization tool:

roslaunch sentibotics_tutorials gazebo_with_algorithms.launch gazebo_gui:=true rviz:=true

The Gazebo simulator GUI (Fig. 1) and Rviz (Fig. 2) should appear. Press the play button in the Gazebo GUI to begin the simulation. On slower machines it may be more convenient to exclude the Gazebo GUI and/or Rviz; this can be done by setting the corresponding launch arguments to false. If the Gazebo GUI is not used, the simulation can be started by running the start_simulation script from sentibotics_tutorials.
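For example, to run without either visualization on a slower machine (the simulation then has to be started with the start_simulation script, since the Gazebo play button is not available):

roslaunch sentibotics_tutorials gazebo_with_algorithms.launch gazebo_gui:=false rviz:=false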

Figure 1: SentiBotics robot in Gazebo simulator.

Figure 2: Rviz visualization tool.

3.2 Control using gamepad

Plug the gamepad into a USB 2.0 port of your computer and start the joystick node: rosrun joy joy_node. You may test your gamepad with the jstest tool, which can be installed from the APT repository.

When the robot is started, the platform movement controls are in the ON state and the arm movement controls are in the OFF state. The START button is used for switching between platform and arm control.

Robot status can be viewed by:

rostopic echo /sentibotics_platform/status

• For robot platform controls see Fig. 3.
• For robot arm controls see Fig. 4.
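If the controls do not respond, it can help to confirm that the gamepad is actually being read. The device path and topic name below are the common defaults and may differ on your system:

jstest /dev/input/js0
rostopic echo /joy

The first command shows the raw joystick axes and buttons outside ROS; the second shows the sensor_msgs/Joy messages published by joy_node.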

Figure 3: Mobile platform controls.

Figure 4: Robotic arm controls.

3.3 Using SentiBotics functionality

SentiBotics software includes SLAM and autonomous navigation, object learning and recognition, and autonomous object manipulation functionality. This section provides examples of how to use it.

The sentibotics_tutorials package contains a set of auxiliary shell scripts and C++ programming samples which allow you to use SentiBotics functionality (see sentibotics_tutorials/src and sentibotics_tutorials/scripts).

The behaviour node is the main SentiBotics functionality testing and demonstration tool. It is a Python node, functioning as a glue between lower level modules (e.g. navigation, object recognition and object grasping). It provides an easy interface for controlling the aforementioned functionalities and serves as a framework for implementing higher level robot behaviour, like object delivery (which uses navigation, object recognition and object grasping). Currently behaviour contains four base classes (arm control, platform control, experience map, and object recognition). Those classes have custom access to robot control and to the main behaviour class; the shell scripts and programming samples cover more global tasks like grasp, find or put. In order to start the behaviour node, open another terminal and type rosrun sentibotics_behaviour sentibotics_behaviour_node.py.

3.3.1 SLAM and autonomous navigation

First, map the environment by manually driving the robot around with the control pad or with commands. Note that mapping is performed automatically while the robot is driven by the user. In order to test autonomous navigation, we recommend starting with simple maps (e.g. two meters forward, a 45 degree turn, and another two meters forward) and proceeding to more complex ones, with visual loop closing as in Fig. 5. Ensure the map in Rviz looks correct (select InteractiveMarkers to visualize experiences).

• Driving the platform directly (without using a map) with given commands:

self.platform.execute_command(delta_x, delta_theta)

e.g. self.platform.execute_command(1, 0). This command does not require a map.

• Driving the platform to a given experience:

self.travel_to_experience(experience_id)

e.g. self.travel_to_experience(0) navigates the robot to the initial experience. This command requires a map.

• Driving the platform to a pose x, y, θ (in /odom coordinates):

self.travel_to_pose(Pose2D(x, y, theta))

e.g. self.travel_to_pose(Pose2D(1.0, 0.0, pi/4)). This command will drive the robot to the experience closest to the given target pose and then drive directly to it. It also requires a map.

Also see sentibotics_tutorials/src/autonomous_navigation_tutorial.cpp for an example of how to move the robotic platform and call navigation functionality. You may test it by:

roslaunch sentibotics_tutorials autonomous_navigation.launch
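The same test sequence can also be scripted. The sketch below is only an illustration: the SentiboticsBehaviour class, its import path and its constructor are assumptions (check sentibotics_behaviour_node.py for the actual names), and it assumes execute_command takes a forward distance in meters and a turn in radians.

# Minimal sketch of the simple mapping drive described above.
# Assumptions (verify against sentibotics_behaviour_node.py):
#   - SentiboticsBehaviour is the class exposing the commands shown above
#   - execute_command(delta_x, delta_theta) uses meters and radians
from math import pi

import rospy
from geometry_msgs.msg import Pose2D
from sentibotics_behaviour import SentiboticsBehaviour  # hypothetical import path


def main():
    rospy.init_node('simple_mapping_drive')
    robot = SentiboticsBehaviour()  # hypothetical constructor

    # Build a small map: two meters forward, 45 degree turn, two meters forward.
    robot.platform.execute_command(2.0, 0.0)
    robot.platform.execute_command(0.0, pi / 4)
    robot.platform.execute_command(2.0, 0.0)

    # Navigate back to the initial experience (requires the map built above).
    robot.travel_to_experience(0)

    # Alternatively, drive to an explicit pose in /odom coordinates.
    robot.travel_to_pose(Pose2D(x=1.0, y=0.0, theta=pi / 4))


if __name__ == '__main__':
    main()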

Figure 5: Example of experience map (Rviz visualization).

3.3.2 Object learning and recognition

Object learning: place an object in front of the robot, in the short range camera field of view. Make sure that the object is not obscured by the environment or by the robot. There should not be any other kind of object in the short range camera field of view, to avoid misclassification.

• To switch on object learning:

self.object_recognition.learn_object("object_name")

or execute the script rosrun sentibotics_tutorials learn_object object_name. The same object should be learned in different poses for better object recognition.

• To stop learning:

self.object_recognition.stop_learning()

or execute the script rosrun sentibotics_tutorials stop_learning.

After learning is complete, we recommend deleting incorrectly learned object instances and testing recognition by placing the learned object in various positions and orientations, until acceptable recognition quality is achieved.

• Switching object recognition on and off:

self.object_recognition.recognition_on()
self.object_recognition.recognition_off()

or execute the corresponding scripts rosrun sentibotics_tutorials recognition_on and rosrun sentibotics_tutorials recognition_off.

• Starting object recognition mode:

self.object_recognition.recognize_objects()

The result is any object detected, returned after RECOGNITION_TRIES (default: 10) tries. This command also associates recognized objects with experiences, so the robot knows where they are located.

• Deleting incorrectly learned object instances:

– delete all objects matched: self.object_recognition.delete_all_matches() or rosrun sentibotics_tutorials delete_false_matches
– delete all objects matched as object_label: self.object_recognition.delete_matches("object_label") or rosrun sentibotics_tutorials delete_false_matches 1 object_label
– delete all objects not matched as object_label: self.object_recognition.delete_not_matches("object_label") or rosrun sentibotics_tutorials delete_false_matches 2 object_label
– stop deleting: self.object_recognition.stop_deleting_matches() or rosrun sentibotics_tutorials delete_false_matches 0

You may test it by:

roslaunch sentibotics_tutorials object_learning_and_recognition.launch

Also see sentibotics_tutorials/src/object_learning_tutorial.cpp for an example of how to call object learning and recognition from your software.
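For reference, a scripted version of the learning and recognition workflow could look like the sketch below. As in the navigation example, the SentiboticsBehaviour class, its constructor, the 30-second learning pause and the 'my_cup' label are assumptions; the method names are the ones listed above.

# Sketch: learn an object, then switch recognition on and query it.
import rospy
from sentibotics_behaviour import SentiboticsBehaviour  # hypothetical import path

rospy.init_node('object_learning_example')
robot = SentiboticsBehaviour()  # hypothetical constructor

# Learn the object placed in the short range camera field of view.
robot.object_recognition.learn_object('my_cup')
rospy.sleep(30.0)  # keep learning while the object is shown in several poses
robot.object_recognition.stop_learning()

# Switch recognition on and ask for detections.
robot.object_recognition.recognition_on()
result = robot.object_recognition.recognize_objects()
print(result)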

3.3.3 Object grasping and manipulation

We recommend starting the tests with a simple grasping scene: a single learned and graspable object on a plain floor. After an object is learned and successfully recognized, edit ROS_HOME/sentibotics_arm_vision/objects/object_name/params.yaml and set the graspable parameter to 1; also set the corresponding ROS parameter to 1, or restart the session.

• Parking the robotic arm:

self.park_arm()

• Autonomous grasping of a learned object:

self.grasp_object("object_label")

The robot will try to recognize the given object and grasp it with its manipulator. If no recognition occurs, the robot tries to look around its current pose. If the recognized object is too far away, the robot tries to move closer, until grasping is possible.

• Placing a grasped object into a box:

self.place_object()

• Autonomous object delivery:

self.bring_object("object_label")

The robot will drive to the experience associated with the given object and try to recognize it directly. If recognition occurs, the robot tries to grasp the object, put it into the box and return to the experience where the delivery command was given.

You may test it by:

roslaunch sentibotics_tutorials object_grasping.launch
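A corresponding sketch for grasping and delivery is given below, with the same caveats: SentiboticsBehaviour and its constructor are assumed, 'my_cup' is a placeholder label, parking the arm first is only a suggested starting state, and the method names are those documented above.

# Sketch: grasp a learned, graspable object and deliver it.
import rospy
from sentibotics_behaviour import SentiboticsBehaviour  # hypothetical import path

rospy.init_node('grasping_example')
robot = SentiboticsBehaviour()  # hypothetical constructor

robot.park_arm()              # start from the parked arm pose
robot.grasp_object('my_cup')  # recognize the learned object and grasp it
robot.place_object()          # put it into the box

# Or run the whole delivery in one call: drive to the experience associated
# with the object, grasp it, put it into the box and return.
# robot.bring_object('my_cup')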

3.4 Video tutorials

• Mapping and autonomous navigation.
• Control the robot using the behaviour node.
• Object learning and recognition.
• Autonomous object grasping.
• Autonomous object delivery.

4 SentiBotics structure

This section contains information about the SentiBotics software.

4.1 SentiBotics ROS package structure

4.1.1 SLAM and autonomous navigation

• sentibotics_place_recognition - visual place learning and recognition.
• sentibotics_exprience_map - SLAM and autonomous navigation.
• sentibotics_pose_estimation - robot local pose hypotheses tracking.

4.1.2 Object recognition, grasping and manipulation

• sentibotics_arm - robotic arm control software; auxiliary libraries for more convenient kinematics calculations and trajectory processing.
• sentibotics_arm_vision - contains 3D object recognition, grasping pose estimation, octomap filtering libraries, and robot self filtering.
• sentibotics_moveit - contains the moveit trajectory planner customized for the SentiBotics robot.
• sentibotics_moveit_config - configuration of sentibotics_moveit.

4.1.3 Message packages

• sentibotics_arm_msgs - messages for operations with the robotic arm.
• sentibotics_dynamixel_msgs - messages for Dynamixel servo motors.
• sentibotics_navigation_msgs - navigation messages.

4.1.4 Higher level behavior

• sentibotics_behavior - module for implementing higher level behavior and demonstration of robot capabilities.
• sentibotics_tutorials - tutorials and scripts.

4.1.5 Robot description

• sentibotics_description - URDF description of the SentiBotics robot.

4.1.6 Robot simulation

• sentibotics_gazebo - SentiBotics Gazebo simulator package.

4.1.7 Other

• sentibotics_common - common procedures.