QTrobot ROS API Reference
The QTrobot ROS interface aims to make the robot's basic functionalities accessible through a set of user-friendly ROS APIs. ROS is the de facto standard robotics framework, and basic knowledge of ROS is required to work with QTrobot's ROS SDK. Most of QTrobot's functionalities can be accessed in both blocking and non-blocking modes using ROS publish/subscribe and service/client interfaces. The main interfaces of QTrobot are:
- QTrobot Emotion: implements robot facial emotions
- QTrobot Speech: implements robot text-to-speech functionality
- QTrobot Audio: implements a simple player for standard audio files
- QTrobot Gesture: implements robot gesture control
- QTrobot Behavior: implements more complex behaviors by combining the robot's basic functionalities
- QTrobot Motor: implements robot motor control using standard ros_control
- QTrobot 3D-Camera: implements different human 3D body and facial tracking using the Intel RealSense camera and the Nuitrack SDK
- QTrobot Settings: implements basic robot settings such as speaker volume control
Naming convention
As noted above, most of QTrobot's functionalities can be accessed in both blocking and non-blocking modes using ROS publish/subscribe and service/client interfaces. Each of the main interfaces can be accessed in two different ways:
- using ROS publishers/subscribers, for non-blocking calls to the interfaces
- using ROS services, for blocking calls and access to the more sophisticated interfaces
The names of most QTrobot ROS interfaces start with the /qt_robot/... prefix. The word that follows the prefix indicates the actual interface (e.g. /qt_robot/speech/...). Each interface may have further sub-functionalities, which come after the main interface name. For example:
- /qt_robot/speech/say : implements the speech reading interface using QTrobot's TTS
- /qt_robot/speech/config : implements the configure() method of QTrobot's TTS
For the user's convenience, the service and the subscriber of each QTrobot ROS interface share the same name. That means one can access, for example, the speech functionality via a ROS service call or via publish/subscribe using the same interface name (e.g. /qt_robot/speech/say). Please note that some of the more complex services (e.g. speech configuration for language, pitch, etc.) are accessible only via ROS service calls.
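As a minimal Python sketch of the two access modes under the same interface name, the example below publishes to and then calls /qt_robot/speech/say. It assumes the 'speech_say' service type is provided by the qt_robot_interface package; check your installation for the exact package that defines it.

```python
#!/usr/bin/env python3
import rospy
from std_msgs.msg import String
# Assumption: the 'speech_say' service type ships with the qt_robot_interface package.
from qt_robot_interface.srv import speech_say

rospy.init_node('speech_access_example')

# Non-blocking: publish the text and continue immediately.
say_pub = rospy.Publisher('/qt_robot/speech/say', String, queue_size=10)
rospy.sleep(1.0)                       # give the publisher time to connect
say_pub.publish(String(data="Hello from the publisher"))

# Blocking: the same interface name, accessed as a service call.
rospy.wait_for_service('/qt_robot/speech/say')
say_srv = rospy.ServiceProxy('/qt_robot/speech/say', speech_say)
say_srv("Hello from the service")      # returns only after the speech finishes
```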
List of available interfaces
Currently the following interfaces have been implemented:
INTERFACES

| Functionality | Interface prefix | Description |
|---|---|---|
| Speech | /qt_robot/speech/... | robot text-to-speech functionality |
| Audio | /qt_robot/audio/... | simple standard audio file player |
| Emotion | /qt_robot/emotion/... | robot facial emotion |
| Gesture | /qt_robot/gesture/... | robot gesture control |
| Behavior | /qt_robot/behavior/... | more complex behaviors combining the robot's basic functionality |
| Motor | /qt_robot/motor/... | robot motor control using standard ros_control |
| Setting | /qt_robot/setting/... | basic robot settings such as speaker volume control |
| Human 3D Skeleton | /qt_nuitrack_app/skeletons | human full-body skeleton information |
| Human 3D Hand Pos | /qt_nuitrack_app/hands | human 3D hand position and state |
| Human Hand Gestures | /qt_nuitrack_app/gestures | basic human gesture recognition |
| Human Facial | /qt_nuitrack_app/faces | human face detection, orientation, eye and facial feature positions, age/gender detection and facial emotion recognition |
Speech Interface
This interface implements QTrobot's text-to-speech functionality, which supports many languages. The supported languages can differ from robot to robot. The following are some standard supported languages:
- en-US (American English)
- fr-FR (French)
- de-DE (German)
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/speech/say | 'std_msgs/String' (text) | Read a text using built-in TTS |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/speech/say | 'speech_say' | 'message' | Read a text using built-in TTS |
| /qt_robot/speech/config | 'speech_config' | 'language', 'pitch', 'speed' | Configure TTS |
| /qt_robot/speech/stop | 'speech_stop' | | Stop the current speech activity |
The default language is set in /opt/ros/noetic/share/qt_robot_interface/config/qtrobot-interface.yaml. The default pitch is usually '140' and the default speed is '80'. When calling 'speech_config', leave the pitch and speed parameters at '0' if you do not want to change them.
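A minimal sketch of a TTS reconfiguration call could look like the following, assuming the 'speech_config' service type is defined in the qt_robot_interface package; pitch is left at 0 so only the language and speed are changed.

```python
#!/usr/bin/env python3
import rospy
# Assumption: the 'speech_config' service type is defined in qt_robot_interface.
from qt_robot_interface.srv import speech_config

rospy.init_node('speech_config_example')
rospy.wait_for_service('/qt_robot/speech/config')
config_srv = rospy.ServiceProxy('/qt_robot/speech/config', speech_config)

# language, pitch, speed: a value of 0 leaves that parameter unchanged.
config_srv("fr-FR", 0, 80)
```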
Audio Interface
Play a standard audio file (e.g. wav, mp3). The audio file can be given by its complete name (with the file extension), such as happy.mp3, or simply without the file extension: happy. In the second case, the player first looks for happy.mp3 and, if it is not found, tries happy.wav, and so on.
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/audio/play | 'std_msgs/String' (audio file name) | Play an audio file |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/audio/play | 'audio_play' | 'filename', 'filepath' | Play an audio file given by its filename and filepath |
| /qt_robot/audio/stop | 'audio_stop' | | Stop the current audio activity |
The default path for audio files is ~/robot/data/audios/. To play an audio file from the default path, pass an empty string as the filepath parameter, as in the sketch below.
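A minimal sketch of playing an audio file via the blocking service, assuming the 'audio_play' service type is defined in the qt_robot_interface package:

```python
#!/usr/bin/env python3
import rospy
# Assumption: the 'audio_play' service type is defined in qt_robot_interface.
from qt_robot_interface.srv import audio_play

rospy.init_node('audio_play_example')
rospy.wait_for_service('/qt_robot/audio/play')
play_srv = rospy.ServiceProxy('/qt_robot/audio/play', audio_play)

# Empty filepath -> the player searches the default ~/robot/data/audios/ folder.
# The extension may be omitted; the player tries happy.mp3, then happy.wav, etc.
play_srv("happy", "")
```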
Emotion Interface
Changes the robot's facial emotion, such as 'QT/happy', 'QT/sad', etc.
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/emotion/show | 'std_msgs/String' (emotion name) | Show a facial emotion given by its name |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/emotion/show | 'emotion_show' | 'name' | Show a facial emotion given by its name |
| /qt_robot/emotion/stop | 'emotion_stop' | | Stop the current emotion activity |
The complete list of emotion files can be found in '~/robot/data/emotions/'.
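As a minimal sketch, showing an emotion via the non-blocking subscriber interface only needs the standard std_msgs/String message:

```python
#!/usr/bin/env python3
import rospy
from std_msgs.msg import String

rospy.init_node('emotion_example')
emotion_pub = rospy.Publisher('/qt_robot/emotion/show', String, queue_size=10)
rospy.sleep(1.0)                              # give the publisher time to connect
emotion_pub.publish(String(data="QT/happy"))  # emotion name without file extension
```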
Gesture Interface
Plays a recorded robot gesture (arms, head) such as 'happy', 'angry', 'bye', etc. The complete list of gesture files can be found in ~/robot/data/gestures/.
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/gesture/play | 'std_msgs/String' (gesture name) | Play a robot gesture given by its name |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/gesture/play | 'gesture_play' | 'name', 'speed' | Play a robot gesture given by its name and speed (default 1.0) |
| /qt_robot/gesture/record | 'gesture_record' | 'parts', 'idleParts' | Start recording a new gesture. 'parts' is a string array of part names ('head', 'left_arm', 'right_arm') specifying which robot parts are used for recording the gesture. 'idleParts' must be set to 'true' to release the motor PWM and put the motors in idle mode; otherwise you must put them in idle mode yourself using the '/qt_robot/motors/setControlMode' interface |
| /qt_robot/gesture/save | 'gesture_save' | 'name', 'path' | Stop the current recording process and save the recorded gesture under the given 'name'. 'path' specifies where to save the gesture instead of the default path |
| /qt_robot/gesture/list | 'gesture_list' | | Return a list of the available gestures within the default gesture path |
| /qt_robot/gesture/stop | 'gesture_stop' | | Stop the current gesture activity |
The default value for speed is 1.0, which is the speed at which the gesture was recorded. If the speed parameter is set to 0, the default speed is used to play the gesture. The default path for recording and playing gestures is ~/robot/data/gestures/.
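A minimal sketch of playing a recorded gesture via the blocking service, assuming the 'gesture_play' service type is provided by the qt_gesture_controller package (check your installation for the exact package name):

```python
#!/usr/bin/env python3
import rospy
# Assumption: the 'gesture_play' service type comes from the qt_gesture_controller package.
from qt_gesture_controller.srv import gesture_play

rospy.init_node('gesture_example')
rospy.wait_for_service('/qt_robot/gesture/play')
gesture_srv = rospy.ServiceProxy('/qt_robot/gesture/play', gesture_play)

# Play the recorded 'happy' gesture; speed 0 means "use the recorded speed".
gesture_srv("happy", 0)
```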
Behavior Interface
This interface implements higher-level and more complex behaviors by combining the robot's basic functionalities.
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/behavior/talkText | 'std_msgs/String' (message) | Read a text using TTS and show the talking emotion |
| /qt_robot/behavior/talkAudio | 'std_msgs/String' (audio filename) | Play an audio file and show the talking emotion |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/behavior/talkText | 'behavior_talk_text' | 'message' | Read a text using TTS and show the talking emotion |
| /qt_robot/behavior/talkAudio | 'behavior_talk_audio' | 'filename', 'filepath' | Play an audio file and show the talking emotion |
To play an audio file from the default path, pass an empty string as the filepath parameter. The talkAudio and talkText services are mutually exclusive and cannot be used together with the Emotion Interface at the same time.
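A minimal sketch of the non-blocking talkText behavior, which only needs the standard std_msgs/String message:

```python
#!/usr/bin/env python3
import rospy
from std_msgs.msg import String

rospy.init_node('behavior_example')
talk_pub = rospy.Publisher('/qt_robot/behavior/talkText', String, queue_size=10)
rospy.sleep(1.0)   # give the publisher time to connect

# The robot reads the text with TTS while showing the talking emotion.
talk_pub.publish(String(data="I can talk and show the talking emotion at the same time"))
```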
Motor Interface
The Motor interface provides access to the robot's actuators using the standard ros_control system. Currently the interface implements the ROS 'JointStateController' and 'JointGroupPositionController' controllers and a custom 'QTMotorsController'.
Before using the Motor interface, make sure you fully understand the ros_control system and have a clear understanding of what you are doing at the motor and joint level.
QTrobot parts
The robot joints are grouped into parts as shown below:
- head
  - HeadYaw
  - HeadPitch
- right_arm
  - RightShoulderPitch
  - RightShoulderRoll
  - RightElbowRoll
- left_arm
  - LeftShoulderPitch
  - LeftShoulderRoll
  - LeftElbowRoll
SUBSCRIBERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_robot/joints/state | sensor_msgs/JointState | publishes joint states (currently only positions) |
| /qt_robot/head_position/command | std_msgs/Float64MultiArray | move the robot head to the desired position given by (HeadYaw, HeadPitch) |
| /qt_robot/right_arm_position/command | std_msgs/Float64MultiArray | move the right_arm to the desired position given by (RightShoulderPitch, RightShoulderRoll, RightElbowRoll) |
| /qt_robot/left_arm_position/command | std_msgs/Float64MultiArray | move the left_arm to the desired position given by (LeftShoulderPitch, LeftShoulderRoll, LeftElbowRoll) |
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/motors/home | 'home' | 'parts' | Move the desired robot parts to the home position. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']) |
| /qt_robot/motors/setControlMode | 'set_control_mode' | 'parts' | Set the control mode (M_ON=0, M_OFF=1, M_BRAKE=2) of the desired robot parts. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']). M_ON: motor is controlled, M_OFF: motor is idle, M_BRAKE: motor is in brake mode (not controlled) |
| /qt_robot/motors/setVelocity | 'set_velocity' | 'parts', 'velocity' | Set the moving velocity of the desired robot parts. 'parts' is an array of robot parts and/or single joint names (e.g. ['left_arm', 'right_arm', 'HeadPitch']). 'velocity' is given as a percentage |
For safety purposes, every joint has a maximum velocity limit. For example, you cannot run the 'HeadPitch' joint at more than 20% of its maximum velocity.
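A minimal sketch of commanding the head and right arm through the position command topics; the target values below are hypothetical examples only, so check your robot's joint limits and units before sending them:

```python
#!/usr/bin/env python3
import rospy
from std_msgs.msg import Float64MultiArray

rospy.init_node('motor_example')
head_pub = rospy.Publisher('/qt_robot/head_position/command',
                           Float64MultiArray, queue_size=10)
right_pub = rospy.Publisher('/qt_robot/right_arm_position/command',
                            Float64MultiArray, queue_size=10)
rospy.sleep(1.0)   # give the publishers time to connect

# Head target: (HeadYaw, HeadPitch) -- hypothetical example values.
head_pub.publish(Float64MultiArray(data=[10.0, 0.0]))

# Right arm target: (RightShoulderPitch, RightShoulderRoll, RightElbowRoll)
# -- hypothetical example values.
right_pub.publish(Float64MultiArray(data=[-30.0, -60.0, -20.0]))
```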
Setting Interface
This interface provides basic robot settings such as speaker volume control.
SERVICES

| Interface Name | Service Name | Parameters | Description |
|---|---|---|---|
| /qt_robot/setting/setVolume | 'setting_setVolume' | 'volume' | Set the robot speaker volume to the desired level (in percentage) |
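A minimal sketch of setting the speaker volume, assuming the 'setting_setVolume' service type is defined in the qt_robot_interface package:

```python
#!/usr/bin/env python3
import rospy
# Assumption: the 'setting_setVolume' service type is defined in qt_robot_interface.
from qt_robot_interface.srv import setting_setVolume

rospy.init_node('volume_example')
rospy.wait_for_service('/qt_robot/setting/setVolume')
volume_srv = rospy.ServiceProxy('/qt_robot/setting/setVolume', setting_setVolume)
volume_srv(50)   # set the speaker volume to 50%
```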
Human 3D Tracking Interface
This interface implements different human 3D body and facial tracking functionalities, including full-body skeleton tracking, hand positions and gestures, and face and emotion recognition, using the Intel RealSense camera and the Nuitrack SDK.
PUBLISHERS

| Interface Name | Data Type | Description |
|---|---|---|
| /qt_nuitrack_app/skeletons | 'qt_nuitrack_app/Skeletons' | publishes human full-body skeleton information |
| /qt_nuitrack_app/hands | 'qt_nuitrack_app/Hands' | publishes human 3D hand position and state |
| /qt_nuitrack_app/gestures | 'qt_nuitrack_app/Gestures' | publishes basic recognized human gestures |
| /qt_nuitrack_app/faces | 'qt_nuitrack_app/Faces' | publishes human face detection, orientation, eye and facial feature positions, age/gender detection and facial emotion recognition |
| /camera/color/image_raw | 'sensor_msgs/Image' | publishes the standard ROS 2D camera image |
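A minimal sketch of subscribing to the skeleton stream; the Skeletons message is assumed to be generated by the qt_nuitrack_app package, and the callback simply logs the raw message since the exact fields depend on the installed message definition:

```python
#!/usr/bin/env python3
import rospy
# Assumption: the Skeletons message is generated by the qt_nuitrack_app package.
from qt_nuitrack_app.msg import Skeletons

def skeleton_callback(msg):
    # Log the raw message; inspect it to see the available fields on your robot.
    rospy.loginfo("received skeleton data: %s", msg)

rospy.init_node('skeleton_listener')
rospy.Subscriber('/qt_nuitrack_app/skeletons', Skeletons, skeleton_callback)
rospy.spin()
```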