Version: QTrobot V2

Using QTrobot interfaces

Overview
 Level:  Intermediate
 Goal:  learn how to access QTrobot interfaces such as speech, emotion, gesture, etc in C++ via ROS publishers and services
 Requirements:

If you have followed our previous tutorials, you should already know how to start coding on QTrobot in C++ and have a basic knowledge of the ROS framework. In this tutorial we will learn how to access QTrobot interfaces, such as the speech and audio interfaces, using ROS publishers and service calls.

Create a C++ project

First we create a C++ project for our tutorial. Let's call it tutorial_qt_interfaces and add the required C++ file:

cd ~/catkin_ws/src
catkin_create_pkg tutorial_qt_interfaces std_msgs roscpp -D "Command QTrobot via ROS"
cd tutorial_qt_interfaces/src
touch tutorial_qt_interfaces_node.cpp

Remember to update the tutorial_qt_interfaces/CMakeLists.txt file correspondingly:

add_executable(${PROJECT_NAME}_node src/tutorial_qt_interfaces_node.cpp)
# ...
target_link_libraries(${PROJECT_NAME}_node ${catkin_LIBRARIES} )
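Once the node file is in place, you can build and run it from your catkin workspace. The following is a typical sequence; adjust the workspace path if yours differs:

```shell
cd ~/catkin_ws
catkin_make
source devel/setup.bash
rosrun tutorial_qt_interfaces tutorial_qt_interfaces_node
```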

Using ROS publishers

Most of QTrobot’s functionalities can be accessed in both blocking and non-blocking modes using ROS publish/subscribe and Service/Client interfaces. First we see how we can access those interfaces using ROS publishers.

QTrobot text to speech interface

Open the tutorial_qt_interfaces_node.cpp file and add the following code:

#include "ros/ros.h"
#include "std_msgs/String.h"

int main(int argc, char **argv) {
    ros::init(argc, argv, "my_cpp_tutorial_node");
    ROS_INFO_STREAM("my_cpp_tutorial_node is started");
    ros::NodeHandle n;

    ros::Publisher speechSay_pub = n.advertise<std_msgs::String>("/qt_robot/speech/say", 10);
    ros::Duration(3.0).sleep();

    std_msgs::String msg;
    msg.data = "Hello! I am Q.T!";
    speechSay_pub.publish(msg);

    ros::spin();
    return 0;
}

First we included std_msgs/String.h from the ROS standard message library. This message type is used by the /qt_robot/speech/say topic to communicate with the QTrobot speech (TTS) interface.

Tip

How do we know which message type an interface uses? There is a useful ROS command that tells you:

rostopic info /qt_robot/speech/say
Type: std_msgs/String
...

Then we created a publisher on the /qt_robot/speech/say topic and waited a few seconds to let the connection with the interface be established.

Tip

In the above example, for simplicity we used a few seconds of delay with the sleep function. A more appropriate way is to wait until the connection with a subscriber is established, by checking speechSay_pub.getNumSubscribers()!
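Such a wait loop can be sketched as follows, replacing the fixed `ros::Duration(3.0).sleep()` from the example above:

```cpp
// Wait until the speech interface has subscribed to our topic,
// polling every 100 ms instead of sleeping for a fixed 3 seconds.
while (ros::ok() && speechSay_pub.getNumSubscribers() == 0) {
    ros::Duration(0.1).sleep();
}
```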


Finally, we published a text message to the QTrobot speech interface, which makes the robot read that message aloud.
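You can also try the same interface directly from a terminal on QTrobot using the standard rostopic tool, which is handy for a quick check before writing any code:

```shell
# Publish a single message (-1) to the speech interface and exit;
# the robot should read the text aloud.
rostopic pub -1 /qt_robot/speech/say std_msgs/String "data: 'Hello! I am Q.T!'"
```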

QTrobot talk text interface

The /qt_robot/behavior/talkText interface is similar to the /qt_robot/speech/say interface, the only difference being that the talkText interface asks QTrobot to move its lips while reading the text message. To try it, just add the following lines to our code:

ros::Publisher behaviorTalkText_pub = n.advertise<std_msgs::String>("/qt_robot/behavior/talkText", 10);
msg.data = "Hello! I am Q.T!";
behaviorTalkText_pub.publish(msg);

QTrobot emotion interface

Now let's show an emotion on QTrobot's face. QTrobot comes with plenty of predefined emotion animations. You can find the complete list of available emotions either using the QTrobot Educator app or by looking into the ~/robot/data/emotions folder on QTRP.

Add the following lines to our code to show the 'happy' emotion under the 'QT' category on QTrobot's face:

ros::Publisher emotionShow_pub = n.advertise<std_msgs::String>("/qt_robot/emotion/show", 10);
msg.data = "QT/happy";
emotionShow_pub.publish(msg);
Note

As shown in the above example, you should not give the emotion's file extension (.avi) to the interface!

QTrobot gesture interface

Now let's have QTrobot play a gesture. QTrobot comes with plenty of predefined gestures. You can find the complete list of available gestures either using the QTrobot Educator app or by looking into the ~/robot/data/gestures folder on QTRP.

Add the following lines to our code to play the 'clapping' gesture under the 'QT' category:

ros::Publisher gesturePlay_pub = n.advertise<std_msgs::String>("/qt_robot/gesture/play", 10);
msg.data = "QT/clapping";
gesturePlay_pub.publish(msg);
Note

As shown in the above example, you should not give the gesture's file extension (.xml) to the interface!

QTrobot audio interface

Now let's play an audio file on QTrobot. QTrobot comes with some audio examples. You can find the complete list of available audio files either using the QTrobot Educator app or by looking into the ~/robot/data/audios folder on QTRP. QTrobot can play both WAV and MP3 files.

Add the following lines to our code to play the 'Komiku_Glouglou' audio file under the 'QT' category:

ros::Publisher audioPlay_pub = n.advertise<std_msgs::String>("/qt_robot/audio/play", 10);
msg.data = "QT/Komiku_Glouglou";
audioPlay_pub.publish(msg);
Note

As shown in the above example, you do not need to give the audio file's extension (.wav or .mp3) to the interface!

Using ROS services

In the above examples, we have seen how to access those interfaces using ROS publishers. In the following sections, we will learn how to call those interfaces using ROS service calls. The difference is that with a service call, our code accesses the QTrobot interfaces in blocking mode; meaning that, for example, if we ask QTrobot to play a gesture, our code waits until the gesture is completed before moving to the next line of code.

Now let's try some of QTrobot services using ROS service call.

Update include_directories of tutorial_qt_interfaces/CMakeLists.txt as follows:

include_directories(
${CATKIN_DEVEL_PREFIX}/include
${catkin_INCLUDE_DIRS}
)

Modify the tutorial_qt_interfaces_node.cpp with the following code:

#include "ros/ros.h"
#include "qt_robot_interface/speech_say.h"

int main(int argc, char **argv) {
    ros::init(argc, argv, "my_cpp_tutorial_node");
    ROS_INFO_STREAM("my_cpp_tutorial_node is started");
    ros::NodeHandle n;

    ros::ServiceClient speechSay = n.serviceClient<qt_robot_interface::speech_say>("/qt_robot/speech/say");
    speechSay.waitForExistence();
    qt_robot_interface::speech_say cmd;
    cmd.request.message = "Hello! I am QT!";
    ROS_INFO_STREAM("calling speechSay...");
    speechSay.call(cmd);
    ROS_INFO("speechSay return status: %s", (cmd.response.status) ? "okay" : "error");

    ros::spin();
    return 0;
}

ROS services are defined by srv files, which contain a request and a response object. These objects are already compiled into C++ classes by the catkin build system. All we need to do is include them in our C++ code (e.g. #include "qt_robot_interface/speech_say.h"). Then we created a client instance of the speech_say service. The optional waitForExistence() waits for the speech service to become available. We also created a request command with our desired message, and finally called the service and checked its return status. The speech_say service request has only one parameter, which is the message to be read by QTrobot.

Tip

How do we know which parameters a service needs? There is a useful ROS command that tells you:

rosservice info /qt_robot/speech/say
Type: qt_robot_interface/speech_say
Args: message
...
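The same service can also be called once from the terminal, which is a quick way to verify it is running before writing any code:

```shell
# Blocks until the robot has finished speaking, then prints the response status
rosservice call /qt_robot/speech/say "message: 'Hello! I am Q.T!'"
```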

Using ROS subscribers

Let's see how we can use ROS subscribers to read some data from QTrobot. For this example, we will read the output of the Nuitrack gesture detector. This data is published by the QTrobot Nuitrack interface on the /qt_nuitrack_app/gestures topic. You can read more about all the available data from Nuitrack in Human 3D Tracking Interface. The QTrobot gesture message has the following fields:

qt_nuitrack_app/GestureInfo[] gestures
int32 id
string name
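To see these fields live, you can echo the topic from a terminal while performing gestures in front of the robot:

```shell
# Prints each detected gesture message as it is published
rostopic echo /qt_nuitrack_app/gestures
```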

Modify the tutorial_qt_interfaces_node.cpp with the following code:

#include "ros/ros.h"
#include "qt_robot_interface/speech_say.h"
#include "qt_nuitrack_app/Gestures.h"

void callback(const qt_nuitrack_app::Gestures::ConstPtr& gestures) {
    for(size_t i = 0; i < gestures->gestures.size(); i++) {
        ROS_INFO_STREAM("Detected gesture: " << gestures->gestures[i].name << " by user: " << gestures->gestures[i].id);
    }
}

int main(int argc, char **argv) {
    ros::init(argc, argv, "my_cpp_tutorial_node");
    ROS_INFO_STREAM("my_cpp_tutorial_node is started");
    ros::NodeHandle n;

    ros::Subscriber subGestures = n.subscribe("/qt_nuitrack_app/gestures", 10, callback);
    ROS_INFO_STREAM("Stay in front of QTrobot around 1-2m away and raise your arm!");

    ros::spin();
    return 0;
}

First we defined a callback to receive the gestures message and print the gesture name and the user identifier. Then in our main code, we subscribed to the /qt_nuitrack_app/gestures topic. If we run the code, stand in front of QTrobot (approximately 1-2m away from its camera), and raise one arm and lower it, our code will print out SWIPE UP and SWIPE DOWN. You can see an example with more gestures in the QTrobot - 3D Camera - Gesture recognition demo, or check the list of available gestures on the Nuitrack official website.