Version: QTrobot V1

Human facial expression detection using ROS blocks

Overview

 Level: Advanced
 Goal: Learn how to detect human facial expressions using ROS blocks
 Requirements:

In this example, we create an emotion imitation application using our blocks from QTrobot Studio. Here is the scenario:

QTrobot looks for a person's face and recognizes one of three emotions: happy, angry, and surprise. QTrobot then imitates that emotion by showing the corresponding facial expression. If no person appears within a certain time, QTrobot looks around, randomly moving its head left or right in search of a human face.

How does QTrobot's emotion recognition work?

Now let's see how we can easily implement our scenario in QTrobot Studio. The QTrobot Nuitrack interface (qt_nuitrack_app) runs on the robot. Using QTrobot's camera, qt_nuitrack_app detects whether somebody is standing in front of QTrobot and reads his or her facial expressions/emotions. If we stand in front of QTrobot, qt_nuitrack_app recognizes the facial expression and publishes the corresponding message to the /qt_nuitrack_app/faces topic. The topic uses a message of type qt_nuitrack_app/Faces. Within this message, the value of faces is an array of FaceInfo (multiple faces):

qt_nuitrack_app/FaceInfo[] faces
  int32 id
  string gender
  int32 age_years
  string age_type
  float64 emotion_neutral
  float64 emotion_angry
  float64 emotion_happy
  float64 emotion_surprise
  float64[] rectangle
  float64[] left_eye
  float64[] right_eye
  float64[] angles

For our scenario we are interested in emotion_happy, emotion_angry and emotion_surprise. The value of each of these fields is the confidence level of the detected emotion, ranging from 0.0 to 1.0; a higher value represents higher confidence in recognizing the corresponding emotion. In our example, we consider values above 0.9 confident enough.
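
To make the message content concrete before we move to the blocks, here is a minimal Python sketch that subscribes to the topic and prints the three confidence values. It assumes the qt_nuitrack_app message definitions are importable on the robot; the node name is arbitrary:

#!/usr/bin/env python
import rospy
from qt_nuitrack_app.msg import Faces

def face_callback(msg):
    # msg.faces is an array of FaceInfo entries, one per detected face
    for face in msg.faces:
        rospy.loginfo("face %d: happy=%.2f angry=%.2f surprise=%.2f",
                      face.id, face.emotion_happy,
                      face.emotion_angry, face.emotion_surprise)

rospy.init_node('face_emotion_reader')
rospy.Subscriber('/qt_nuitrack_app/faces', Faces, face_callback)
rospy.spin()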

Implementation

First, we try to read a message from the /qt_nuitrack_app/faces topic using the ROS Subscriber block and check that the message is not empty. After that, we extend it to read a detected emotion's confidence level and react accordingly. Once we have our main building blocks, we implement the logic of the game as described in the scenario. Finally, we put all the pieces together and add some speech messages to make our game more interesting.

1. Read the detected emotion

We can use the ROS Subscriber block to read the messages published on the /qt_nuitrack_app/faces topic, as shown here:
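
For comparison, the block's one-shot behavior roughly corresponds to rospy.wait_for_message in Python. A minimal sketch, again assuming the qt_nuitrack_app message definitions are installed:

import rospy
from qt_nuitrack_app.msg import Faces

rospy.init_node('read_faces_once')
try:
    # block until one message arrives, or give up after 10 seconds
    faces_msg = rospy.wait_for_message('/qt_nuitrack_app/faces',
                                       Faces, timeout=10.0)
    print("detected %d face(s)" % len(faces_msg.faces))
except rospy.ROSException:
    faces_msg = None  # nobody appeared within the timeout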

2. Extend the blocks to extract the emotion confidence level

In our scenario we are interested in knowing which emotion the user has shown and reacting accordingly.

Our ROS Subscriber block waits until a human facial emotion is detected (storing it in the faces variable) or 10 seconds pass without any person appearing in front of QTrobot. Then we check whether any face has been detected by simply checking the validity of faces. If valid, this variable holds an array of detected faces. In our scenario we are interested in the first detected face, so we get the first item of the faces array. We call this my_face, which is of type FaceInfo. Then we simply check the confidence value of each emotion of interest. For example, if the confidence value of emotion_happy is more than 0.9, we show QTrobot's happy emotion. We do the same for the other emotions.
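As a rough Python equivalent of this logic (a sketch, not the Studio implementation: the /qt_robot/emotion/show topic and the QT/happy, QT/angry and QT/surprise emotion names come from QTrobot's standard emotion interface and should be checked against the emotions installed on your robot):

import rospy
from std_msgs.msg import String

rospy.init_node('emotion_imitator')
emotion_pub = rospy.Publisher('/qt_robot/emotion/show', String, queue_size=1)

def react_to_face(faces_msg):
    if not faces_msg or not faces_msg.faces:
        return  # no face detected
    my_face = faces_msg.faces[0]  # first detected face (FaceInfo)
    # show the QTrobot emotion whose confidence exceeds 0.9
    if my_face.emotion_happy > 0.9:
        emotion_pub.publish("QT/happy")
    elif my_face.emotion_angry > 0.9:
        emotion_pub.publish("QT/angry")
    elif my_face.emotion_surprise > 0.9:
        emotion_pub.publish("QT/surprise")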


3. Make QTrobot look around

Now we just need to implement the look-around part in case no one appears in front of the robot. We want to move only the robot's head yaw joint (left and right) to a random position. For this purpose, we use the standard random-integer block to create a message with a random number between -40 and 40 degrees (for the HeadYaw joint). Then we simply publish it to the /qt_robot/head_position/command topic. Using the random-integer block, we also show a yawning face at random every few times that QTrobot does not see any face.
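
In Python, the same look-around step might look like the sketch below; it assumes, as in the standard QTrobot interface, that /qt_robot/head_position/command takes a std_msgs/Float64MultiArray with [HeadYaw, HeadPitch] in degrees:

import random
import rospy
from std_msgs.msg import Float64MultiArray

rospy.init_node('look_around')
head_pub = rospy.Publisher('/qt_robot/head_position/command',
                           Float64MultiArray, queue_size=1)

def look_around():
    # move only the HeadYaw joint to a random angle; keep HeadPitch at 0
    cmd = Float64MultiArray()
    cmd.data = [float(random.randint(-40, 40)), 0.0]
    head_pub.publish(cmd)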

Let's see how we can implement it using our blocks:


4. Put it all together

Now that we have all our building blocks, we can put them together and finalize our scenario. We wrap all blocks in the LuxAI repeat until I press stop block. This repeat block keeps our game running until we stop it using the Educator Tablet. Then we add a condition block (IF) to check for detection of the user. If the user is not present for 10 seconds, QTrobot will look around. Here is the complete source of our emotion imitation game:
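
For reference, here is the whole scenario as one stand-alone Python sketch, under the same assumptions as the snippets above (the Studio blocks remain the intended implementation):

#!/usr/bin/env python
import random
import rospy
from std_msgs.msg import String, Float64MultiArray
from qt_nuitrack_app.msg import Faces

rospy.init_node('emotion_imitation_game')
emotion_pub = rospy.Publisher('/qt_robot/emotion/show', String, queue_size=1)
head_pub = rospy.Publisher('/qt_robot/head_position/command',
                           Float64MultiArray, queue_size=1)
rospy.sleep(1.0)  # give the publishers time to connect

while not rospy.is_shutdown():
    try:
        # wait up to 10 seconds for a faces message
        msg = rospy.wait_for_message('/qt_nuitrack_app/faces', Faces,
                                     timeout=10.0)
    except rospy.ROSException:
        msg = None
    if msg and msg.faces:
        my_face = msg.faces[0]
        if my_face.emotion_happy > 0.9:
            emotion_pub.publish("QT/happy")
        elif my_face.emotion_angry > 0.9:
            emotion_pub.publish("QT/angry")
        elif my_face.emotion_surprise > 0.9:
            emotion_pub.publish("QT/surprise")
    else:
        # nobody in sight: look around by moving HeadYaw to a random angle
        cmd = Float64MultiArray()
        cmd.data = [float(random.randint(-40, 40)), 0.0]
        head_pub.publish(cmd)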