Version: QTrobot V2

Create an interactive memory game using human gestures

Overview

Level: Advanced
Goal: Learn how to create an interactive gesture memory game using the QTrobot Nuitrack interface
Requirements:

In this tutorial, we implement a simple gesture memory game using QTrobot Studio and the QTrobot Nuitrack interface for gesture recognition. QTrobot shows a sequence of gestures to the user, and the user should remember them and repeat the same sequence. QTrobot uses three simple gestures ("SWIPE_UP", "SWIPE_RIGHT", "SWIPE_LEFT"). After showing the gestures, QTrobot asks the user to show them back one by one and in the correct order. The robot starts with a short sequence and adds more gestures as the game progresses. Here is a short video of the scenario:

How does QTrobot's gesture recognition work?

Now let's see how we can easily implement our scenario in QTrobot Studio. The QTrobot Nuitrack interface (qt_nuitrack_app) runs on the robot. Using QTrobot's camera, qt_nuitrack_app detects whether somebody standing in front of QTrobot is showing a gesture. If we stand in front of QTrobot and show one of the above-mentioned gestures, qt_nuitrack_app recognizes it and publishes the corresponding message to the /qt_nuitrack_app/gestures topic. The topic uses a message of type qt_nuitrack_app/Gestures. For example, if we stand in front of QTrobot and show the SWIPE UP gesture, the following message will be published:

gestures:
  -
    id: 1
    name: "SWIPE UP"
---
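Outside of Studio, a minimal Python sketch that reads the same topic could look like this (it assumes rospy and the qt_nuitrack_app message definitions are available on the robot; the node name is just an example):

#!/usr/bin/env python
import rospy
from qt_nuitrack_app.msg import Gestures

def gesture_callback(msg):
    # msg.gestures is a list; each entry carries a gesture name such as "SWIPE UP"
    for gesture in msg.gestures:
        rospy.loginfo("Detected gesture: %s", gesture.name)

if __name__ == '__main__':
    rospy.init_node('gesture_echo_example')
    rospy.Subscriber('/qt_nuitrack_app/gestures', Gestures, gesture_callback)
    rospy.spin()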

Implementation

First, let's see how we can implement the above scenario using gesture recognition and QTrobot Studio.

First, we read a message from the /qt_nuitrack_app/gestures topic using the ROS Subscriber block and check that the message is not empty. Then we extend it so that we can read the name of the detected gesture. Once we have these main building blocks, we implement the game logic as described in the scenario, and finally we put all the pieces together and add some speech messages to make our game more interesting.

1. Read the detected gesture

We can use the ROS Subscriber block to read the message published on the /qt_nuitrack_app/gestures topic, as shown here:

2. Extend the blocks to extract the gesture name and ID

In our scenario we want to know which gesture the user has shown and save it as an ID. To facilitate this, we can implement a subprogram (also known as a function) to do that. We call it ReadGesture. Our subprogram waits for a gesture to be detected. If a gesture is detected, it returns the gesture ID (e.g. 1 for SWIPE_RIGHT or 2 for SWIPE_LEFT). If no gesture is detected within 30 seconds, our subprogram returns -1. The following blocks show the implementation of our subprogram:


Now that we have our subprogram, we can use it like any other block.
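For reference, here is a rough Python sketch of the same ReadGesture logic outside of Studio. The helper name, the timeout handling and the name-to-ID mapping are illustrative choices; the gesture name strings follow the qt_nuitrack_app output shown earlier (e.g. "SWIPE UP"):

import rospy
from qt_nuitrack_app.msg import Gestures

# Game-specific mapping from gesture names to IDs (an illustrative choice)
GESTURE_IDS = {'SWIPE RIGHT': 1, 'SWIPE LEFT': 2, 'SWIPE UP': 3}

def read_gesture(timeout=30):
    """Wait for a gesture and return its game ID, or -1 on timeout."""
    try:
        msg = rospy.wait_for_message('/qt_nuitrack_app/gestures',
                                     Gestures, timeout=timeout)
    except rospy.ROSException:
        return -1                      # no gesture detected in time
    for gesture in msg.gestures:
        if gesture.name in GESTURE_IDS:
            return GESTURE_IDS[gesture.name]
    return -1                          # an unrelated gesture was detected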

3. Showing sequence of gestures by QTrobot

As explained in our scenario, QTrobot first shows a sequence of gestures ("SWIPE_UP", "SWIPE_RIGHT", "SWIPE_LEFT") to the user. The number of gestures in the sequence increases as the game progresses. One simple way to implement this is to use a list of the numbers 1, 2 and 3. In our subprogram we already extracted gesture IDs, so we can say that 1 represents SWIPE_RIGHT, 2 represents SWIPE_LEFT and 3 represents SWIPE_UP. Here are the steps we can follow to implement that:

  • first we create a list which contains one random number between 1 and 3.
  • then at each step of the game we add one more random number (see the short sketch after this example). For example, our list can look like this as the game progresses:
step 1: 1
step 2: 1, 2
step 3: 1, 2, 2
step 4: 1, 2, 2, 3
...
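In plain Python, this growing list could be produced as in the short sketch below; robot_mem is the same list variable used in the blocks that follow:

import random

robot_mem = []                                # the robot's gesture sequence
for step in range(4):
    robot_mem.append(random.randint(1, 3))    # add one random gesture ID per round
    print('step %d: %s' % (step + 1, robot_mem))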

Let's see how we can implement it using our blocks:


As shown above, first we create an empty list and call it robot_mem. Then, in a repeat loop, we use the list insert block to randomly add one of the numbers from 1 to 3 to our robot_mem list. So at first our list has only one item, and as the game progresses it gets more items. Then we use a ForEach loop block to go through each item in the robot_mem list and check which number it is:

  • if it's 1, we use the Show-says-act block to say Right and play the show_right gesture,
  • if it's 2, we use the Show-says-act block to say Left and play the show_left gesture,
  • if it's 3, we use the Show-says-act block to say Up and play the one_arm_up gesture.
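A rough Python equivalent of these blocks is sketched below. The /qt_robot/behavior/talkText and /qt_robot/gesture/play topics and the exact gesture names are assumptions here; speech_pub and gesture_pub are std_msgs/String publishers on those assumed topics (the Show-says-act block wraps the robot's speech and gesture interfaces for you):

import rospy
from std_msgs.msg import String

# Illustrative mapping from sequence IDs to speech text and gesture names
SHOW_MAP = {1: ('Right', 'show_right'),
            2: ('Left',  'show_left'),
            3: ('Up',    'one_arm_up')}

def show_sequence(robot_mem, speech_pub, gesture_pub):
    """Say and act out every gesture ID stored in robot_mem."""
    for item in robot_mem:
        text, gesture = SHOW_MAP[item]
        speech_pub.publish(text)       # assumed topic: /qt_robot/behavior/talkText
        gesture_pub.publish(gesture)   # assumed topic: /qt_robot/gesture/play
        rospy.sleep(3)                 # give the robot time to finish the gesture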

4. Checking the sequence shown by user

After QTrobot shows the sequence, we ask the user to do the same and check the sequence they show. Above, we have already implemented our subprogram block to recognize the gesture shown by the user. So we just need to go through our robot_mem list again, read the gesture shown by the user and compare it with each item in the list. As soon as we see a wrong gesture, we say that the user has lost and we can stop the game; otherwise, we say the name of the gesture that the user has shown. Here is the block implementation of what we have just explained:
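Reusing the read_gesture() helper and the SHOW_MAP sketched earlier, a rough Python equivalent of these blocks could look like this:

def check_user_sequence(robot_mem, speech_pub):
    """Return True if the user repeats the whole sequence, False on a wrong
    gesture, and None if no gesture is shown before the timeout."""
    for expected in robot_mem:
        shown = read_gesture(timeout=30)         # helper sketched in step 2
        if shown == -1:
            return None                          # the user seems to have left the game
        if shown != expected:
            speech_pub.publish('That was not the right gesture. You lost!')
            return False
        speech_pub.publish(SHOW_MAP[shown][0])   # name the correct gesture
    return True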


5. Put it all together

Now that we have all our building blocks for showing the robot's sequence and checking the sequence shown by the user, we can put them together and finalize our scenario. We add some introductory messages at the beginning of the game to explain the game scenario to the user. Then we wrap all blocks in the LuxAI repeat until I press stop block. This repeat block keeps our game running until we stop it using the Educator Tablet. However, we also want to stop the game if the user leaves. In our subprogram, we already have the logic to know whether the user is still playing or has left the game, thanks to the timeout of the ROS Subscriber block. We create a simple variable called game_finished and set it to false. Then we add a condition block (IF) so that the loop does not repeat whenever game_finished is true. We also add a condition where we check the gesture shown by the user: if our subprogram for detecting gestures returns -1, we know that the user has left the game, so we set the game_finished variable to true and ask QTrobot to react accordingly. Here is the complete source of our memory game:
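For comparison, putting the earlier Python sketches together gives a rough outline of the same game outside of Studio. The repeat until I press stop block is approximated here by a simple loop with the game_finished flag, the topics are the same assumptions as above, and all spoken texts are placeholders:

import random
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node('gesture_memory_game')
    # Assumed QTrobot speech and gesture interfaces (wrapped by the Studio blocks)
    speech_pub = rospy.Publisher('/qt_robot/behavior/talkText', String, queue_size=10)
    gesture_pub = rospy.Publisher('/qt_robot/gesture/play', String, queue_size=10)
    rospy.sleep(1)                                # give the publishers time to connect

    speech_pub.publish('Let us play a memory game. Watch my gestures '
                       'and repeat them in the same order.')
    rospy.sleep(5)

    robot_mem = []
    game_finished = False
    while not game_finished and not rospy.is_shutdown():
        robot_mem.append(random.randint(1, 3))    # grow the sequence by one gesture
        show_sequence(robot_mem, speech_pub, gesture_pub)
        speech_pub.publish('Now it is your turn.')
        result = check_user_sequence(robot_mem, speech_pub)
        if result is None:
            speech_pub.publish('It seems you left the game. Goodbye!')
            game_finished = True
        elif result is False:
            game_finished = True                  # the user lost
        else:
            speech_pub.publish('Well done! Let us make the sequence longer.')

if __name__ == '__main__':
    main()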