Gesture With Machine Learning User's Guide

Table of Contents

Overview

This lab demonstrates the use of TI mmWave sensors for gesture recognition applications. The range, velocity, and angle data from mmWave sensors enable the detection and classification of several natural gestures. The example provided in this demo can recognize 9 distinct hand gestures: Left swipe, Right swipe, Up swipe, Down swipe, Clockwise twirl, Counterclockwise twirl, On gesture, Off gesture, and Shine gesture.

📝 NOTE This demo is compatible with both the xWR6443 and xWR6843, as it only uses the on-chip hardware FFT accelerator (HWA) and does not utilize the on-chip C674x DSP. While this demo is intended for the xWR6443 device, an xWR6843 device can be used for emulation.

Requirements

Hardware Requirements

| Item | Details |
|------|---------|
| Device | xWR6843ISK-ODS ES2.0 Antenna Module or xWR6843AOP ES2.0 Antenna Module |
| MMWAVEICBOOST Carrier Board | OPTIONAL: MMWAVEICBOOST Carrier Board for CCS based development and debugging |
| Computer | PC with Windows 10. If a laptop is used, please use the ‘High Performance’ power plan in Windows. |
| Micro USB Cable | |
| Power Supply | 5V, >3.0A with 2.1-mm barrel jack (center positive). The power supply can be wall adapter style or a battery pack with a USB to barrel jack cable. |

📝 NOTE Both AWR6843ISK-ODS and IWR6843ISK-ODS are supported and can be used interchangeably. Please consult the respective datasheets for details on the differences between the devices.

Software Requirements

| Tool | Version | Download Link |
|------|---------|---------------|
| TI mmWave SDK | 3.5.0.x | TI mmWave SDK 3.5.0.x and all the related tools are required to be installed as specified in the mmWave SDK release notes |
| Uniflash | Latest | The Uniflash tool is used for flashing TI mmWave Radar devices. Download the offline tool or use the Cloud version |
| TI Radar Toolbox | Latest | The Radar Toolbox should be downloaded to access binaries and source code. Download instructions are in the readme file. |

Getting familiar with the device

⚠️ Run Out of Box Demo
Before continuing with this lab, users should first run the out of box demo for the EVM. This will enable users to gain familiarity with the sensor’s capabilities as well as the various tools used across all labs in the Radar Toolbox.

Quickstart

1. Configure the EVM for Flashing Mode

2. Flash the EVM using Uniflash

Flash one of the binaries listed below using UniFlash. Follow the instructions for using UniFlash.

Both binaries can be found at: <MMWAVE_TOOLBOX_INSTALL_DIR>\source\ti\examples\Gesture_Recognition\Gesture_with_Machine_Learning\prebuilt_binaries\

3. Configure the EVM for Functional Mode

4. Run the Lab

1. Open Visualizer

Navigate to <RADAR_TOOLBOX_INSTALL_DIR>\tools\visualizers\Industrial Visualizer. The visualizer may be run from the Python 3.7.3 source or as a standalone executable. In general, running the visualizer from the Python source is faster and produces fewer errors over long periods of time. If running from source, first run the setUpEnvironment.bat script to ensure that all the correct packages and versions are installed, then launch the visualizer with python gui_main.py. Alternatively, run the mmWaveIndustrialVisualizer.exe executable.

2. Select the arrowed options as shown in the below graphic.

  1. Set the Device to IWR6843 as indicated by arrow 1.

  2. Set CLI COM and DATA COM to the appropriate COM ports as indicated by arrow 2. If the IWR6843 EVM is plugged in when the visualizer is opened, then the COM ports are filled in automatically. If they do not appear, check the Device Manager on Windows for the appropriate ports.

  3. Select the Gesture with Machine Learning demo as indicated by arrow 3.

  4. Press the “Connect” button as indicated by arrow 4.

  5. Press the “Start without Send Configuration” button as indicated by arrow 5.

  6. Change to the ‘Gesture’ tab as indicated by arrow 6.

3. Run the Demo

This lab demonstrates the following gestures, each shown in a demonstration video at 0.5x speed:
Right to Left Swipe
Left to Right Swipe
Up to Down Swipe
Down to Up Swipe
Clockwise Twirl
Counterclockwise Twirl
On Gesture
Off Gesture
Shine Gesture

Perform the above gestures with your hand in front of the sensor. The detected gesture will be displayed in the ‘Gesture’ tab of the visualizer.

This concludes the Quickstart section.

Developer’s Guide

Import Lab Project to CCS

To import the source code into your CCS workspace, a CCS project is provided in the lab at the path given below.

🛑 Error during Import to IDE
If an error occurs, check that the software dependencies listed above have been installed. Errors will occur if necessary files are not installed in the correct location for importing.

Build the Lab

Selecting Rebuild instead of Build ensures that the project is always re-compiled. This is especially important in case the previous build failed with errors.

🛑 Build Fails with Errors
If the build fails with errors, please ensure that all the software requirements are installed as listed above and in the mmWave SDK release notes.

📝 NOTE
As mentioned in the Quickstart section, pre-built binary files (both debug and deployment binaries) are provided in the prebuilt_binaries directory of the lab.

Execute the Lab

There are two ways to execute the compiled code on the EVM:

1. Deployment mode: flash the compiled binary onto the EVM using UniFlash, as described in the Quickstart section.

2. Debug mode: load and run the executable through CCS using the MMWAVEICBOOST carrier board, which enables breakpoints and other debugging tools.

UART Output Data Format

This demo outputs data using a TLV (type-length-value) encoding scheme with little-endian byte order. For every frame, a packet is sent consisting of a fixed-size frame header followed by a variable number of TLVs, depending on what was detected in that scene. The TLVs for this demo include the extracted features used for the neural network inference as well as the raw neural network output probabilities for each gesture.

Frame Header

Size: 40 bytes

frameHeaderStructType = struct(...
    'magicWord',                {'uint64', 8}, ... % syncPattern in hex is: '02 01 04 03 06 05 08 07' 
    'version',                  {'uint32', 4}, ... % Software Version
    'totalPacketLen',           {'uint32', 4}, ... % In bytes, including header
    'platform',                 {'uint32', 4}, ... % Platform identifier, e.g. 0xA6843
    'frameNumber',              {'uint32', 4}, ... % Frame Number
    'timeStamp',                {'uint32', 4}, ... % Message create time in cycles
    'numDetectedObj',           {'uint32', 4}, ... % Number of detected points in this frame
    'numTLVs' ,                 {'uint32', 4}, ... % Number of TLVs in this frame
    'subFrameNumber',           {'uint32', 4});    % Sub-Frame number
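
For reference, below is a minimal Python sketch of locating and unpacking this header from a raw UART byte stream. The field layout follows the struct above; the function and variable names are illustrative and not part of the demo source.

```python
import struct

# Magic word bytes as they appear on the wire (little-endian sync pattern)
MAGIC_WORD = bytes([0x02, 0x01, 0x04, 0x03, 0x06, 0x05, 0x08, 0x07])
FRAME_HEADER_FMT = '<Q8I'  # uint64 magic word followed by eight uint32 fields
FRAME_HEADER_LEN = struct.calcsize(FRAME_HEADER_FMT)  # 40 bytes

def parse_frame_header(buf):
    """Find the magic word in buf and unpack the 40-byte frame header."""
    start = buf.find(MAGIC_WORD)
    if start < 0 or len(buf) - start < FRAME_HEADER_LEN:
        return None  # no complete header available yet
    (_, version, total_len, platform, frame_num,
     timestamp, num_obj, num_tlvs, subframe) = struct.unpack_from(
        FRAME_HEADER_FMT, buf, start)
    return {'version': version, 'totalPacketLen': total_len,
            'platform': platform, 'frameNumber': frame_num,
            'timeStamp': timestamp, 'numDetectedObj': num_obj,
            'numTLVs': num_tlvs, 'subFrameNumber': subframe,
            'headerStart': start}
```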

TLVs

The TLVs can be of type GESTURE FEATURES or ANN OUTPUT PROBABILITIES.

TLV Header

Size: 8 bytes

tlvHeaderStruct = struct(...
    'type',             {'uint32', 4}, ... % TLV object 
    'length',           {'uint32', 4});    % TLV object Length, in bytes, including TLV header 

Following the header is the TLV-type-specific payload.
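
A minimal Python sketch of walking the TLV list is shown below; it assumes the packet bytes begin at the frame header and that, per the struct above, the length field includes the 8-byte TLV header. Names are illustrative.

```python
import struct

TLV_HEADER_FMT = '<2I'                            # type and length, both uint32
TLV_HEADER_LEN = struct.calcsize(TLV_HEADER_FMT)  # 8 bytes

def iter_tlvs(packet, offset, num_tlvs):
    """Yield (type, payload) for each TLV following the frame header."""
    for _ in range(num_tlvs):
        tlv_type, tlv_len = struct.unpack_from(TLV_HEADER_FMT, packet, offset)
        payload = packet[offset + TLV_HEADER_LEN : offset + tlv_len]
        yield tlv_type, payload
        offset += tlv_len  # tlv_len includes the TLV header itself
```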

GESTURE FEATURES TLV

Size: sizeof (tlvHeaderStruct) + 40 bytes = 48 bytes

gestureFeatures = struct(...
    'weightedDoppler',              {'float', 4}, ...
    'weightedPositiveDoppler',      {'float', 4}, ...
    'weightedNegativeDoppler',      {'float', 4}, ...
    'weightedRange',                {'float', 4}, ...
    'numPoints',                    {'float', 4}, ...
    'weightedAzimuthMean',          {'float', 4}, ...
    'weightedElevationMean',        {'float', 4}, ...
    'azimuthDopplerCorrelation',    {'float', 4}, ...
    'weightedAzimuthDispersion',    {'float', 4}, ...
    'weightedElevationDispersion',  {'float', 4});

ANN OUTPUT PROBABILITIES

Size: sizeof (tlvHeaderStruct) + 40 bytes = 48 bytes

outputProbs = struct(...
    'probabilityNoGesture',     {'float', 4}, ...
    'probabilityGesture1',      {'float', 4}, ...
    'probabilityGesture2',      {'float', 4}, ...
    'probabilityGesture3',      {'float', 4}, ...
    'probabilityGesture4',      {'float', 4}, ...
    'probabilityGesture5',      {'float', 4}, ...
    'probabilityGesture6',      {'float', 4}, ...
    'probabilityGesture7',      {'float', 4}, ...
    'probabilityGesture8',      {'float', 4}, ...
    'probabilityGesture9',      {'float', 4});
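
Since both TLV payloads above are ten little-endian floats (40 bytes), a single unpack pattern covers them. A minimal sketch, with illustrative names:

```python
import struct

def parse_ten_floats(payload):
    """Unpack a 40-byte payload of ten little-endian floats."""
    return struct.unpack('<10f', payload)

# Usage: features = parse_ten_floats(payload) for a GESTURE FEATURES TLV,
# or probs = parse_ten_floats(payload) for an ANN OUTPUT PROBABILITIES TLV.
```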

Neural Network

This demo utilizes a neural network model which runs on the ARM core and makes predictions based on extracted features. A sliding-window buffer of features, extracted from the data across a number of frames, is used as input to the neural network. The model outputs a set of probabilities, each of which represents the probability that the input data belongs to a specific gesture.
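
The sliding-window idea can be illustrated with a short Python sketch. The window length of 15 frames and 6 features per frame follow the Layers section below; the helper names are hypothetical.

```python
from collections import deque
import numpy as np

NUM_FRAMES = 15    # frames in the sliding window (see Layers below)
NUM_FEATURES = 6   # features extracted per frame

window = deque(maxlen=NUM_FRAMES)  # oldest frame drops out automatically

def push_frame(frame_features):
    """Append one frame's features; return the 90-value input once full."""
    window.append(np.asarray(frame_features[:NUM_FEATURES], dtype=np.float32))
    if len(window) == NUM_FRAMES:
        return np.concatenate(list(window))  # shape (90,): network input vector
    return None
```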

The model is implemented in the form of header files which contain the weight and bias values for each layer in the neural network. These header files can be found in src/6443/include/neuralnet/.

Features

The model is trained on the following extracted features (a toy computation sketch follows the table).

| Extracted Feature | Description |
|-------------------|-------------|
| Doppler Average | Weighted average of the Doppler across the heatmap |
| Elevation Weighted Mean | Select the N cells of the heatmap with the highest magnitude and for each cell compute the elevation bin (via angle-FFT). The average elevation is the weighted average of all these elevation bin indices. |
| Azimuth Weighted Mean | Select the N cells of the heatmap with the highest magnitude and for each cell compute the azimuth bin (via angle-FFT). The average azimuth is the weighted average of all these azimuth bin indices. |
| Number of Detected Points | Number of cells in the heatmap that have a magnitude above a certain threshold |
| Range Average | Weighted average of the range across the heatmap |
| Doppler-Azimuth Correlation | Correlation showing how the Doppler and azimuth values are related across frames |
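
As a rough illustration of the weighted-average features above, the toy numpy sketch below computes the Doppler average, range average, and detected-point count from a range-Doppler heatmap; the demo's on-device computation may differ in detail.

```python
import numpy as np

def heatmap_features(heatmap, threshold):
    """Toy feature extraction from a range-Doppler heatmap
    (rows = range bins, columns = Doppler bins)."""
    mag = np.abs(heatmap)
    rng_idx, dop_idx = np.indices(mag.shape)
    total = mag.sum() + 1e-9                     # guard against divide-by-zero
    doppler_avg = (mag * dop_idx).sum() / total  # magnitude-weighted Doppler bin
    range_avg = (mag * rng_idx).sum() / total    # magnitude-weighted range bin
    num_points = int((mag > threshold).sum())    # cells above the threshold
    return doppler_avg, range_avg, num_points
```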

Layers

The structure of the neural network used in this demo can be seen in the diagram below. It consists of a 90-node input layer (6 features accumulated over the previous 15 frames), two hidden layers of size 30 and 60 respectively, each with a rectified linear activation (ReLU), and finally a softmax function that converts the output to 10 gesture probabilities.

Neural Network Structure
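
A minimal numpy sketch of this forward pass is shown below, assuming weight matrices and bias vectors of the sizes described above (the demo stores these in the header files mentioned earlier; the Python names here are illustrative).

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())        # subtract max for numerical stability
    return e / e.sum()

def ann_forward(x, W1, b1, W2, b2, W3, b3):
    """x: (90,) feature window; Wn/bn: per-layer weights and biases."""
    h1 = relu(W1 @ x + b1)         # hidden layer 1: 90 -> 30, ReLU
    h2 = relu(W2 @ h1 + b2)        # hidden layer 2: 30 -> 60, ReLU
    return softmax(W3 @ h2 + b3)   # output layer: 60 -> 10 probabilities
```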

Retraining

Users may wish to retrain the neural network model for various reasons, such as adding or removing certain gestures. The full process for retraining and deploying a model is not provided in this guide; however, one can save the extracted features that are output over UART and use them as training data in the model-building process, as sketched below.
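
For example, one could append each GESTURE FEATURES TLV received over UART to a labeled CSV file and use it later as training data. A minimal sketch, with hypothetical names:

```python
import csv

def log_features(path, label, feature_row):
    """Append one ten-float feature tuple plus a gesture label to a CSV file."""
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow(list(feature_row) + [label])

# Usage: log_features('training_data.csv', 'left_swipe', features)
```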

Need More Help?