Overview
===========

This lab demonstrates the use of TI mmWave sensors to count and track multiple people simultaneously at distances of up to 50m. Detection and tracking algorithms run onboard the IWR6843 mmWave sensor and are used to localize people and track their movement with a high degree of accuracy. mmWave sensors can reduce false detections in challenging environments such as direct sunlight, no light, fog, or smoke, and are particularly suited for privacy-conscious applications. In this demonstration, localization and tracking are performed on any moving object in the scene; static objects such as chairs, tables, and walls are ignored. The IWR6843 device outputs a data stream consisting of point cloud information and a list of tracked objects, which can be visualized using the software included in this lab.

{{y Detection range is extended to >100m for a human adult! Please see the results in section [4.2.2](#chirp_fov), and refer to the document "Beamforming_in_LRPD.pdf" to understand the new 100 meter chirp, which implements transmitter beamforming to increase effective range.}}

[[r! ES2.0 Devices Only
This lab only runs on ES2.0 devices.
]]

<img src="images/overview.gif" width="400"/> <img src="images/pplcount_overview_block2.png" width="450"/>

Quickstart
===========

1. Hardware and Software Requirements
-----------

### Hardware

Item | Details
--------------------------|-----------------
Device | [Industrial mmWave Carrier Board](http://www.ti.com/tool/MMWAVEICBOOST) and [IWR6843 Long Range Antenna Board](http://www.ti.com/tool/IWR6843ISK).
Mounting Hardware | The EVM needs to be mounted at a height of ~2.0-2.5m with a slight downtilt. An [adjustable clamp style smartphone adapter mount for tripods](https://www.amazon.com/Vastar-Universal-Smartphone-Horizontal-Adjustable/dp/B01L3B5PBI/) and a [60-75" tripod](https://www.amazon.com/Neewer-Portable-centimeters-Camcorder-kilograms/dp/B01N6JCW8F/) can be used to clamp and elevate the EVM.
Computer | PC with Windows 7 or 10. If a laptop is used, please use the 'High Performance' power plan in Windows.
Micro USB Cable | Due to the high mounting height of the EVM, an 8ft+ cable or USB extension cable is recommended.
Power Supply | 5V, 3A with 2.1-mm barrel jack (center positive). The power supply can be wall adapter style or a battery pack with a USB to barrel jack cable.
Tape Measure |

The mounting solution above is only an example; other methods can be used as long as the setup specifications are met.

### Software

Tool | Version | Required For | Download Link
----------------------------|---------------------------|---------------|-------------
mmWave Industrial Toolbox | Latest | Contains all lab material. | [mmWave Industrial Toolbox](http://dev.ti.com/tirex/explore/node?node=AJoMGA2ID9pCPWEKPi16wg__VLyFKFf__LATEST)
Uniflash | Latest | Quickstart Firmware | [Download offline tool](http://www.ti.com/tool/UNIFLASH) or use the [cloud version](https://dev.ti.com/uniflash/#!/)

2. Physical Setup
-----------

1. Follow the instructions for [Hardware Setup of ICB for Functional Mode](../../../common/docs/hardware_setup/hw_setup_antenna_module_and_carrier_for_functional.html)
2. For best results, the EVM should be positioned high enough to be above the top of tracked objects and with a slight down tilt. The aim is to position the EVM so that the antenna beam can encompass the area of interest. If the down tilt is too severe, noise from ground clutter will increase and the effective sensing area will decrease. If there is no down tilt, counting performance will be worse in cases where one person is in line with and shielded by another person. Given the antenna radiation pattern of the EVM, take care not to mount the EVM too close to the ceiling or with the beam directed toward it, as this can raise the noise floor and result in less optimal performance.
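As a rough aid when choosing a mounting height and down tilt, simple trigonometry shows where the antenna boresight meets the floor. The sketch below is purely illustrative and not part of the lab software (the function name is our own); it uses the recommended setup values from this section.

```python
# Illustrative trig only, not part of the lab software: estimate where the
# antenna boresight intersects the floor for a given mounting height and
# down tilt, to sanity-check the 2.0-2.5 m / 2-3 degree recommendation.
import math

def boresight_ground_distance(height_m, downtilt_deg):
    """Horizontal distance at which the boresight ray reaches the floor."""
    return height_m / math.tan(math.radians(downtilt_deg))

for h in (2.0, 2.5):
    for tilt_deg in (2.0, 3.0):
        d = boresight_ground_distance(h, tilt_deg)
        print(f"height {h} m, downtilt {tilt_deg} deg -> "
              f"boresight meets floor at {d:.0f} m")
```

With the recommended values, the boresight reaches the floor between roughly 38 m and 72 m, bracketing the lab's 50 m tracking range.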
<img src="images/downtilt.jpg" width="700"/>

**Setup Requirements:**

* Elevate EVM: 2.0-2.5m high
* Down tilt: ~2-3 degrees

**Setup using suggested tripod and smartphone clamp mount:**

1. Screw the clamp mount onto the tripod.
2. Clamp the EVM across its width, below the power barrel jack, to attach the EVM.
3. Adjust the tripod head for a ~2-3 degree down tilt (Tip: bubble level or level smartphone apps can be used to measure down tilt).
4. Plug the micro-USB cable and power supply into the EVM.
5. Extend the tripod so that the EVM is elevated 2.0-2.5m from the ground.
6. Position the EVM and tripod assembly in the desired location in the room. The EVM should be positioned so that the 120 degree FOV of the EVM antenna encompasses the area of interest and points toward the region in which people are expected to enter the space.

<img src="images/EVM_2m.png" width="300"/> <img src="images/EVM_2degree.png" width="300"/>

<a name="flash_the_evm"></a>
3. Flash the EVM
-----------

* Follow the instructions for [Hardware Setup of ICB for Flashing Mode](../../../common/docs/hardware_setup/hw_setup_antenna_module_and_carrier_for_flashing.html)
* Follow the instructions to [Flash the mmWave Device](../../../common/docs/software_setup/using_uniflash_with_mmwave.html)

Image | Location
--------------------------|------------
Meta Image 1/RadarSS | `C:\ti\<mmwave_industrial_toolbox_install_dir>\labs\long_range_people_detection\68xx_long_range_people_det\prebuilt_binaries\long_range_people_det_68xx_demo.bin`

<a name="run_gui_quickstart"></a>
4. Run the Lab
-----------

To run the lab, launch and configure the visualizer, which displays the detection and tracked object data received via UART.

### 1. Launch the visualizer:

* Navigate to `C:\ti\<mmwave_industrial_toolbox_install_dir>\labs\people_counting\68xx_people_counting\gui\mmWave_People_Counting_GUISetup.exe`
* Run `mmWave_People_Counting_GUISetup.exe`
* This will open a wizard to install the visualizer.
* Once installed, you can run the "mmWave_People_Counting_GUI".
It should take 5-10 seconds to start up.

<img src="images/visualizerStartUp.jpg" width="600"/>

<a name="configure_visualizer"></a>
### 2. Configure Visualizer
-----------

On the left side of the visualizer setup window are options and parameters for running the demo. On the right side is a 3D graph which will display the demo point cloud and tracker output. The following sections step through the setup required to run the people counting demo:

#### 1. Select COM Ports

<img src="images/connectComPorts.jpg" width="600"/>

Specify the **UART** and **DATA** COM ports using the text boxes. Only enter the number. Then select the parser type. The default is "3D People Counting". However, you can also select parsers for the following labs:

- SDK Out of Box demo
- Long Range People Detection demo
- Indoor False Detection Mitigation
- (Legacy): 2D People Counting (for both IWR16xx and IWR68xx)
- (Legacy): Overhead People Counting

Click **Connect** to open and connect to the ports. The text "Connected" will appear after a successful connection to both the UART and DATA COM ports. If the visualizer cannot connect to either COM port, it will display "Unable to Connect". If you selected the wrong parser, simply change the selection and click "Connect" again.

[[g! COM Status
Message should update to show that the COM ports have been connected before continuing.
]]

#### 2. Statistics

<img src="images/statistics.jpg" width="600"/>

This section displays statistics related to demo performance.

* Frame - current frame number as reported by the device
* Average Plot Time - time to draw the plot in ms
* Points - number of points detected this frame (more points may be drawn based on the Persistent Frames option)
* Targets - number of tracked people or other objects in the scene

#### 3. Chirp Configuration

<img src="images/selectConfiguration.jpg" width="600"/>

{{y There is a new chirp configuration to enable detection of people at more than 100 meters.
Please see the document "Beamforming_in_LRPD.pdf" in the docs folder of this lab. All chirp configurations run with the same binary and visualizer; however, you will have to restart the device to change configurations.}}

You will need to select a chirp configuration to send to the device.

* To load a custom config: select the **Select Configuration** option and then choose the desired '.cfg' file. The plot will update with a red cube depicting the valid tracking area based on the SceneryParams or boundaryParams in the chirp config.
* There are multiple chirp configurations included with the lab. Please choose the one that fits your use case from the list below.
* Once a chirp is selected, the table will populate with values for Range Resolution, Max Range, Velocity Resolution, and Max Velocity.
* There are 3 chirps included with this lab:
    * people_detection_and_tracking_50m_2D.cfg - standard 2D chirp for detection up to 50 meters
    * people_detection_and_tracking_50m_3D.cfg - standard 3D chirp for detection up to 50 meters
    * people_detection_and_tracking_100m_2D_advanced.cfg - implements transmitter beamforming to increase detection range. This chirp makes tradeoffs to enable the longer range:
        * Lower Update Rate - 2.5 frames per second compared to 10 FPS for the other chirps
        * Lower FOV - it can only see from about -15 degrees to +30 degrees (45 degree FOV). This is a limitation of the current SDK.
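The Range Resolution and Max Range values that populate the table follow from standard FMCW radar relations. The sketch below is purely illustrative and not part of the lab software; it applies those relations to the 2D 50 m chirp parameters documented in this guide. The values the visualizer reports may differ slightly, since the device applies its own rounding and margin factors.

```python
# Illustrative only: standard FMCW relations behind the Range Resolution and
# Max Range values the visualizer reports. Not part of the lab software.
C = 3e8  # speed of light, m/s

def range_resolution_m(bandwidth_hz):
    """Range resolution of an FMCW chirp: c / (2 * sweep bandwidth)."""
    return C / (2.0 * bandwidth_hz)

def max_range_m(sampling_rate_sps, slope_hz_per_s):
    """The maximum beat frequency the ADC can capture maps to max range:
    (Fs * c) / (2 * slope)."""
    return (sampling_rate_sps * C) / (2.0 * slope_hz_per_s)

# Parameters of the 2D 50 m chirp from this lab's chirp parameter table
bandwidth_hz = 0.300e9      # 300 MHz sweep bandwidth
fs_sps = 3.4330e6           # 3.4330 Msps ADC sampling rate
slope_hz_per_s = 8.241e12   # 8.241 MHz/us

print(range_resolution_m(bandwidth_hz))     # ~0.50 m (table lists 0.49 m)
print(max_range_m(fs_sps, slope_hz_per_s))  # ~62.5 m (table lists 60 m)
```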
<a name="chirp_fov"></a>

Angle of Arrival (degrees) | 2D Approaching (m) | 3D Approaching (m) | 2D Departing (m) | 3D Departing (m) | Tx Beamforming Approaching | Tx Beamforming Departing
---------------------------|--------------------|--------------------|------------------|------------------|----------------------------|--------------------------
0  | 49 | 46 | 53 | 55 | >100 | >100
15 | 49 | 47 | 50 | 56 | >100 | >100
30 | 55 | 44 | 57 | 50 | >100 | >100
45 | 56 | 42 | 56 | 42 | NA | NA
60 | 34 | 28 | 45 | 34 | NA | NA

[[+d Expand for details of included chirps:

### Default Chirp Parameters

2D 50 m, 3D 50 m, and 2D 100 m.

Chirp Parameter (Units)  | 2D 50 m | 3D 50 m | Tx Beamforming 100 m
-------------------------|------------|----------|---------
Start Frequency (GHz) | 60.0 | 60.0 | 61.0
Slope (MHz/us) | 8.241 | 8.241 | 1.8
Samples per chirp | 125 | 125 | 256
Chirps per frame | 256 | 288 | 512
Frame duration (ms) | 100 | 100 | 400
Sampling rate (Msps) | 3.4330 | 3.4330 | 2000
Bandwidth (GHz) | 0.300 | 0.300 | 0.243
Range resolution (m) | 0.49 | 0.49 | 0.65
Max Unambiguous Range (m) | 60 | 60 | 150.0
Max Radial Velocity (m/s) | 7.8806 | 7.8806 | 2.119
Velocity resolution (m/s) | 0.1250 | 0.1250 | 0.022
Azimuth resolution (deg) | 14.5 | 14.5 | 29
Number of Rx | 4 | 4 | 4
Number of Tx | 2 | 3 | 3
+]]

#### 4. Plot Controls

<img src="images/plotControls.jpg" width="600"/>

The box labelled *Plot Controls* has 4 options:

* **Plot Point Color By Index** - points will be the same color as the track they are associated with. If a point is not associated with a track, it will be white.
* **Plot Point Color By Height** - points will be colored based on the Z value of the point. This is mutually exclusive with "Plot Point Color By Index".
* **Plot Tracks** - when this is on, boxes will be drawn at the location of tracked people in the scene. When this is off, only point cloud output will be visible. This does not disable the tracker.
* **Number of Persistent Frames** - this controls the number of frames plotted at a time. When set to a value n, points from the last n frames will be plotted. This does not affect demo performance; use it to make the visualization easier to understand. The default is 3.

#### 5. Boundary Boxes

<img src="images/BoundaryBox.jpg" width="600"/>

These define the valid tracking area. Up to two boundary boxes can be set. People can only be tracked while they are inside the boxes. Each box has 6 parameters, defined as if standing behind the EVM, facing the same direction as the antenna:

* Left X - left side of boundary box
* Right X - right side of boundary box
* Near Y - close boundary parallel with the EVM
* Far Y - far boundary parallel with the EVM
* Bottom Z - bottom boundary
* Top Z - upper boundary

All modifications must be made before clicking **Send Configuration**.

#### 6. Sensor Position

<img src="images/sensorPosition.jpg" width="600"/>

These currently are not functional with the Long Range People Detection Lab.

* **Launch Visualizer**
* Click **Send Configuration** to configure the device and start tracking.

5. Understanding the Output
-----------

The visualizer consists of:

* A grid made of light-grey lines, representing the floor
* 1-2 red boxes representing the boundary boxes
* Various colored spheres representing radar detection points. Coloration depends on the options discussed in Plot Controls.
* Various colored boxes representing tracked people. Color is based on the Tracker ID.
* Each track will have 3 digits next to it, representing (X, Y, Z) coordinates.

<img src="images/guiOutput.jpg" width="1000"/>

Developer's Guide
===========

Build the Firmware from Source Code
-----------

<a name='reqs'></a>
### 1.
Software Requirements

Tool | Version | Download Link
----------------------------|---------------------------|--------------
mmWave Industrial Toolbox | Latest | [mmWave Industrial Toolbox](http://dev.ti.com/tirex/explore/node?node=AJoMGA2ID9pCPWEKPi16wg__VLyFKFf__LATEST)
TI mmWave SDK | Latest | [TI mmWave SDK](http://software-dl.ti.com/ra-processors/esd/MMWAVE-SDK/latest/index_FDS.html) and all the related tools are required to be installed as specified in the mmWave SDK release notes
Code Composer Studio | 8.1.0 | [Code Composer Studio v8](http://processors.wiki.ti.com/index.php/Download_CCS#Code_Composer_Studio_Version_8_Downloads)
TI SYS/BIOS | 6.73.01.01 | Included in mmWave SDK installer
TI ARM Compiler | 16.9.6.LTS | Included in mmWave SDK installer
TI CGT Compiler | 7.4.16 | Version 7.4.16 must be downloaded and installed. [Download link](https://www.ti.com/licreg/docs/swlicexportcontrol.tsp?form_type=2&prod_no=ti_cgt_c6000_7.4.16_windows_installer.exe&ref_url=http://software-dl.ti.com/codegen/esd/cgt_registered_sw/C6000/7.4.16)
XDC | 3.50.08.24 | Included in mmWave SDK installer
C64x+ DSPLIB | 3.4.0.0 | Included in mmWave SDK installer
C674x DSPLIB | 3.4.0.0 | Included in mmWave SDK installer
C674x MATHLIB (little-endian, elf/coff format) | 3.1.2.1 | Included in mmWave SDK installer
mmWave Radar Device Support Package | 1.6.1 or later | Upgrade to the latest using the CCS update process (see SDK user guide for more details)
TI Emulators Package | 7.0.188.0 or later | Upgrade to the latest using the CCS update process (see SDK user guide for more details)
Uniflash | Latest | Uniflash tool is used for flashing TI mmWave Radar devices. [Download offline tool](http://www.ti.com/tool/UNIFLASH) or use the [Cloud version](https://dev.ti.com/uniflash/#!/)
Python 3 64 bit (Visualizer Only) | 3.6.x 64 bit | [Python 3.6 64 Bit](https://www.python.org/ftp/python/3.6.0/python-3.6.0-amd64.exe)

### 2.
Import Lab Project

This lab has two projects that need to be imported to CCS and compiled to generate the firmware for the xWR6843: the DSS project for the C674x DSP core and the MSS project for the R4F core.

[[b! Project Workspace
When importing projects to a workspace, a copy is created in the workspace. All modifications will only be implemented for the workspace copy. The original project downloaded in mmWave Industrial Toolbox is not touched.
]]

1. Start CCS and set up the workspace as desired.
2. Import the project(s) specified below to CCS. See instructions for importing [here](../../../../docs/readme.html#import-ccs-projects-from-the-mmwave-industrial-toolbox-into-code-composer-studio).
    * **long_range_people_det_68xx_mss**
    * **long_range_people_det_68xx_dss**
3. Verify that the import occurred without error: in CCS Project Explorer, both **long_range_people_det_68xx_mss** and **long_range_people_det_68xx_dss** should appear.

### 3. Build the Lab

The DSS project must be built before the MSS project.

1. Select **long_range_people_det_68xx_dss** so it is highlighted. Right click on the project and select **Rebuild Project**. The DSS project will build.
2. Select **long_range_people_det_68xx_mss** so it is highlighted. Right click on the project and select **Rebuild Project**. The MSS project will build, and then the lab binary will be constructed automatically.
3. On successful build, the following should appear:
    * In long_range_people_det_68xx_dss &rarr; Debug, **long_range_people_det_dss.xe674** (this is the C674x binary used for CCS debug mode)
    * In long_range_people_det_68xx_mss &rarr; Debug, **long_range_people_det_mss.xer4f** (this is the Cortex R4F binary used for CCS debug mode) and **long_range_people_det_lab.bin** (this is the flashable binary used for deployment mode)

{{y Selecting Rebuild instead of Build ensures that the project is always re-compiled. This is especially important in case the previous build failed with errors.}}

[[r!
Build Fails with Errors
If the build fails with errors, please ensure that all the software requirements are installed as listed above and in the mmWave SDK release notes.
]]

[[b! Note
As mentioned in the [Quickstart](#quickstart) section, pre-built binary files, both debug and deployment binaries, are provided in the pre-compiled directory of the lab.
]]

### 4. Execute the Lab

There are two ways to execute the compiled code on the EVM:

* Deployment mode: the EVM boots autonomously from flash and starts running the bin image.
    * Using Uniflash, flash the **long_range_people_det_68xx_demo.bin** found at `<PROJECT_WORKSPACE_DIR>\long_range_people_det_68xx_mss\Debug\long_range_people_det_68xx_demo.bin`
    * The same flashing procedure can be used as detailed in the Quickstart [Flash the Device](#flash_the_evm) section.
* Debug mode: follow the instructions for [Using CCS Debug for Development](../../../common/docs/software_setup/using_ccs_debug.html)

After executing the lab using either method, the output can be visualized using the [Quick Start GUI](#run_gui_quickstart), or continue on to work with the [GUI Source Code](#getting-started-with-gui-source-files).

<a name="getting-started-with-gui-source-files"></a>
Visualizer Source Code
-----------

{{y Working with and running the Visualizer source files requires a Python 3.6 64 bit installation. You can run with a 32 bit installation, but the behavior will be undefined.}}

<h3>Installing Python</h3>

Please download the Python installer from the link [above](#reqs). Once downloaded, follow the steps below:

1. Install for all users if the option is available.
2. In optional features, make sure pip and the .py launcher are selected for install.
<img src="images/pythonOptionalFeatures.jpg" width="600"/>
3. In Advanced Options, ensure "Add Python to environment variables" is selected.
<img src="images/pythonAdvancedOptions.jpg" width="600"/>

If Python is already installed, you can run the installer again to modify the installation.
<img src="images/pythonModify.jpg" width="600"/>

The detection processing chain and group tracking algorithm are implemented in the firmware. The visualizer serves to read the UART stream from the device and then plot the detected points and tracked objects. Source files are located at `C:\ti\mmwave_industrial_toolbox_<VER>\labs\people_counting\68xx_people_counting\gui`.

* **gui_main.py** - the main program, which controls placement of all items in the GUI and schedules UART read and graphing tasks.
* **oob_parser.py** - defines an object used for parsing the UART stream. If you want to parse the UART data yourself, you can use this file to do so. The API is defined at the top of the file.
* **gui_threads.py** - defines the different threads that are run by the demo. These threads handle updating the plot and calling the UART parser.
* **graphUtilities.py** - contains functions used to draw objects.

You will need to set up your Python environment. The following libraries are used by the visualizer:

* PyQt5
* pyqtgraph
* pyopengl
* numpy
* pyserial

You can use the script setupEnvironment.bat to install all of these using pip. If you are on the TI network, please use setupEnvironmentTI.bat. Otherwise, you can install these libraries manually using pip or another method. Once everything has been installed, you can run the visualizer by calling "python gui_main.py" on the command line.

Data Formats
-----------

A TLV (type-length-value) encoding scheme is used with little endian byte order. For every frame, a packet is sent consisting of a fixed size **Frame Header** followed by a variable number of TLVs, depending on what was detected in that scene. The TLVs can be of types representing the point cloud, target list object, and associated points.

<img src="images/packet_structure.png" width="600"/>

### Frame Header

Size: 52 bytes

```Matlab
frameHeaderStructType = struct(...
'magicWord', {'uint64', 8}, ... % syncPattern in hex is: '02 01 04 03 06 05 08 07'
'version', {'uint32', 4}, ...
% Software Version
'platform', {'uint32', 4}, ... % A6843
'timeStamp', {'uint32', 4}, ... % Message create time in cycles
'totalPacketLen', {'uint32', 4}, ... % In bytes, including header
'frameNumber', {'uint32', 4}, ... % Frame Number
'subFrameNumber', {'uint32', 4}, ... % Sub-Frame number
'chirpProcessingMargin', {'uint32', 4}, ... % time left after chirp processing in cycles
'frameProcessingMargin', {'uint32', 4}, ... % time left after frame processing in cycles
'trackingProcessingTime', {'uint32', 4}, ... % time to run tracker
'uartSendingTime', {'uint32', 4}, ... % time to send uart message
'numTLVs' , {'uint16', 2}, ... % Number of TLVs in this frame
'checksum', {'uint16', 2}); % Checksum
```

**Frame Header Structure in MATLAB syntax for name, type, length**

### TLVs

The TLVs can be of type **DPIF_PointCloudSpherical**, **DPIF_PointCloudSideInfo**, **TARGET_LIST_3D**, or **TARGET_INDEX**.

#### **TLV Header**

Size: 8 bytes

```Matlab
% TLV Type: 06 = DPIF Point cloud spherical, 07 = Target object list, 08 = Target index, 09 = DPIF Point Cloud Side Info
tlvHeaderStruct = struct(...
'type', {'uint32', 4}, ... % TLV object type
'length', {'uint32', 4}); % TLV object length, in bytes, including TLV header
```

**TLV header in MATLAB syntax**

Following the header is the TLV-type-specific payload.

#### **Point Cloud TLV**

Type: DPIF_POINT_CLOUD_SPHERICAL

Size: sizeof(DPIF_PointCloudSpherical) x numberOfPoints

<img src="images/tlv_pointcloud_structure.png" width="250"/>

Each Point Cloud TLV consists of an array of points. Each point is defined in 16 bytes.

```Matlab
DPIF_PointCloudSpherical = struct(...
'range', {'float', 4}, ... % Range, in m
'azimuth', {'float', 4}, ... % Azimuth angle, in rad
'elevation', {'float', 4}, ... % Elevation angle, in rad
'doppler', {'float', 4});
% Doppler, in m/s
```

**Point Structure in MATLAB syntax**

#### **Point Cloud Side Info TLV**

Type: DPIF_POINT_CLOUD_SIDE_INFO

Size: sizeof(DPIF_PointCloudSideInfo) x numberOfPoints

<img src="images/tlv_pointcloud_structure.png" width="250"/>

Each Point Cloud Side Info TLV consists of an array of point side info data. Each is 8 bytes.

```Matlab
DPIF_PointCloudSideInfo = struct(...
'snr', {'int16_t', 2}, ... % SNR, ratio
'noise', {'int16_t', 2}); % Noise
```

**Point Side Info Structure in MATLAB syntax**

#### **Target Object TLV**

Type: TARGET_LIST_3D

Size: sizeof(targetStruct3D) x numberOfTargets

<img src="images/tlv_target_structure.png" width="250"/>

Each Target List TLV consists of an array of targets. Each target is defined in 68 bytes.

```Matlab
targetStruct3D = struct(...
'tid', {'uint32', 4}, ... % Track ID
'posX', {'float', 4}, ... % Target position in X dimension, m
'posY', {'float', 4}, ... % Target position in Y dimension, m
'velX', {'float', 4}, ... % Target velocity in X dimension, m/s
'velY', {'float', 4}, ... % Target velocity in Y dimension, m/s
'accX', {'float', 4}, ... % Target acceleration in X dimension, m/s2
'accY', {'float', 4}, ... % Target acceleration in Y dimension, m/s2
'posZ', {'float', 4}, ... % Target position in Z dimension, m
'velZ', {'float', 4}, ... % Target velocity in Z dimension, m/s
'accZ', {'float', 4}); % Target acceleration in Z dimension, m/s2
```

**Target Structure in MATLAB syntax**

#### **Target Index TLV**

Type: TARGET_INDEX

Size: numberOfPoints

<img src="images/tlv_targetid_structure.png" width="250"/>

Each Target Index TLV consists of an array of target IDs. A targetID at index ***i*** is the target to which point ***i*** of the frame's point cloud was associated. Valid IDs range from 0-249.

```Matlab
targetIndex = struct(...
'targetID', {'uint8', 1}); % Track ID
```

**Target ID Structure in MATLAB syntax**

Other Target ID values:

Value | Meaning
------------|-----------
253 | Point not associated, SNR too weak
254 | Point not associated, located outside boundary of interest
255 | Point not associated, considered as noise

Customization
-----------

* Please refer to the **People Counting Demo Customization Guide**, which can be found at `C:\ti\<mmwave_industrial_toolbox_install_dir>\labs\long_range_people_detection\68xx_long_range_people_det\docs\pplcount_customization_guide.pdf`

Need More Help?
===========

* Find answers to common questions on the <a href="https://e2e.ti.com/support/sensor/mmwave_sensors/w/wiki" target="_blank">mmWave E2E FAQ</a>
* Search for your issue or post a new question on the <a href="https://e2e.ti.com/support/sensor/mmwave_sensors/f/1023" target="_blank">mmWave E2E forum</a>
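Appendix: for developers writing a custom host-side parser, the frame header and TLV structures in the Data Formats section can be consumed with Python's built-in `struct` module. The sketch below is an illustration of the documented packet layout, not the lab's official parser (that is oob_parser.py); the function and variable names are our own.

```python
# Hypothetical host-side sketch of parsing the UART packet format described
# in the Data Formats section. The lab's official parser is oob_parser.py.
import struct

# 52-byte frame header: magicWord (uint64), ten uint32 fields, two uint16 fields
FRAME_HEADER_FMT = '<Q10IHH'             # little endian, per the spec
FRAME_HEADER_LEN = struct.calcsize(FRAME_HEADER_FMT)  # 52 bytes
TLV_HEADER_FMT = '<II'                   # type, length (length includes the 8-byte TLV header)

def parse_frame(buf):
    """Split one frame packet into its frame number and (type, payload) TLVs."""
    (magic, version, platform, timestamp, total_len, frame_num, subframe,
     chirp_margin, frame_margin, track_time, uart_time,
     num_tlvs, checksum) = struct.unpack_from(FRAME_HEADER_FMT, buf, 0)
    assert magic == 0x0708050603040102   # sync pattern 02 01 04 03 06 05 08 07
    tlvs = []
    offset = FRAME_HEADER_LEN
    for _ in range(num_tlvs):
        tlv_type, tlv_len = struct.unpack_from(TLV_HEADER_FMT, buf, offset)
        tlvs.append((tlv_type, buf[offset + 8 : offset + tlv_len]))
        offset += tlv_len                # TLV length includes its own header
    return frame_num, tlvs
```

Each returned payload can then be decoded against the matching structure above (e.g. 16 bytes per point for the spherical point cloud TLV).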