People Tracking on AM62A

Demo Background

Vision-based people tracking has applications in several fields, such as retail, building automation, security, and safety. This demo uses the YOLOX-S-Lite machine learning model to detect individuals in the video stream, and the model's output feeds an open-source tracking library, Norfair, which tracks people as they move within the scene. The demo is implemented on the AM62A SoC and can be seamlessly ported to run on other AM6xA devices.
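The underlying pattern is compact: each frame's detections are converted into Norfair Detection objects, and a Norfair Tracker associates them with existing tracks to produce stable per-person IDs. The snippet below is a minimal sketch of that pattern using Norfair's public API, with synthetic boxes standing in for real YOLOX-S-Lite output; it is not the demo's actual code.

    import numpy as np
    from norfair import Detection, Tracker

    def euclidean_distance(detection, tracked_object):
        # Distance between a new detection and a track's predicted position.
        return np.linalg.norm(detection.points - tracked_object.estimate)

    # initialization_delay=0 returns tracks immediately; by default Norfair
    # waits a few frames before confirming a new track.
    tracker = Tracker(distance_function=euclidean_distance,
                      distance_threshold=30,
                      initialization_delay=0)

    # Synthetic stand-in for per-frame "person" boxes (x1, y1, x2, y2)
    # that would normally come from YOLOX-S-Lite inference.
    frames_of_boxes = [
        [(100, 100, 140, 200), (400, 120, 440, 220)],
        [(104, 102, 144, 202), (396, 118, 436, 218)],
    ]

    for boxes in frames_of_boxes:
        # Norfair tracks points; here each detection is the box centroid.
        detections = [
            Detection(points=np.array([[(x1 + x2) / 2, (y1 + y2) / 2]]))
            for (x1, y1, x2, y2) in boxes
        ]
        # Associate detections with existing tracks and get stable IDs.
        for obj in tracker.update(detections=detections):
            print(f"person id={obj.id} at {obj.estimate[0]}")

Each returned track carries a persistent ID, which is what lets a demo like this attach timers and statistics to individual people across frames.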

The demo offers live tracking of individuals in the scene, with timers indicating how long each person has spent at their current location. In addition, it features a dashboard presenting statistics such as the total number of visitors, the current occupancy, and the distribution of time individuals spent in the scene. The demo also includes a heatmap highlighting frequently visited areas. These features provide insights into human behavior that are valuable in several industries; for instance, heatmap data can inform the rearrangement of shelf layouts in retail stores to enhance the customer experience.
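As a rough illustration of how statistics like these fall out of the tracker output, the sketch below keeps a first-seen timestamp per track ID (a simplified stand-in for the demo's per-location timers) and accumulates tracked positions into a 2D array that can later be rendered as a heatmap. The frame size, function name, and return values are assumptions for illustration, not the demo's actual implementation.

    import time
    import numpy as np

    FRAME_W, FRAME_H = 1280, 720           # assumed frame size
    heatmap = np.zeros((FRAME_H, FRAME_W), dtype=np.float32)
    first_seen = {}                        # track ID -> first-seen timestamp

    def update_stats(tracked_objects):
        """tracked_objects: Norfair tracks with .id and .estimate ([[x, y]])."""
        now = time.time()
        for obj in tracked_objects:
            x, y = obj.estimate[0]
            # Simplified dwell timer: seconds since this ID first appeared
            # (the demo's timer reflects time at the current location).
            first_seen.setdefault(obj.id, now)
            dwell = now - first_seen[obj.id]
            # Heatmap: count visits at the tracked position.
            xi = int(np.clip(x, 0, FRAME_W - 1))
            yi = int(np.clip(y, 0, FRAME_H - 1))
            heatmap[yi, xi] += 1.0
        # Dashboard numbers: total visitors seen and current occupancy.
        return len(first_seen), len(tracked_objects)

Normalizing and color-mapping the accumulated array (for example, with OpenCV's applyColorMap) would then yield a heatmap overlay like the one the demo shows.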

The code base and details on how to run the demo are available on the TI marketplace as a GitHub repo: https://github.com/TexasInstruments/edgeai-gst-apps-people-tracking.

How to get started

Following are the steps to run the demo:

  1. Get an AM62A starter kit EVM
  2. Download the Edge AI Linux SDK
  3. Flash the Edge AI Linux SDK to an SD card and boot the EVM, following the quick start guide
  4. Log into the EVM through a network connection
  5. Clone the git repo for this demo onto the EVM (or copy all files to the SD card)
  6. Run the ./setup-people-tracking.sh script. This downloads the required pre-trained model artifacts and a pre-recorded test video to the EVM.
  7. Change to the apps_python directory.
  8. Run the demo using ./app_edgeai.py ../configs/people_tracking.yaml.

Additional Resources for the People Tracking Demo

Purpose | Link
People Tracking Demo Source Code | https://github.com/TexasInstruments/edgeai-gst-apps-people-tracking/
README with instructions to run and reproduce the demo | https://github.com/TexasInstruments/edgeai-gst-apps-people-tracking/blob/main/README.md

General Edge AI and AM62A resources

Please find the following resources related to the AM62A and TI Edge AI.

Purpose | Link
AM62A product page | https://www.ti.com/product/AM62A7/
AM62A Starter Kit EVM | https://www.ti.com/tool/SK-AM62A-LP/
AM62A EVM Quick Start Guide | https://dev.ti.com/tirex/explore/node?node=A__AQniYj7pI2aoPAFMxWtKDQ__am62ax-devtools__FUz-xrs__LATEST/
TI Edge AI Studio: Model Analyzer | https://dev.ti.com/edgeaisession/
TI Edge AI Studio: Model Composer | https://dev.ti.com/modelcomposer/
TI Edge AI Academy | https://dev.ti.com/tirex/explore/node?node=A__AN7hqv4wA0hzx.vdB9lTEw__EDGEAI-ACADEMY__ZKnFr2N__LATEST/
Top-level GitHub page for Edge AI | https://github.com/TexasInstruments/edgeai/
AM62A Datasheet (superset device) | https://www.ti.com/lit/ds/sprsp77/sprsp77.pdf
AM62A Academy (basic Linux training/bring-up) | https://dev.ti.com/tirex/explore/node?node=A__AB.GCF6kV.FoXARl2aj.wg__AM62A-ACADEMY__WeZ9SsL__LATEST