Defect detection is a crucial part of quality assurance in manufacturing. This demo uses the AM62A to run a vision-based artificial intelligence model for defect detection in manufacturing applications. The model inspects produced units as they move along a conveyor belt and classifies them as accepted or defective.
An object tracker was developed for this demo to provide accurate coordinates of the units for sorting and filtering. A live video is displayed on the screen: accepted (good) units are marked with green boxes, while defective units are marked with boxes in different shades of red to distinguish the types of defects. The screen also includes a graphical dashboard showing live statistics: total units produced, defect percentage, production rate, and a histogram of defect types.
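For illustration, below is a minimal sketch of the overlay logic described above using OpenCV. The class names, color choices, and the detection format (label, score, bounding box) are assumptions made for this example and do not mirror the repository's post-processing code exactly.

```python
import cv2

# Hypothetical color map: green for accepted units, shades of red per defect type (BGR).
COLORS = {
    "good":    (0, 255, 0),
    "scratch": (0, 0, 255),
    "dent":    (0, 64, 200),
}

def draw_overlay(frame, detections, stats):
    """Draw per-unit boxes and a simple statistics dashboard on the frame.

    `detections` is assumed to be a list of (label, score, (x1, y1, x2, y2)) tuples.
    `stats` is assumed to hold running counters: total units, defect count, units/min.
    """
    for label, score, (x1, y1, x2, y2) in detections:
        color = COLORS.get(label, (0, 0, 255))
        cv2.rectangle(frame, (x1, y1), (x2, y2), color, 2)
        cv2.putText(frame, f"{label} {score:.2f}", (x1, y1 - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1)

    # Dashboard line: total units, defect percentage, production rate.
    total = stats["total"]
    pct = 100.0 * stats["defects"] / total if total else 0.0
    cv2.putText(frame,
                f"Total: {total}  Defects: {pct:.1f}%  Rate: {stats['rate']:.1f}/min",
                (10, 25), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 2)
    return frame
```

In the actual application this kind of overlay is drawn on each frame of the live camera stream, with the tracker supplying stable per-unit coordinates across frames.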
The code base and details on how to run the demo are available on the TI marketplace as a GitHub repository: https://github.com/TexasInstruments/edgeai-gst-apps-defect-detection.
The steps to run the demo and reproduce it are provided in the README file of the repository, linked in the resources table below.
The AM62A SoC is equipped with a deep learning accelerator (C7x-MMA) delivering up to 2 TOPS at 1 GHz, and the deep learning inference is offloaded to this accelerator. The defect detection application runs at 30 FPS, a rate limited by the camera used in the application, which does not exceed 30 FPS. At this rate, only about 22% of the C7x-MMA is utilized, so the frame rate could scale beyond 130 FPS with a faster camera. Such a frame rate could not be achieved using the ARM cores alone.
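The 130+ FPS figure follows from simple scaling of the measured utilization. This is a rough estimate, assuming inference throughput scales roughly linearly with C7x-MMA utilization and no other bottlenecks:

```python
measured_fps = 30        # frame rate observed with the 30 FPS camera
dla_utilization = 0.22   # fraction of the C7x-MMA used at that rate

# If the accelerator were fully utilized, the achievable frame rate would be roughly:
estimated_max_fps = measured_fps / dla_utilization
print(f"~{estimated_max_fps:.0f} FPS")  # ~136 FPS, consistent with the 130+ FPS claim
```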
The video shows a side-by-side screen recording of the defect detection application with inference executed on the C7x-MMA DLA (left side of the video) and on the ARM Cortex-A53 cores (right side of the video).
This comparison shows that ARM cores alone are not sufficient for machine vision applications, such as defect detection, that require a high frame rate, and that the C7x-MMA DLA is necessary to deliver the required performance.
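As an illustration of how inference is dispatched to the C7x-MMA rather than the A53 cores, the sketch below uses the TFLite runtime with TI's TIDL delegate as shipped in the Processor SDK. The model path, artifacts folder, delegate library name, and option key are assumptions for this example; refer to the repository's application code and SDK documentation for the exact configuration.

```python
import numpy as np
import tflite_runtime.interpreter as tflite

MODEL = "model.tflite"    # hypothetical model path for illustration
ARTIFACTS = "artifacts/"  # compiled TIDL artifacts for this model (assumed location)

# Offloaded path: load the TIDL delegate so supported ops run on the C7x-MMA.
tidl_delegate = tflite.load_delegate(
    "libtidl_tfl_delegate.so",        # delegate shipped with the TI Processor SDK (assumed name)
    {"artifacts_folder": ARTIFACTS},  # option key assumed; check the SDK documentation
)
interpreter = tflite.Interpreter(model_path=MODEL,
                                 experimental_delegates=[tidl_delegate])

# ARM-only path for comparison: omit the delegate and inference runs on the A53 cores.
# interpreter = tflite.Interpreter(model_path=MODEL)

interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
out = interpreter.get_output_details()[0]
print(interpreter.get_tensor(out["index"]).shape)
```

The side-by-side video corresponds to these two paths: the same model, with and without the accelerator delegate.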
Purpose | Link |
---|---|
Defect Detection Demo Source Code | https://github.com/TexasInstruments/edgeai-gst-apps-defect-detection/ |
Readme file with instructions to run the demo and reproduce it | https://github.com/TexasInstruments/edgeai-gst-apps-defect-detection/blob/main/README.md |
Application note: Detailed steps to reproduce the defect detection demo with performance and power analysis | https://www.ti.com/lit/an/spradc9/spradc9.pdf |
Please find the following resources related to the AM62A and TI Edge AI.
Purpose | Link |
---|---|
AM62A product page | https://www.ti.com/product/AM62A7/ |
AM62A Starter Kit EVM | https://www.ti.com/tool/SK-AM62A-LP/ |
AM62A EVM Quick Start Guide | https://dev.ti.com/tirex/explore/node?node=A__AQniYj7pI2aoPAFMxWtKDQ__am62ax-devtools__FUz-xrs__LATEST/ |
TI Edge AI Studio: Model Analyzer | https://dev.ti.com/edgeaisession/ |
TI Edge AI Studio: Model Composer | https://dev.ti.com/modelcomposer/ |
TI Edge AI Academy | https://dev.ti.com/tirex/explore/node?node=A__AN7hqv4wA0hzx.vdB9lTEw__EDGEAI-ACADEMY__ZKnFr2N__LATEST/ |
Top level github page for Edge AI | https://github.com/TexasInstruments/edgeai/ |
AM62A Datasheet (superset device) | https://www.ti.com/lit/ds/sprsp77/sprsp77.pdf |
AM62A Academy (Basic Linux Training/bringup) | https://dev.ti.com/tirex/explore/node?node=A__AB.GCF6kV.FoXARl2aj.wg__AM62A-ACADEMY__WeZ9SsL__LATEST |