
Real-Time Visual Target Detection and Tracking Via Unmanned Ground Vehicle

EasyChair Preprint no. 8735

6 pages
Date: August 29, 2022

Abstract

The field of vision-based autonomous robots is rapidly gaining popularity with the Artificial Intelligence revolution. The proposed system is a vision-based target-tracking Unmanned Ground Vehicle (UGV) robot prototype. Target tracking is useful in multiple real-life applications, such as the assistance and security fields. The acquired visual input is processed using the OpenCV library. The object detection stage of the UGV system is based on a pre-built deep learning object detection model, YOLOv3-tiny, which has a low computational cost compared to the original YOLO models and other complex deep learning networks. The object to track is restricted to humans only. The target tracking algorithm is based on a sequence of mathematical equations involving Region of Interest (ROI) and stream-frame coefficients, where the coefficients refer to location values along the x-axis and y-axis of the frame. A simple mathematical technique is used to handle the delayed-feedback issue. The locomotion of the UGV is driven by commands transmitted from the algorithm to the motors over a local network connection, and an ultrasound technique is used for collision avoidance. The results show autonomous behavior with streamlined and accurate tracking locomotion.
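The tracking step described in the abstract — comparing the detected target's bounding-box position against the frame's x/y coordinates and translating the offset into motor commands — can be sketched roughly as below. This is a minimal illustration, not the paper's actual equations: the function names, threshold values, and command strings are all hypothetical, and a real deployment would feed in boxes produced by a YOLOv3-tiny detector (e.g. via OpenCV's DNN module) rather than hand-written tuples.

```python
def steering_offsets(box, frame_w, frame_h):
    """Normalized (dx, dy) of the box centre relative to the frame centre.

    box = (x, y, w, h) in pixels.  Both offsets lie in [-1, 1]:
    dx < 0 means the target is left of centre, dx > 0 right of centre;
    dy < 0 means above centre, dy > 0 below centre.
    """
    x, y, w, h = box
    cx = x + w / 2.0
    cy = y + h / 2.0
    dx = (cx - frame_w / 2.0) / (frame_w / 2.0)
    dy = (cy - frame_h / 2.0) / (frame_h / 2.0)
    return dx, dy


def motor_command(dx, area_ratio, turn_dead=0.15, near=0.35):
    """Map horizontal offset and apparent target size to a command string.

    area_ratio = box area / frame area, used as a crude distance proxy;
    the thresholds here are illustrative, not from the paper.
    """
    if area_ratio >= near:
        return "stop"        # target close enough
    if dx < -turn_dead:
        return "left"        # target drifted left -> turn left
    if dx > turn_dead:
        return "right"       # target drifted right -> turn right
    return "forward"         # roughly centred -> advance
```

In the paper's architecture, the returned command would then be sent to the motor controller over the local network connection; the string-based commands above merely stand in for that protocol.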

Keyphrases: autonomous robot, COCO dataset, human following, object detection, object tracking, obstacle avoidance, target tracking, UGV, YOLOv3

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:8735,
  author = {Nour Ammar and Ali Okatan},
  title = {Real-Time Visual Target Detection and Tracking Via Unmanned Ground Vehicle},
  howpublished = {EasyChair Preprint no. 8735},
  year = {EasyChair, 2022}}