SYSTEMS ANALYSIS, CONTROL AND INFORMATION PROCESSING, STATISTICS
УДК 629.052.3 DOI https://doi.org/10.38161/1996-3440-2024-2-63-70
He Binggao, Fan Caitian, Mu Xinbei, Wang Rui
MOBILE ROBOT TRACKING SYSTEM BASED ON MACHINE VISION AND LASER RADAR
He Binggao - PhD, Associate Professor, College of Electronic and Information Engineering, Changchun University, e-mail: [email protected] (China); Fan Caitian - College of Electronic and Information Engineering, Changchun University, e-mail: [email protected] (China); Mu Xinbei - College of Electronic and Information Engineering, Changchun University, e-mail: [email protected] (China); Wang Rui - College of Electronic and Information Engineering, Changchun University, e-mail: [email protected] (China)
The proposed solution addresses the problem of insufficient real-time performance and accuracy in mobile robot path tracking by introducing a system that combines machine vision and laser radar. In this study, the Broadcom BCM2711 microcontroller chip is connected to the RS232 communication interface for transmitting information to the ARM embedded processor. Users can access position, distance, direction, and other robot-related data through the man-machine interface's LCD display in a Windows operating system environment. Mobile position tracking of the robot is achieved by running an adaptive position tracking algorithm program in the position tracking unit. Experimental results demonstrate significant improvements in both the real-time performance and the accuracy of this mobile robot tracking system.
Keywords: laser radar, mobile robot, tracking system
Introduction
In recent years, mobile robotics has gradually become a comprehensive discipline and has developed rapidly. By combining the latest results from computer science, electronics, and mechanical engineering [1], it has become one of the highest achievements in the field of mechatronic integration. Mobile robot technology is widely used in many fields, such as industry and agriculture, and has good application prospects. At the same time, it has become an important topic of attention and research in the robotics community.
However, during operation the robot is affected by uncontrollable factors that can make it deviate from the planned route. To avoid such situations, many scholars have designed robot position tracking systems, such as the
© He Binggao, Fan Caitian, Mu Xinbei, Wang Rui, 2024
high-precision laser tracking system designed by Li Guicun et al. [2], which uses a ray-tracing method to establish a geometric optical model of the robot position and describes the current position of the robot in simulation. However, the tracking range and tracking speed of this system are not ideal, so the application effect is poor. Chen Hongfang et al. [3] designed a dual-wavelength air refractive index compensation tracking system, which establishes a robot energy model based on the changing characteristics of polarized light. After using the model to output laser energy aggregation data, robot position tracking is realized by clustering and tracking these data. However, the tracking result of this system is not accurate enough because of interference signals. In this paper, lidar is introduced into the robot position tracking process, and a mobile robot position tracking system based on lidar and vision technology is designed to improve the technical level of robot position tracking.
To address the problems identified in the above studies, a lidar-based mobile robot tracking system is designed. The results of this paper provide new ideas and methods for applying lidar tracking technology in robotics and promote the development of intelligent and autonomous robots in complex environments. It is hoped that this research can serve as a useful reference for the future development and application of robot technology.
System Design
The mobile robot mechanism comprises various types of mobile mechanisms, such as crawler [4], hybrid, snake, wheel, and others. Different scenarios require different mobile mechanisms. This paper primarily focuses on the wheeled moving mechanism. The structure and shape of the wheels are determined by the vehicle's bearing load and the ground properties. The wheeled mobile robot exhibits stable movement, a simple mechanism and control, and high energy utilization. Fig. 1 illustrates the main components of the mobile robot.
Fig. 1. Mobile robot main components
The mobile robot consists of a control system, a perception system, and a motion system. The car body serves as the motion system [5], equipped with batteries, DC motors, etc., while the wheels perform the steering and driving functions. The perception system includes a lidar sensor array and a camera. The control system is composed of a man-machine interface module together with controller software and hardware circuits; its physical diagram is depicted in Fig. 2.
Fig. 2. Mobile robot
Lidar tracking algorithm
When the robot obtains environmental distance information through the lidar sensor, a laser adjacent-point clustering algorithm is used to cluster this information into laser scanning segmentation points [5]. Adjacent scanning points belonging to the same object are gathered into clusters of continuous segmentation points, from which the contour type of each cluster can be recognized, enabling position tracking based on the robot's environmental distance information. The corresponding formulas are given below:
$$L_k = f_1 + f_2,\qquad(1)$$
$$f_1 = \sum_{i=2}^{t}\sqrt{\left(H_k[i+1]\cdot x - H_k[i]\cdot x\right)^2},\qquad(2)$$
$$f_2 = \sum_{i=2}^{t}\sqrt{\left(H_k[i+1]\cdot y - H_k[i]\cdot y\right)^2},\qquad(3)$$
$$S_k = \sqrt{\left(H_k[t-1]\cdot x - H_k[0]\cdot x\right)^2} + \sqrt{\left(H_k[t-1]\cdot y - H_k[0]\cdot y\right)^2}.\qquad(4)$$
The total length along the laser segmentation point cluster and the straight-line length between the head and tail points of the cluster are denoted as $L_k$ and $S_k$, respectively. The horizontal and vertical axes are represented by $x$ and $y$, while $i$ indexes the laser points within the cluster. The curvature of the laser point cluster is denoted as Curve, and its calculation formula is given below:
$$\mathrm{Curve} = \frac{L_k}{S_k}.\qquad(5)$$
Formula (5) is used to determine whether the curvature value exceeds 1; if so, the laser segmentation point cluster $H_k$ is added to the set of laser point clusters. Subsequently, the position $P_{cent}=(x_c, y_c)$ of each laser point cluster's centre point is calculated, and its distance from the robot is obtained using the following expression:
$$L_{cent} = \sqrt{x_c^2} + \sqrt{y_c^2}.\qquad(6)$$
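To make formulas (1)-(6) concrete, below is a minimal Python sketch of the clustering and distance computation. The function names, the adjacency gap threshold, and the use of the arithmetic mean as the cluster centre are illustrative assumptions rather than details taken from the paper; note that $\sqrt{(\cdot)^2}$ reduces to an absolute value.

```python
import math

def cluster_adjacent_points(points, gap=0.15):
    """Group consecutive laser scan points (x, y), ordered by scan angle,
    into clusters H_k; `gap` (metres) is an illustrative adjacency threshold."""
    if not points:
        return []
    clusters, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if math.hypot(cur[0] - prev[0], cur[1] - prev[1]) <= gap:
            current.append(cur)
        else:
            clusters.append(current)
            current = [cur]
    clusters.append(current)
    return clusters

def cluster_lengths(Hk):
    """Formulas (1)-(4): L_k sums the axis-wise steps along the cluster,
    S_k is the head-to-tail span; sqrt(a^2) is written as abs(a)."""
    f1 = sum(abs(Hk[i + 1][0] - Hk[i][0]) for i in range(len(Hk) - 1))
    f2 = sum(abs(Hk[i + 1][1] - Hk[i][1]) for i in range(len(Hk) - 1))
    Lk = f1 + f2
    Sk = abs(Hk[-1][0] - Hk[0][0]) + abs(Hk[-1][1] - Hk[0][1])
    return Lk, Sk

def select_and_locate(clusters, curve_threshold=1.0):
    """Formulas (5)-(6): keep clusters whose curvature L_k / S_k exceeds 1,
    return each retained cluster's centre P_cent = (x_c, y_c) together with
    its distance L_cent from the lidar origin."""
    targets = []
    for Hk in clusters:
        Lk, Sk = cluster_lengths(Hk)
        if Sk == 0 or Lk / Sk <= curve_threshold:
            continue
        xc = sum(p[0] for p in Hk) / len(Hk)  # assumed: centre = mean of points
        yc = sum(p[1] for p in Hk) / len(Hk)
        targets.append(((xc, yc), abs(xc) + abs(yc)))  # L_cent per formula (6)
    return targets
```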
$Q_x$ and $Q_y$ represent the velocity of the robot in the horizontal and vertical directions, respectively; they are calculated as follows:
$$\begin{cases}Q_x = \omega_1 x_c - \omega_1 x_0,\\ Q_y = \omega_2 y_c - \omega_2 y_0,\end{cases}\qquad(7)$$
where $x_c$ and $y_c$ denote the horizontal and vertical coordinates of the laser point clustering centre, $x_0$ and $y_0$ denote the horizontal and vertical coordinates of the robot's tracked position, and $\omega_1$ and $\omega_2$ are speed control parameters. Based on the results of formulas (3)-(7), after substituting the angle between the target point coordinates and the robot body coordinates into formula (6), the distance between the cluster centre point and the robot is obtained, which serves as the robot position tracking result.
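As a companion to the text above, here is a hedged sketch of the speed command in formula (7). The default values of the speed control parameters and the extra heading-angle helper are assumptions added for illustration, not values from the paper.

```python
import math

def velocity_command(xc, yc, x0, y0, w1=0.8, w2=0.8):
    """Formula (7): Q_x = w1*x_c - w1*x_0 and Q_y = w2*y_c - w2*y_0,
    where w1, w2 stand in for the speed control parameters."""
    Qx = w1 * xc - w1 * x0
    Qy = w2 * yc - w2 * y0
    # Orientation of the target relative to the robot body; an illustrative
    # helper for the angle mentioned in the text, not taken from the paper.
    heading = math.atan2(yc - y0, xc - x0)
    return Qx, Qy, heading
```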
Program flow
The mobile position tracking system transmits the data measured by the laser radar sensor to a program within the position tracking module responsible for mobile position tracking [8]. The program preprocesses the range data collected by the laser radar sensor before feeding it into the system's position tracking unit. After clustering the adjacent points detected during the laser scan, the unit obtains the segmented points; a rough sketch of this flow is given after Fig. 3. Fig. 3 shows the functional framework of the system software.
Fig. 3. Software function framework of tracking system
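Under the same assumptions, the data flow of the position tracking unit could be glued together roughly as follows, reusing the helper functions sketched earlier; this illustrates the described flow rather than the authors' implementation.

```python
def tracking_step(scan_points, robot_pose):
    """One pass of the position tracking unit: cluster the lidar scan,
    keep qualifying clusters, and follow the nearest cluster centre.
    `scan_points` are Cartesian lidar returns; `robot_pose` is (x0, y0)."""
    clusters = cluster_adjacent_points(scan_points)
    targets = select_and_locate(clusters)
    if not targets:
        return 0.0, 0.0, 0.0  # nothing to track: stop the robot
    (xc, yc), _ = min(targets, key=lambda t: t[1])  # nearest target cluster
    return velocity_command(xc, yc, robot_pose[0], robot_pose[1])
```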
Experiment and conclusion
Radar scan tracking test
First, a comprehensive map is created in Eyesim, two robots are placed on it, the large car is driven along an arc path, and the small car follows it. The simulation image in Fig. 4 visualizes their trajectory; on the right side, the laser scanning feedback shows the tracked car as a black section, because its corner faces the radar and produces sharp scanning angles. In Fig. 5, a flat-angle scan is conducted, which yields a rectangular feedback that is easier to recognize. Additionally, three lines are added to indicate the orientation angle between the target and the lidar for distance calculation. The results demonstrate that the system can accurately track robot positions and is practically applicable.
Fig. 4. Radar scan tracking simulation
Fig. 5. Radar scan tracking simulation
Environmental monitoring effect test
A map is also constructed in Eyesim so that another vehicle can track the big car. Although the image resolution is only 320x240 pixels, it proves sufficient for users to discern the surrounding environment and perform subsequent operations. The simulation image in Fig. 6 demonstrates that the system effectively monitors the real-time environmental conditions ahead of the robot and is practical in use.
Fig. 6. Real-time monitoring and tracking simulation diagram
Conclusion
The designed robot position tracking system utilizes laser radar and vision technology. The experimental results demonstrate that the mobile robot path automatic tracking system presented in this paper not only enables real-time monitoring of the surrounding environment but also achieves accurate automatic tracking of the robot's path. Moreover, the experiments reveal that the system exhibits strong sensitivity and real-time tracking capabilities when applied, resulting in more precise robot positioning. Despite achieving these research outcomes, there are still some areas for improvement, such as reducing redundancy in lidar sensor-obtained laser point cloud data and generating a three-dimensional map of the robot's positional environment. Future researchers should focus on addressing these aspects.
Acknowledgement
This work is supported by the project of Jilin Provincial Education Department (JJKH20230671KJ), the Project of Jilin Provincial Development and Reform Commission (2023C042-4), and the project of Jilin Provincial Department of Human Resources and Social Security (2023RY17).
References
1. High precision positioning algorithm based on two-dimensional code vision and laser radar fusion / Luan Jianing, Zhang Wei, Sun Wei et al. // Computer Applications. 2021. № 41. P. 1484-1491.
2. High precision laser tracking system based on two-dimensional galvanometer and position sensitive detector / Li Guicun, Fang Ya, Ji Rong Yi et al. // Chinese Laser. 2019. № 46. P. 206-212.
3. ZEMAX simulation method of laser tracking system based on double wavelength method compensating air refractive index / Chen Hongfang, Tang Liang, Shi Zhaoyao et al. // Chinese Laser. 2019. № 46. P. 232-239.
4. Method of laser radar target tracking and location based on machine vision / Jiang Wenjuan, Wang Gaoping, Shao Kaili et al. // Laser Journal. 2023. № 44. P. 218-224.
5. Detection and Tracking of Moving Objects Using a Roadside LiDAR System / M. D'Arco, L. Fratelli, G. Graber et al. // IEEE Instrumentation & Measurement Magazine. 2024. № 27. P. 49-56.
6. Positioning and perception in LIDAR point clouds / Tamas Sziranyi, Saba Benedek, Andras Majdik, Balazs Nagy et al. // Digital Signal Processing. 2021. № 119. P. 1051-2004.
7. Costa F. A. L., Mitishita E. A., Martins M. The Influence of Sub-Block Position on Performing Integrated Sensor Orientation Using In Situ Camera Calibration and Lidar Control Points // Remote Sens. 2019. № 10.
8. Calibrating Range Measurements of Lidars Using Fixed Landmarks in Unknown Positions / Alhashimi A., Magnusson M., Knorn S. et al. // Sensors. 2021. № 21. P. 155163.
Title: Mobile robot tracking system based on machine vision and laser radar
Authors:
He Binggao - Changchun University (China); Fan Caitian - Changchun University (China); Mu Xinbei - Changchun University (China); Wang Rui - Changchun University (China)
Abstract: A solution is proposed to the problem of insufficient real-time performance and accuracy of mobile robot path tracking by introducing a system that combines machine vision and laser radar. In this study, the Broadcom BCM2711 microcontroller is connected to the RS232 communication interface to transmit information to an embedded ARM processor. Users can access positioning, control, and robot operation data through the LCD display of the man-machine interface in a Windows operating system environment. An adaptive algorithm program makes it possible to track the robot's position. Experimental results show significant improvements in both the real-time performance and the accuracy of this mobile robot tracking system.
Keywords: laser radar, mobile robot, tracking system