

PAPERS IN ENGLISH

COMPUTER SCIENCE, COMPUTER ENGINEERING AND MANAGEMENT

DOI - 10.32743/UniTech.2024.127.10.18344

CONTROLLER FOR INDOOR 4-WHEEL CAMERA-CONTROLLED OMNI ROBOT

Le Ba Chung
PhD, lecturer, Le Quy Don Technical University,
Viet Nam, Ha Noi
E-mail: chungbaumanvietnam@gmail.com

Pham Thanh Hung
Student,
Le Quy Don Technical University,
Viet Nam, Ha Noi
E-mail: [email protected]

Nguyen Van Thanh
Student,
Le Quy Don Technical University,
Viet Nam, Ha Noi
E-mail: thanhpocollo@gmail.com

КОНТРОЛЛЕР ДЛЯ КОМНАТНОГО 4-КОЛЕСНОГО ОМНИ-РОБОТА, УПРАВЛЯЕМОГО КАМЕРОЙ

Ле Ба Чунг

канд. техн. наук, преподаватель, Технический Университет имени Ле Куи Дона,

Вьетнам, Ханой

Фам Тхань Хынг

студент,

Технический Университет имени Ле Куи Дона,

Вьетнам, Ханой

Нгуен Ван Тхань

студент,

Технический Университет имени Ле Куи Дона,

Вьетнам, Ханой

ABSTRACT

The aim of this article is to develop design solutions and build a control system for a transportation robot that uses omni wheels and is guided by a camera. The robot design incorporates four omni wheels, enabling smooth and flexible movement in multiple directions on flat surfaces. The camera navigation solution combines the YOLOv8 object detection algorithm with the ByteTrack object tracking algorithm to build an autonomous system that allows the robot to follow a person. The test results show that the robot follows the person relatively accurately, moves without jerking, and does not lose the object it is following even when the object moves quickly. This configuration makes the robot well suited for indoor environments such as restaurants, hospitals, and similar settings.

Библиографическое описание: Le B.C., Pham T.H., Nguyen V.T. CONTROLLER FOR INDOOR 4-WHEEL CAMERA-CONTROLLED OMNI ROBOT // Universum: технические науки : электрон. научн. журн. 2024. 10(127). URL: https://7universum.com/ru/tech/archive/item/18344


АННОТАЦИЯ

Целью данной статьи является разработка решений и создание системы управления для транспортного робота, использующего омни-колёса и управляемого камерой. В конструкцию робота включены четыре омни-колеса, которые обеспечивают плавное и гибкое движение в различных направлениях по ровным поверхностям. Навигация с помощью камеры представляет собой комбинацию алгоритма обнаружения объектов YOLOv8 с алгоритмом сопровождения объектов ByteTrack для создания автономной системы, позволяющей роботу следовать за человеком. Результаты испытаний показывают, что робот следует за человеком относительно точно, движется без рывков и не теряет объект, за которым он следует, даже если объект движется быстро. Эта конфигурация делает робота подходящим для использования в помещениях, таких как рестораны, больницы и аналогичные места.

Keywords: 4-wheel omni robot, object detection and tracking, YOLOv8, ByteTrack, ROS.

Ключевые слова: 4-х колесный омни-робот, обнаружение и отслеживание объектов, YOLOv8, ByteTrack, ROS.

1. Introduction

Indoor transportation robots are increasingly used in many fields thanks to their ability to optimize processes and reduce labor costs. Some applications of transportation robots in residential, hospital, and warehouse environments include: logistics (robots carry items such as clothing, necessities, and other goods from one room to another); support for the disabled and the elderly (robots can deliver drinks, medicine, or daily necessities to those in need, promoting greater independence); transfer of medical supplies (robots quickly transport medicine, surgical instruments, or medical records between different areas of a hospital); and reconnaissance and patrol (robots can be equipped with sensors and cameras to perform reconnaissance and patrol missions, providing security information for the factory and residential areas they are responsible for).

Numerous organizations and individuals worldwide have conducted research on indoor transportation robots using advanced image processing technologies [1-6]. In Vietnam, many universities and research groups have made significant contributions to this field and published a variety of research papers. For example, a group of scientists from the Faculty of Aerospace Engineering at Le Quy Don Technical University has researched and successfully manufactured a medical transportation robot system for infectious disease isolation areas [7-8]; a group of scientists from Hanoi University of Science and Technology has successfully researched and manufactured an autonomous robot for transportation in factories [9]; and a research group from Phenikaa University has successfully designed and manufactured a transportation robot using renewable energy sources [10].

Despite the positive outcomes achieved, such as the ability to self-propel and avoid dynamic and static obstacles on an available map, the ability to detect, identify, and track objects, and the ability to move flexibly and smoothly, several issues still need to be addressed. The stability and reliability of the entire system remain low: the robot often loses its position, which leads to low reliability when it is put into operation in practice. The underlying cause of these limitations is that, at present, navigation systems based on sensors such as IMUs, lidars, and encoders are not yet stable and reliable, so control systems built on this approach do not operate stably or efficiently.

In addition, camera-based guidance has considerable potential for development, for example in improving stability, continuity, efficiency, and responsiveness, so that it can mature into practical applications on robots [11].

Based on the evaluation of both domestic and international research, the authors propose a research direction focused on developing a controller for an omni-wheeled transportation robot guided by a camera. The aim is to verify and assess the effectiveness of object detection and tracking algorithms on an actual omni-wheeled mobile robot.

Based on the evaluation and analysis of the advantages and disadvantages of the transportation robot design in previous studies, as well as considering the goal of enhancing the robot's mobility, the authors have established the following design criteria:

• The 4-wheel omni robot can operate smoothly in indoor environments under normal temperature and humidity conditions.

• The real environment surface is not too rough and not slippery.

• The maximum moving speed is equivalent to an indoor walking speed of 0.6 m/s.

• Weight: 10 kg; load capacity: 15 kg.

• Expected dimensions: 50 × 50 × 100 cm.

2. Kinematic model of 4-wheel omni robot

The robot model is a 4-wheel omni type with the wheels arranged at an angle of 90° to each other, as shown in Figure 1.

Figure 1. Model of omni 4-wheel autonomous robot

Let $X = [x \;\; y \;\; \theta]^T$ be the pose vector of the robot in the global coordinate system, and let $\dot{X}_R = [\dot{x}_R \;\; \dot{y}_R \;\; \dot{\theta}_R]^T$ be the velocity vector of the robot in the coordinate system attached to the robot center P.

Considering the geometric relationship in Fig. 2, we can express the relationship between the velocity in the local coordinate system and the velocity in the global coordinate system:

$$\dot{X}_R = R(\theta)\,\dot{X}$$

With rotation matrix:

$$R(\theta) = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (1)$$

Figure 2. Global and local coordinate systems of the robot
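As a quick illustration (not part of the original paper), the following minimal Python/NumPy sketch applies equation (1) to convert a global-frame velocity into the robot frame; the heading and velocity values are arbitrary example numbers.

```python
import numpy as np

def rotation_matrix(theta: float) -> np.ndarray:
    """R(theta) from equation (1): maps global-frame velocities into the robot frame."""
    return np.array([
        [ np.cos(theta), np.sin(theta), 0.0],
        [-np.sin(theta), np.cos(theta), 0.0],
        [ 0.0,           0.0,           1.0],
    ])

# Example: robot heading 45 degrees, global velocity [0.3 m/s, 0.1 m/s, 0.2 rad/s]
x_dot_global = np.array([0.3, 0.1, 0.2])
x_dot_robot = rotation_matrix(np.deg2rad(45.0)) @ x_dot_global  # equation (1)
```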

The general kinematic model of an omni wheel is shown in Figure 3. The rotation center of the wheel is located at A. The position of A in the robot's local coordinate system is described in polar coordinates by the distance $l$ from A to P and the deviation angle $\alpha$; the rotation axis of the wheel makes an angle $\beta$ with PA. The wheel motion is represented by the rotation angle $\varphi(t)$ over time.

Here, define some parameters as follows:

- $\varphi$ – rotation angle of the wheel around its axis;

- $r$ – radius of the wheel;

- $\gamma$ – deviation angle between the wheel plane and the rotation axis of the rollers;

- $l$ – distance from the wheel center to the robot rotation center.

Figure 3. General kinematic model for an omni wheel

The rolling constraint of the wheel is represented by the equation:

$$\left[\sin(\alpha+\beta+\gamma) \;\; -\cos(\alpha+\beta+\gamma) \;\; -l\cos(\beta+\gamma)\right] R(\theta)\,\dot{X} - r\,\dot{\varphi}\cos\gamma = 0 \qquad (2)$$

The no-slip constraint of the wheel is expressed as follows:

$$\left[\cos(\alpha+\beta+\gamma) \;\; \sin(\alpha+\beta+\gamma) \;\; l\sin(\beta+\gamma)\right] R(\theta)\,\dot{X} - r\,\dot{\varphi}\sin\gamma - r_{r}\,\dot{\varphi}_{r} = 0 \qquad (3)$$

where $r_{r}$ and $\dot{\varphi}_{r}$ are the radius and angular velocity of the wheel rollers.

With the omni-wheel configuration selected for the robot model, the specific angle $\gamma = 0$, so the wheel motion and the roller motion are independent of each other.

As a result, the non-slip kinematic constraint equations do not contribute to the construction of the robot's kinematic model. Therefore, the four constraint equations for the 4 wheels are obtained as follows:

$$\left[\sin(\alpha_i+\beta_i+\gamma_i) \;\; -\cos(\alpha_i+\beta_i+\gamma_i) \;\; -l_i\cos(\beta_i+\gamma_i)\right] R(\theta)\,\dot{X} - r\,\dot{\varphi}_i\cos\gamma_i = 0, \qquad i = 1, 2, 3, 4 \qquad (4)$$

where:

- $\alpha_i$, $\beta_i$, and $\gamma_i$ (rad) are the mounting angles of the four omni wheels;

- $l_i$ (m) is the distance from the robot center P to wheel $i$;

- $r$ (m) is the radius of each wheel;

- $\dot{\varphi}_i$ (rad/s) is the angular velocity of each wheel.

By substituting equation (1) into equation (4) and performing a series of transformation steps, the resulting equation is obtained as follows:

$$\begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} = \frac{1}{r} \begin{bmatrix} \dfrac{\sin(\alpha_1+\beta_1+\gamma_1)}{\cos\gamma_1} & \dfrac{-\cos(\alpha_1+\beta_1+\gamma_1)}{\cos\gamma_1} & \dfrac{-l_1\cos(\beta_1+\gamma_1)}{\cos\gamma_1} \\[4pt] \dfrac{\sin(\alpha_2+\beta_2+\gamma_2)}{\cos\gamma_2} & \dfrac{-\cos(\alpha_2+\beta_2+\gamma_2)}{\cos\gamma_2} & \dfrac{-l_2\cos(\beta_2+\gamma_2)}{\cos\gamma_2} \\[4pt] \dfrac{\sin(\alpha_3+\beta_3+\gamma_3)}{\cos\gamma_3} & \dfrac{-\cos(\alpha_3+\beta_3+\gamma_3)}{\cos\gamma_3} & \dfrac{-l_3\cos(\beta_3+\gamma_3)}{\cos\gamma_3} \\[4pt] \dfrac{\sin(\alpha_4+\beta_4+\gamma_4)}{\cos\gamma_4} & \dfrac{-\cos(\alpha_4+\beta_4+\gamma_4)}{\cos\gamma_4} & \dfrac{-l_4\cos(\beta_4+\gamma_4)}{\cos\gamma_4} \end{bmatrix} \begin{bmatrix} \dot{x}_R \\ \dot{y}_R \\ \dot{\theta}_R \end{bmatrix} \qquad (5)$$

where $\omega_i = \dot{\varphi}_i$ denotes the angular velocity of wheel $i$.

The Jacobian matrix for the inverse kinematic equation of the system is represented by the equation:

$$J = \begin{bmatrix} \dfrac{\sin(\alpha_1+\beta_1+\gamma_1)}{\cos\gamma_1} & \dfrac{-\cos(\alpha_1+\beta_1+\gamma_1)}{\cos\gamma_1} & \dfrac{-l_1\cos(\beta_1+\gamma_1)}{\cos\gamma_1} \\[4pt] \dfrac{\sin(\alpha_2+\beta_2+\gamma_2)}{\cos\gamma_2} & \dfrac{-\cos(\alpha_2+\beta_2+\gamma_2)}{\cos\gamma_2} & \dfrac{-l_2\cos(\beta_2+\gamma_2)}{\cos\gamma_2} \\[4pt] \dfrac{\sin(\alpha_3+\beta_3+\gamma_3)}{\cos\gamma_3} & \dfrac{-\cos(\alpha_3+\beta_3+\gamma_3)}{\cos\gamma_3} & \dfrac{-l_3\cos(\beta_3+\gamma_3)}{\cos\gamma_3} \\[4pt] \dfrac{\sin(\alpha_4+\beta_4+\gamma_4)}{\cos\gamma_4} & \dfrac{-\cos(\alpha_4+\beta_4+\gamma_4)}{\cos\gamma_4} & \dfrac{-l_4\cos(\beta_4+\gamma_4)}{\cos\gamma_4} \end{bmatrix} \qquad (6)$$

The forward kinematic equation is represented as follows:

$$\dot{X}_R = \begin{bmatrix} \dot{x}_R \\ \dot{y}_R \\ \dot{\theta}_R \end{bmatrix} = r\, J^{+} \begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} \qquad (7)$$

where $J^{+}$ is the pseudo-inverse of $J$. From these relations we can obtain the forward and inverse kinematics of the mobile robot using 4 omni wheels as follows:

$$\begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} = \frac{1}{r} \begin{bmatrix} \dfrac{\sqrt{2}}{2} & -\dfrac{\sqrt{2}}{2} & -l_1 \\[4pt] \dfrac{\sqrt{2}}{2} & \dfrac{\sqrt{2}}{2} & -l_2 \\[4pt] -\dfrac{\sqrt{2}}{2} & \dfrac{\sqrt{2}}{2} & -l_3 \\[4pt] -\dfrac{\sqrt{2}}{2} & -\dfrac{\sqrt{2}}{2} & -l_4 \end{bmatrix} \begin{bmatrix} \dot{x}_R \\ \dot{y}_R \\ \dot{\theta}_R \end{bmatrix} \qquad (8)$$

$$\begin{bmatrix} \dot{x}_R \\ \dot{y}_R \\ \dot{\theta}_R \end{bmatrix} = \frac{r}{4} \begin{bmatrix} \sqrt{2} & \sqrt{2} & -\sqrt{2} & -\sqrt{2} \\ -\sqrt{2} & \sqrt{2} & \sqrt{2} & -\sqrt{2} \\ -1/l_1 & -1/l_2 & -1/l_3 & -1/l_4 \end{bmatrix} \begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} \qquad (9)$$

These equations are used later for the design and synthesis of the controller for the 4-wheel omni robot.
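To make equations (8) and (9) concrete, the following minimal Python sketch computes the wheel angular velocities for a desired robot-frame velocity and recovers the robot velocity from the wheel speeds; the wheel radius and the distances $l_i$ are assumed placeholder values, not parameters taken from the paper.

```python
import numpy as np

R_WHEEL = 0.05            # assumed wheel radius, m
L = np.array([0.2] * 4)   # assumed distances l_i from robot center to each wheel, m
S = np.sqrt(2) / 2

# Inverse kinematic matrix from equation (8): robot-frame velocity -> wheel angular velocities
J_INV = np.array([
    [ S, -S, -L[0]],
    [ S,  S, -L[1]],
    [-S,  S, -L[2]],
    [-S, -S, -L[3]],
])

def wheel_speeds(x_dot: float, y_dot: float, theta_dot: float) -> np.ndarray:
    """Equation (8): wheel angular velocities [rad/s] for a desired robot-frame velocity."""
    return (1.0 / R_WHEEL) * J_INV @ np.array([x_dot, y_dot, theta_dot])

def robot_velocity(omega: np.ndarray) -> np.ndarray:
    """Equation (9): robot-frame velocity recovered from the four wheel speeds via the pseudo-inverse."""
    return R_WHEEL * np.linalg.pinv(J_INV) @ omega

# Example: move forward at 0.3 m/s while turning at 0.1 rad/s
omega = wheel_speeds(0.3, 0.0, 0.1)
```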

3. Design and synthesis of controller

3.1. Structure diagram of system hardware

Based on the design requirements and the selected robot model, the authors proceeded to calculate, select components, manufacture, assemble and obtain the model as shown in Figure 4.

Figure 4. Model of mobile robot using 4 omni wheels

The drive part of the robot model is powered by four DC motors with encoder sensors, which receive voltage from BTS7960 motor driver modules. These driver modules are controlled by an Arduino Mega 2560 board. The block diagram of the robot hardware control system is illustrated in Figure 5. Image processing algorithms are implemented on a personal computer, where the output of the image processing program is the deviation of the object center from the image center in the two directions X and Y. From these deviation values, the software calculates the set speed for each motor. These set velocity values are then transferred to the Arduino Mega 2560 board via ROS, which generates the control pulses for the four motors.

Figure 5. Connection diagram of hardware system on robot (a 24 V battery powers the system; the Arduino Mega 2560 drives four BTS7960 DC driver modules, each connected to a DC motor)

Based on the robot hardware connection diagram, the robot control system can be divided into 2 layers:

• Lower Layer Control System: This layer includes the Arduino Mega 2560 controller and the BTS7960 motor drivers;

• Upper Layer Control System: This layer consists of a laptop connected to a camera, responsible for implementing image processing algorithms, calculating the desired velocity for the four wheels, and transmitting this information to the Arduino board using the communication environment provided by ROS.

The two control layers of the hardware system are physically arranged across three levels of the robot model, as illustrated in Figure 4. This structure serves as a foundation for the authors to further research and develop a controller for the robot model.

The use of Arduino boards to generate pulses for controlling actuators has been widely researched and published by many organizations and individuals. Therefore, in this article the authors focus on presenting the results of implementing the image processing algorithms and synthesizing the controller on the upper layer.

3.2. Algorithm flowchart

To enable the robot to follow humans in various tasks such as transportation and assisting the disabled, three primary tasks must be addressed:

• Object detection task;

• Object tracking task;

• Calculate the set velocity for 4 wheels.

For object detection, several advanced techniques are in use, such as YOLO, SSD, and Faster R-CNN. However, when applied to the task of controlling a robot to move and follow a person, the authors found the YOLO algorithm to be a suitable choice due to its real-time processing ability.

In terms of object tracking, various modern techniques are employed, such as DeepSORT, StrongSORT, and ByteTrack. After careful research and evaluation, the authors selected the ByteTrack algorithm for deployment in the controller, as it demonstrates superior speed and a greater ability to handle real-life scenarios compared to other object tracking algorithms.

Based on the above analysis and selection of object detection and tracking algorithms, we have an algorithm flowchart for the entire control system as shown in Figure 6.

Starting the algorithm, the image captured from the camera is sent to the image processing module. There, by applying the YOLOv8 object detection algorithm and the ByteTrack object tracking algorithm, the position of the object center in the frame is obtained. The deviation calculation module then computes the deviation of the object center from the frame center. These two modules run in the ROS2 environment to ensure real-time performance. The deviation values in the two directions X and Y are then sent to the ROS environment via ros1_bridge. From there, the "Calculate the wheel set velocity" module calculates the set velocities for the 4 DC motors from the deviation of the object center relative to the frame center and the inverse kinematic equation. The calculated set velocities are sent down to the Arduino Mega 2560 board via rosserial. The motor control module uses a PID algorithm to output a signal that controls the motor velocity according to the set velocity. The algorithm runs in a loop until an exit command is received.
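A minimal sketch of this detection-and-tracking step, assuming the Ultralytics YOLOv8 Python API with its built-in ByteTrack tracker; the weights file, camera index, and target ID are placeholders rather than the authors' exact configuration.

```python
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # assumed model weights; the paper does not state the variant used
TARGET_ID = 1                # ID assigned by ByteTrack to the person being followed

cap = cv2.VideoCapture(0)    # assumed camera index
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Detect people (COCO class 0) and keep IDs across frames with the built-in ByteTrack tracker
    results = model.track(frame, classes=[0], tracker="bytetrack.yaml", persist=True, verbose=False)
    boxes = results[0].boxes
    if boxes.id is None:
        continue
    for box, track_id in zip(boxes.xyxy.cpu().numpy(), boxes.id.int().cpu().tolist()):
        if track_id == TARGET_ID:
            x1, y1, x2, y2 = box
            cx = (x1 + x2) / 2.0   # object center (x), used for x_error
            y_bottom = y2          # lower bounding-box edge, used for y_error
            # cx and y_bottom feed the deviation calculation described below
```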

Figure 6. Algorithm flowchart (image processing with YOLOv8 and ByteTrack → deviation calculation (x_error, y_error) → velocity calculation → PID control for motors, looped until an exit command)

Thus, by combining the two algorithms above, the program calculates the error of the tracked object center relative to the fixed coordinate system of the frame. The error along the X axis (x_error) is the offset between the tracked object center and the frame center (which coincides with the robot's middle axis); it is used to compute the robot's deviation angle from the tracked object. The error along the Y axis (y_error) is the offset between the lower edge of the tracked object's bounding box and the horizontal line corresponding to the pixel value y = 475 [11]. The y_error is used to estimate the relative distance between the robot and the object and serves to calculate the translational velocity of the robot. This value can be adjusted so that the robot maintains a suitable distance from the tracked object.
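A minimal sketch of this deviation calculation; the frame width is an assumed value, while the y = 475 reference line follows the text.

```python
FRAME_WIDTH = 640   # assumed camera frame width in pixels
Y_REFERENCE = 475   # horizontal reference line from the text (pixel row)

def compute_errors(cx, y_bottom):
    """x_error: offset of the tracked object's center from the frame center (robot middle axis).
    y_error: offset of the object's lower bounding-box edge from the y = 475 line,
    used as a proxy for the robot-to-object distance."""
    x_error = cx - FRAME_WIDTH / 2.0
    y_error = Y_REFERENCE - y_bottom
    return x_error, y_error
```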

When ROS receives the error values from ROS2, the program starts calculating the angular velocities for the wheels from the two error parameters. First, we calculate the linear velocity and angular velocity of the robot according to the formula:

$$v = \dot{x}_R = k_y \cdot y\_error, \qquad \omega = \dot{\theta}_R = k_x \cdot x\_error, \qquad \dot{y}_R = 0 \qquad (10)$$

where $k_x$ and $k_y$ are gain coefficients that set the robot velocity limits; they were selected experimentally on the robot model to suit the task requirements. Substituting (10) into equation (8), we obtain the set velocities for the 4 wheels as functions of the deviations:

$$\begin{aligned} \omega_1 &= \frac{1}{r}\left( \frac{\sqrt{2}}{2}\, k_y \cdot y\_error - l_1\, k_x \cdot x\_error \right) \\ \omega_2 &= \frac{1}{r}\left( \frac{\sqrt{2}}{2}\, k_y \cdot y\_error - l_2\, k_x \cdot x\_error \right) \\ \omega_3 &= \frac{1}{r}\left( -\frac{\sqrt{2}}{2}\, k_y \cdot y\_error - l_3\, k_x \cdot x\_error \right) \\ \omega_4 &= \frac{1}{r}\left( -\frac{\sqrt{2}}{2}\, k_y \cdot y\_error - l_4\, k_x \cdot x\_error \right) \end{aligned} \qquad (11)$$

The set velocity for the 4 motors is sent down to the Arduino Mega 2560 via the rosserial package to control the actuator motors to move the robot to follow the specified object.
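Combining equations (10) and (11), the "Calculate the wheel set velocity" step could be sketched as follows; the gains, wheel radius, and geometry are assumed placeholder values, and the final transfer over rosserial is only indicated in a comment.

```python
import numpy as np

K_X = 0.002      # assumed gain on x_error (rad/s per pixel)
K_Y = 0.004      # assumed gain on y_error (m/s per pixel)
R_WHEEL = 0.05   # assumed wheel radius, m
L = [0.2, 0.2, 0.2, 0.2]   # assumed distances l_i, m
S = np.sqrt(2) / 2

def wheel_setpoints(x_error, y_error):
    """Equations (10)-(11): map the image-plane deviations to the four wheel angular velocities."""
    v = K_Y * y_error        # forward velocity of the robot, equation (10)
    w = K_X * x_error        # angular velocity of the robot, equation (10)
    signs = [S, S, -S, -S]   # x_R column of the inverse kinematic matrix in equation (8)
    return [(s * v - l * w) / R_WHEEL for s, l in zip(signs, L)]

# The resulting setpoints would then be published to the Arduino over rosserial,
# e.g. as a std_msgs/Float32MultiArray on a wheel-speed topic (topic name hypothetical).
```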

4. Experimental results

The image processing program is implemented using Python 3.8, incorporating the YOLOv8 module and the ByteTrack algorithm within a ROS2 environment on a personal computer. The following results were achieved:

Figure 7. Result of object recognition and ID assignment for an object to be tracked

Based on Figure 7, it is evident that the YOLOv8 algorithm has successfully determined the center of the object (indicated by the red dot) and the ByteTrack algorithm has assigned ID = 1 to the object that the robot wants to follow.

When multiple objects appear in the frame, the program generates the result illustrated in Figure 8.

Figure 8. Result of recognition and ID assignment for multiple objects

As shown in Figure 8, when two objects appear in a frame, the program detects them and assigns each object a different ID (ID = 1 and ID = 2).

Figure 9. Robot moves to follow the object

The experimental results of the robot tracking the object are presented in Figure 9. The tracking result is satisfactory when the object moves at a speed of less than 0.6 m/s. Along the Y axis the robot maintains a relatively good distance; however, along the X axis the tracking angle is not yet good, and there is still some oscillation (shaking) around the main axis. This is caused by the limited computing resources of the personal computer, PID motor controller parameters that have not been optimally tuned, and the transmission speed between the laptop and the Arduino. The authors will address these shortcomings in subsequent studies.

Experimentally, when the object is hidden from the camera's view for less than 4 s, the proposed algorithm gives positive results: the robot correctly re-detects the object's ID and continues to follow it. This is necessary for real-world object tracking tasks.

5. Conclusion

The integration of mechanical design solutions utilizing four omni-wheels and advanced image processing technology represents an effective approach to developing transportation robots for use in warehouses, factories, restaurants, hospitals, and similar environments. This combination yields positive test results, effectively mitigating some of the limitations observed in previously researched and developed robots.

Experimental findings demonstrate that the robot operates smoothly, without shaking, and can successfully track the object even in the presence of noise, such as when the object temporarily disappears from the frame or becomes obscured.

The research outcomes presented in this paper have significant implications for the development of transportation robots in logistics, as well as for medical robots aimed at assisting individuals with disabilities and aiding medical staff in the transportation of medications and essential supplies.

References:

1. Object Detection and Tracking in Cooperative Multi-robot Transportation. Zoran Miljkovic, Milica Petrovic, Lazar Bokic. 38th International Conference on Production Engineering of Serbia - ICPE-S 2021. P. 137-143. URL: https://www.researchgate.net/publication/356810511_Object_Detection_and_Tracking_in_Cooperative_Multi-robot_Transportation (дата обращения: 10.5.2024).

2. Design and implementation of a computer Vision-based Object Tracking Robot using an Uncalibrated Camera. Nitish Nitu, Kalyaanjee Barman, Amit Kumar. Research square. 21 Pages. DOI: 10.21203/rs.3.rs-4269108/v1. URL: https://www.researchgate.net/publication/380024644_Design_and_implementation_of_a_computer_Vision-based_Object_Tracking_Robot_using_an_Uncalibrated_Camera (дата обращения: 30.5.2024).

3. Onboard Dynamic-Object Detection and Tracking for Autonomous Robot Navigation with RGB-D Camera. Zhefan Xu, Xiaoyang Zhan, Yumeng Xiu, Christopher Suzuki, Kenji Shimada. IEEE Robotics and Automation Letters. January 2023. DOI: 10.1109/LRA.2023.3334683. P. 99-107. URL: https://www.researchgate.net/publication/375777312_Onboard_dynamic-object_detection_and_tracking_for_autonomous_robot_navigation_with_RGB-D_camera (дата обращения: 18.5.2024).

4. Human follower robot. Sambhaji Patil. International Journal of Scientific Research in Engineering and Management. April 2024. P. 8-18. DOI: 10.55041/IJSREM30976. URL: https://www.researchgate.net/publication/379904482_HUMAN_FOLLOWER_ROBOT (дата обращения: 10.5.2024).

5. A human tracking mobile-robot with face detection. Suzuki Satoru, Yasue Mitsukura, Hironori Takimoto, Takanari Tanabata, Nobutaka Kimura, Toshio Moriya. Conference: Industrial Electronics, 2009. IECON '09. 35th Annual Conference of IEEE. December 2009. P. 4127-4222. URL: https://www.researchgate.net/publication/224115757_A_human_tracking_mobile-robot_with_face_detection (дата обращения: 23.6.2024).

6. Robust Autonomous Car-Like Robot Tracking Based on Tracking-Learning-Detection. Lin Bao Xu, Shu Ming Tang, Jin Feng Yang, Yan Min Dong. Applied Mechanics and Materials. November 2014. DOI: 10.4028/www.scientific.net/AMM.687-691.564. P. 564-571. URL: https://www.researchgate.net/publication/286789381_Robust_Autonomous_Car-Like_Robot_Tracking_Based_on_Tracking-Learning-Detection (дата обращения: 16.6.2024).

7. Development of an Autonomous Mobile Robot System for Hospital Logistics in Quarantine Zones. Tang Quoc Nam, Hoang Van Tien, Nguyen Anh Van va Nguyen Dinh Quan. ICISN 2023. Vol. 1. № 2. P. 57-72.

8. An indoor localization method for mobile robot using ceiling mounted apriltags. Van Tien Hoang, Quoc Nam Tang, Xuan Tung Truong, Dinh Quan Nguyen. Journal of Science and Technique - ISSN 1859-0209. P. 143-159. https://doi.org/10.56651/lqdtu.jst.v17.n05.531.

9. Design of autonomous robots for transportation in factories. Vu Xuan Thang. Electronic library of Hanoi University of Science and Technology. 2019. Vol. 3. №1. Issue 7. P. 147-161.

10. Building a transportation robot using renewable energy sources. Pham Trung Nam, Nguyen Huu Thuy, Nguyen Thi Ngoc. Electronic Library of Phenikaa University. 2023. Vol. 1. №4. Issue 9. P. 1123-1140.

11. Real-time object detection and tracking for mobile robot using YOLOv8 and StrongSORT. Le Ba Chung, Nguyen Duc Duy. Universum: технические науки. 2023. №11-6 (116). P. 36-44. URL: https://cyberleninka.ru/article/n/real-time-object-detection-and-tracking-for-mobile-robot-using-yolov8-and-strong-sort (дата обращения: 9.12.2023).
