
UDC 004.932

D.I. Kaliyev

DEVELOPMENT OF MOBILE ROBOT FOR MONITORING AND FIRE EXPLORATION

Quadcopters find their application in many areas of people's lives nowadays. This research work summarizes the development of a mobile exploration robot with an information-reading algorithm, methods for processing image color component data, and video processing for fire detection.

Keywords: mobile robot, quadcopter, fire exploration, image processing, fire detection.

One of the newest applications of quadcopters is in fire safety, namely fire exploration. In this area, quadcopters are used to detect fires and assess their scale for prompt elimination. Quadcopters make it possible to detect smoke and forest fires in time, analyze the air condition, and determine the presence of harmful substances in the air and their concentration in order to delimit the affected area [7].

Detection of fire or flame by remote photography or video recording of controlled terrain, followed by automated analysis, is an important task. Computer systems based on digital image processing are now widely used for this purpose. This paper proposes an approach to solving the problem based on the use of a quadcopter as an unmanned aerial exploration robot and on processing the images received from the robot, which may or may not contain fire.

How does a quadcopter work? The frame provides structure and rigidity, and all other components are mounted on it. The flight controller manipulates the RPM of the individual motors in response to the user's input: if you tell it to go forward, the flight controller adjusts the RPM of the rear motors. The ESCs take the signals from the flight controller and adjust the speed of the motors. The motors and propellers provide the thrust and lift for the quadcopter. The battery powers all the components. The radio (transmitter and receiver) allows you to control the movements of the quadcopter.
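To make the flight controller's role more concrete, below is a minimal motor-mixing sketch for an X-configuration quadcopter. It is not the ArduPilot implementation; the structure, names and sign conventions are assumptions chosen only to illustrate how throttle, pitch, roll and yaw commands combine into four motor outputs.

// Illustrative motor-mixing sketch for an X-configuration quadcopter.
// Not the ArduPilot code: a simplified model of how stick commands
// are combined into four motor outputs.
#include <array>
#include <cstdio>

struct StickInput {
    double throttle; // 0..1
    double pitch;    // -1..1, positive = move forward
    double roll;     // -1..1, positive = move right
    double yaw;      // -1..1, positive = rotate clockwise (seen from above)
};

std::array<double, 4> mixMotors(const StickInput& in) {
    // Motor layout: FL front-left (CW), FR front-right (CCW),
    //               RR rear-right (CW), RL rear-left (CCW).
    std::array<double, 4> m = {
        in.throttle - in.pitch + in.roll - in.yaw,  // FL
        in.throttle - in.pitch - in.roll + in.yaw,  // FR
        in.throttle + in.pitch - in.roll - in.yaw,  // RR
        in.throttle + in.pitch + in.roll + in.yaw   // RL
    };
    for (double& v : m) {                           // clamp to the valid ESC range
        if (v < 0.0) v = 0.0;
        if (v > 1.0) v = 1.0;
    }
    return m;
}

int main() {
    // "Fly forward": the rear motors receive more thrust than the front ones.
    std::array<double, 4> m = mixMotors({0.5, 0.3, 0.0, 0.0});
    std::printf("FL=%.2f FR=%.2f RR=%.2f RL=%.2f\n", m[0], m[1], m[2], m[3]);
    return 0;
}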

Building the quadcopter. The main element of the robot is the flight controller; an ArduPilot Mega 2.8 was selected. The controller is built around the Atmel ATmega2560 microcontroller and has eight inputs and outputs for pulse-width modulated signals, three UART ports, a built-in USB-FTDI adapter, and 4 MB of flash memory for telemetry records. It includes an inertial measurement system (IMS) and a GPS receiver. The GPS module is required for horizontal positioning when flying along a given route. The module is raised 12 cm to improve the accuracy of the magnetometer measurements; in addition, GPS reception is better when the upper hemisphere of the antenna, which is also located on the module, is completely open. A Ublox Neo-M8N was used.

The built-in MS5611-01BA03 barometer is used to measure altitude, while the accelerometer, gyroscopes and magnetometer are used to determine the angular position [3].

The quadcopter body is a DJI 450 frame with a 450 mm span across its radial arms. A brushless motor with an external rotor and its controller is installed at the end of each of the four arms. The electronic speed controllers (ESCs) are LittleBee 30A units; they are small and reliable, which is especially important for automatic control. The propellers are another important part of the quadcopter, because they largely determine the speed at which the quadcopter flies, the load it can carry, and how quickly it can maneuver. To improve the quality of flight control, vibration isolation from the frame is needed (to reduce noise on the accelerometer and gyroscope), so the controller is installed on a damping platform.

To send commands from the ground and receive telemetry from the quadcopter, a radio link is necessary. A good choice is the FlySky i6 with an iA10B receiver. The 2.4 GHz version installed on the quadcopter allows flights of 800-1200 meters beyond direct line of sight. Via the telemetry modems we can upload a compiled flight mission to the quadcopter, give a command to take off or return home, change autopilot parameters and much more. The robot also reports its location, altitude, battery charge, etc. In general, the communication range is not critical, because on long-distance flights the quadcopter moves independently according to the recorded mission [3].

The fire detection device is a wide-angle action camera with telemetry-based data output. A TS832 transmitter installed on the quadcopter sends video to a TC832 receiver on the ground at a frequency of 5.8 GHz.

© Kaliyev D.I., 2018.

Scientific supervisor: Olga Yakovlevna Shvets, Candidate of Technical Sciences, Associate Professor, D. Serikbayev East Kazakhstan State Technical University, Kazakhstan.

The quadcopter has a power module that converts the battery voltage (3S, 11.1 V) into the supply voltage of the control electronics (5 V). The power board is equipped with current and voltage sensors to monitor the battery charge. The power source is a 5000 mAh 3-cell battery. The quadcopter assembly is shown in Fig. 1.

Fig. 1. Quadcopter with camera for fire exploration

Fire detection. Analysis of digital video. Monitoring systems can be based on the analysis of photographs or video sequences, in other words, static or dynamic images. There are two main approaches: detection of moving objects and color analysis.

The basic idea of the algorithm (Fig. 2) is that an optimally selected combination of approaches, each of which determines a specific flame feature with high accuracy, should detect the presence of flame in video frames received from the quadcopter camera with high probability.

The flame has a large number of distinctive features, such as color, variability (dynamics), shape, the behavior of the smoke that accompanies it, etc. In this work, following [1], attention is focused on two basic features: color and variability (dynamics).

Fig. 2. Fire Detection Algorithm

The principle of detecting moving objects is often used to isolate fires, either by subtracting successive frames or by subtracting a background image [2]. In the first case, the method finds the changes in the image from one frame to the next. Its main disadvantage is that overlapping areas in the images can be mistakenly taken as background.
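As a rough illustration of successive-frame subtraction (not code from this paper; OpenCV 3+ is assumed and the threshold value is arbitrary), consecutive frames can be differenced and thresholded like this:

// Sketch of moving-object detection by subtracting successive frames.
// Assumes OpenCV 3+; the threshold value 25 is only illustrative.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture capture(0);              // camera index or video file path
    cv::Mat frame, gray, prevGray, diff, motionMask;

    while (capture.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        if (!prevGray.empty()) {
            cv::absdiff(gray, prevGray, diff);                        // per-pixel change
            cv::threshold(diff, motionMask, 25, 255, cv::THRESH_BINARY);
            cv::imshow("motion", motionMask);
        }
        gray.copyTo(prevGray);
        if (cv::waitKey(30) == 27) break;     // Esc to quit
    }
    return 0;
}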

In the case of subtracting a background image, dynamic areas are extracted from a static background image; the main disadvantage is that an area can be extracted erroneously if the background image is not updated in time or is updated incorrectly. However, this method can be used to assess the characteristics of the fire, for example, to measure the coordinates of the fire front.
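For background subtraction, OpenCV's adaptive mixture-of-Gaussians model can serve as a sketch. Again, this is an assumption-laden example rather than the author's code: it uses the OpenCV 3+ API, an invented input file name, and illustrative parameters.

// Sketch of background subtraction with an adaptive mixture-of-Gaussians model.
// Assumes the OpenCV 3+ API; history length and variance threshold are illustrative.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture capture("flight.mp4");   // hypothetical input video
    cv::Ptr<cv::BackgroundSubtractorMOG2> subtractor =
        cv::createBackgroundSubtractorMOG2(500, 16.0, true);
    cv::Mat frame, fgMask;

    while (capture.read(frame)) {
        // The background model is updated on every call, so it adapts over time
        // instead of relying on a single fixed background image.
        subtractor->apply(frame, fgMask);
        cv::imshow("foreground", fgMask);
        if (cv::waitKey(30) == 27) break;     // Esc to quit
    }
    return 0;
}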

Another way to detect areas of fire is color analysis. Specific implementations of this method are based on the analysis of color spaces, abstract mathematical color models that describe each color as a set of three or four numbers. The most common color models are listed below (a conversion sketch follows the list) [6]:

-RGB describes each color with three coordinates corresponding to the decomposition of the color into red, green and blue components;

-YCbCr is one of the ways of encoding RGB information, where Y is the luma (brightness) component and Cb and Cr are the blue-difference and red-difference chroma components;

-HSL (HSI) describes each color with three coordinates: hue, saturation, lightness [8];

-HSV describes each color with three coordinates: hue, saturation, value.
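The conversion sketch mentioned above is shown for a single frame. OpenCV 3+ is assumed, OpenCV stores images in BGR channel order by default, and the input file name is hypothetical.

// Sketch: converting one BGR frame into the color spaces listed above.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::Mat bgr = cv::imread("frame.jpg");    // hypothetical input frame
    if (bgr.empty()) return 1;

    cv::Mat ycrcb, hls, hsv;
    cv::cvtColor(bgr, ycrcb, cv::COLOR_BGR2YCrCb); // Y, Cr, Cb components
    cv::cvtColor(bgr, hls,   cv::COLOR_BGR2HLS);   // hue, lightness, saturation
    cv::cvtColor(bgr, hsv,   cv::COLOR_BGR2HSV);   // hue, saturation, value

    // Example: read the luma (Y) component of the top-left pixel.
    cv::Vec3b pixel = ycrcb.at<cv::Vec3b>(0, 0);
    std::printf("Y = %d\n", pixel[0]);
    return 0;
}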

Based on the methods describing the flame in the RGB color model, a pixel (x, y) is considered to belong to the flame image if it satisfies the following system of rules [3]:

R(x, y) > R_mean,
R(x, y) > G(x, y) > B(x, y),
R_mean = (1/K) * Σ_{i=1..K} R(x_i, y_i),

where R(x, y), G(x, y), B(x, y) are the red, green and blue values of pixel (x, y), K is the total number of pixels, and R_mean is the average red intensity over the frame. In addition to these rules, the saturation of each possible flame pixel must be greater than a certain threshold value [5].
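A direct, hedged implementation of the first two rules could look like the sketch below. The saturation rule is omitted for brevity; the function name, the use of OpenCV and the input file are assumptions for illustration, not the author's code.

// Sketch of the RGB rules above: a pixel is a flame candidate when its red
// value exceeds the mean red intensity of the frame and the channels satisfy
// R > G > B.
#include <opencv2/opencv.hpp>
#include <cstdio>

cv::Mat flameCandidates(const cv::Mat& bgr) {
    double rMean = cv::mean(bgr)[2];          // mean red intensity (BGR order)

    cv::Mat mask(bgr.rows, bgr.cols, CV_8UC1, cv::Scalar(0));
    for (int y = 0; y < bgr.rows; ++y) {
        for (int x = 0; x < bgr.cols; ++x) {
            const cv::Vec3b& p = bgr.at<cv::Vec3b>(y, x);
            uchar b = p[0], g = p[1], r = p[2];
            if (r > rMean && r > g && g > b)
                mask.at<uchar>(y, x) = 255;   // candidate flame pixel
        }
    }
    return mask;
}

int main() {
    cv::Mat frame = cv::imread("frame.jpg");  // hypothetical input frame
    if (frame.empty()) return 1;
    cv::Mat mask = flameCandidates(frame);
    std::printf("candidate pixels: %d\n", cv::countNonZero(mask));
    return 0;
}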

In some cases the flame image can be described in the YCbCr color space, using a fuzzy inference system [4] to decide whether a pixel belongs to a flame image.

The dynamic characteristics of fire make it possible to distinguish it from other objects of similar color. Temporal changes in intensity are analyzed for each pixel over several consecutive frames [8]. If these changes exceed a certain threshold, the pixel is taken to belong to the flame image. It is assumed that the height of the flame changes over time due to the movement of its tongues, so height is the main dynamic characteristic of the flame [9]. In some cases, the history of changes in the red channel of each pixel belonging to the fire contour in an RGB image is recorded over a short period of time [9]. These data are then used as input for the analysis.
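As a hedged sketch of this temporal criterion (the window length and threshold are invented for illustration and are not taken from the paper), the red-channel history of a pixel can be kept in a short buffer and the pixel flagged when its variation exceeds a threshold:

// Sketch: keep a short history of a pixel's red-channel values and flag the
// pixel as "flickering" when the spread within the window exceeds a threshold.
// Window length and threshold are illustrative values only.
#include <algorithm>
#include <cstddef>
#include <deque>

bool isFlickering(std::deque<unsigned char>& history, unsigned char redValue,
                  std::size_t window = 10, int threshold = 30) {
    history.push_back(redValue);
    if (history.size() > window)
        history.pop_front();
    if (history.size() < window)
        return false;                                 // not enough frames yet
    auto mm = std::minmax_element(history.begin(), history.end());
    return (*mm.second - *mm.first) > threshold;      // strong temporal change
}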

A simple fire detection program (Fig. 3). The main aim of this example is to automatically detect fire in video (Fig. 4) using computer vision methods, implemented in real time with the aid of the OpenCV library. From the computer vision and image processing point of view, the stated problem corresponds to the detection of a dynamically changing object based on its color and motion features [12].

When static cameras are used, a background subtraction method provides effective segmentation of dynamic objects in the video sequence. Candidate fire-like regions among the segmented foreground objects are then determined by rule-based color detection.


// Retrieve current video frame
capture.retrieve(frame);

// Update background model and save foreground mask
BackgroundSubtractorMOG2 pMOG2;
Mat fgMaskMOG2;
pMOG2(frame, fgMaskMOG2);

// Convert the current 8-bit frame in RGB color space to a 32-bit floating point YCbCr color space
frame.convertTo(temp, CV_32FC3, 1 / 255.0);
cvtColor(temp, imageYCrCb, CV_BGR2YCrCb);

// For every frame pixel, check whether it is foreground and meets the expected fire color features
colorMask = Mat(frame.rows, frame.cols, CV_8UC1);
for (int i = 0; i < imageYCrCb.rows; i++) {
    const uchar* fgMaskValuePt = fgMaskMOG2.ptr<uchar>(i);
    uchar* colorMaskValuePt = colorMask.ptr<uchar>(i);
    for (int j = 0; j < imageYCrCb.cols; j++) {
        if (fgMaskValuePt[j] > 0 && isFirePixel(i, j))
            colorMaskValuePt[j] = 255;
        else
            colorMaskValuePt[j] = 0;
    }
}
...

const int COLOR_DETECTION_THRESHOLD = 40;

bool isFirePixel(const int row, const int column) {
    // ... (valueY, valueCb, valueCr and the frame means meanY, meanCb, meanCr are obtained here)
    if (valueY > valueCb
        && valueCr > valueCb
        && (valueY > meanY && valueCb < meanCb && valueCr > meanCr)
        && ((abs(valueCb - valueCr) * 255) > COLOR_DETECTION_THRESHOLD))
        return true;
    ...
}

// Draw a bounding rectangle around the detected fire pixels
vector<Point> firePixels;
...
if (colorMaskPt[j] > 0)
    firePixels.push_back(Point(j, i));
...
rectangle(frame, boundingRect(firePixels), Scalar(0, 255, 0), 4, 1, 0);

Fig. 3. Fragment of the fire detection program using the OpenCV library

Fig. 4. Example of fire detection from video frame

Thus, the developed mobile robot can be used for fire exploration and fire detection. The research shows that the quadcopter is a multifunctional device that automatically executes a flight program from takeoff to landing and transmits the information needed to detect fire, using photographs of the controlled areas with subsequent computer processing of the data. The use of modern image processing tools can significantly improve the efficiency of solving many practical problems.

References

1. Nicholas True. Computer Vision Based Fire Detection. San Diego: University of California, 2009.

2. Toreyin B. U., Cetin A. E. Online detection of fire in video // IEEE Conf. on Computer Vision and Pattern. Recognition Proc. 2007. P. 1—5.

3. http://ardupilot-mega.ru/wiki/arducopter/build-your-own-multicopter.html

4. Celik T., Ozkaramanli H., Demirel H. Fire and smoke detection without sensors: image processing based approach // Europ. Signal Proc. Conf. 2007.

5. Chen T. H., Wu P. H., Chiou Y. C. An early fire-detection method based on image processing // IEEE Intern. Conf. on Image Proc. 2004. P. 1707—1710.

6. Gonzalez R. C., Woods R. E. Digital image processing. Prentice Hall, 2002.

7. Marjovi A., Nunes J.G., Marques L., Almeida A. Multi-Robot Exploration and Fire Searching // IEEE/RSJ International Conference on Intelligent Robots and Systems. St. Louis, USA, October 11-15, 2009.

8. Dost B., Genf M. Fire detection in video. Istanbul, Turkey.

9. Zhang J. H., Zhuang J., Du H. F. A new flame detection method using probability model // Intern. Conf. on Computational Intelligence and Security. 2006. P. 1614—1617.

10. Toreyin B. U., Dedeoglu Y., Gudukbay U., Cetin A. E. Computer vision based method for real-time fire and flame detection. 2006. N 27. P. 49—58.

11. Bradski G., Kaehler A. Learning OpenCV. O'Reilly, 2014.

12. Celik T., Demirel H. Fire detection in video sequences using a generic color model // Fire Safety Journal. 2016. Vol. 44, N 2. P. 147—158.

KALIYEV DANIYAR ISATAYULY - Master's student, D. Serikbayev East Kazakhstan State Technical University, Kazakhstan.
