
Autonomous Streaming Space Objects Detection Based on a Remote Optical System

V.S. Baranova, V.A. Saetchnikov, A.A. Spiridonov

Belarusian State University, Nezavisimosti Ave., 4, Minsk 220030, Belarus

Received 01.10.2021

Accepted for publication 02.12.2021

Abstract

Traditional image processing techniques provide sustainable efficiency in the astrometry of deep space objects and in applied problems of determining the parameters of artificial satellite orbits. But computing architectures and the capabilities of small optical systems are developing rapidly, which encourages the use of a dynamic video stream for detecting and initializing space objects. The purpose of this paper is to automate the processing of optical measurement data when detecting space objects and in numerical methods for initial orbit determination.

This article presents the implementation of a low-cost autonomous optical system for detecting space objects with remote control elements. The basic algorithm model was developed and tested within the framework of remote control of a simplified optical system based on a Raspberry Pi 4 single-board computer with a modular camera. Under laboratory conditions, a satellite trajectory was simulated for an initial assessment of the compiled algorithmic modules of the OpenCV computer vision library.

Based on the simulation results, dynamic detection of the International Space Station in real time was performed from the observation site with coordinates longitude 25°41′49″ East, latitude 53°52′36″ North in the interval 00:54:00-00:54:30 on 17.07.2021 (UTC+03:00). The video processing result of the pass is presented as centroid coordinates of the International Space Station in the image plane with timestamps at 0.2 s intervals.

This approach provides autonomous extraction of raw data on a space object for numerical methods for the initial determination of its orbit.

Keywords: video stream, detection, embedded system, space object, OpenCV.

DOI: 10.21122/2220-9506-2021-12-4-272-279


Address for correspondence:

Baranova V.S.

Belarusian State University,

Nezavisimosti Ave., 4, Minsk 220030, Belarus

e-mail: rct.baranovaVS@bsu.by


For citation:

V.S. Baranova, V.A. Saetchnikov, A.A. Spiridonov.

Autonomous Streaming Space Objects Detection Based

on a Remote Optical System.

Devices and Methods of Measurements.

2021, vol. 12, no. 4, pp. 272-279.

DOI: 10.21122/2220-9506-2021-12-4-272-279



Introduction

Due to the active space activity of numerous countries, associated with the launch of an ever-growing number of satellites of various sizes and purposes, some altitudes of near-Earth orbit tend to become oversaturated with artificial space objects [1]. Deployment of various missions in medium and low orbits increases the risk of cascade collisions. It also creates electromagnetic pollution that hampers deep-space exploration at different wavelengths [2]. At the moment, according to approximate statistical estimates, the total number of artificial functional and non-functional space objects (SO) with a diameter of more than 1 cm in near-Earth orbit has reached 1 million [3, 4]. Only artificial space objects larger than approximately 10 cm, about 40,000 in number, are catalogued and actively tracked [1]. For this reason, continuous monitoring of outer space is an urgent task, not only for specialized radio (radar) and optical complexes but also for mobile optical systems for ground-based astronomical observations [5].

Existing optical systems for outer space monitoring use expensive wide-aperture telescopes with a narrow field of view to increase light sensitivity [6]. Serial observations of space objects produce a large number of large images, which require complex processing methods to extract and initialize the usable signal [7]. Machine learning techniques are being introduced into the databases for timely processing of incoming information and updating of catalogues. Monitoring and data processing by such specialized optical systems is encapsulated and provided in a limited format. Therefore, a trend is developing toward mobile astrometric observation systems in the optical range with low-cost hardware, open-source image processing modules [8, 9], and numerical methods for initial orbit determination [10, 11]. Formalized approaches to serial surveys are focused on obtaining images of a passing space object's track with the intended exposure. Subsequent processing involves extracting the angular coordinates of the SO from the pixel representation by overlaying the field of the calibrated frame on separate parts of astronomical atlases in interactive software environments [12]. In this particular case, the detection and astrometric initialization methods use individual images and different software modules at each processing phase.

Computer vision algorithms and modern computing architectures are tools for optimizing existing approaches to monitoring and detecting SO. Real-time video processing techniques allow the development of autonomous programmable recognition systems for astrometric measurements. Such systems integrate all processing modules: detection, filtering, segmentation, astrometric calibration, conversion of pixel values into angular coordinates, classification, etc. [13]. Pipeline processing of a video stream in applied problems of orbital monitoring has been studied [14]. Automatic detection and extraction of the mathematical properties of space objects in real time by remote video systems solves the problem of manually processing a large volume of astrometric data and makes it possible to calculate orbital parameters under observation conditions [15, 16].

The purpose of this paper is to automate the processing of optical measurement data in problems of detecting space objects and in numerical methods for the initial determination of orbits.

Devices and software modules

An autonomous embedded SO detection system can be implemented with low-cost devices: a computing board and a modular camera. The functional elements of the design were a Raspberry Pi 4B single-board computer, a Raspberry Pi High Quality camera with a Sony IMX477R sensor, and a 16 mm telephoto lens. The main features of these elements are presented in Table 1.

The GPIO header and the MIPI CSI camera interface allow the Raspberry Pi to be used as an embedded system for receiving data from various sensors, controlling PWM signals, and creating automated optical applications [17].

There are several ways to integrate the modular camera with the Raspberry Pi in software. The picamera package was developed to support a pure Python interface [18] to the Raspberry Pi modular camera. The picamera package includes several modules whose classes are ranked by the stages of raw visual data processing. Encoders, color spaces, n-dimensional arrays of camera output, exception handling, rendering, and streaming classes are all available from the picamera namespace.

Table 1
Optical elements parameters

Camera: Raspberry Pi High Quality Camera (sensor: Sony IMX477R), Telephoto 16 mm lens

F/D (f-ratio):              1.4-16
F (focal length):           16 mm
η (quantum efficiency):     0.8 (450-650 nm)
σ (read noise):             6.2 e⁻
μ (pixel size):             1.55 × 1.55 μm
q (A/D conversion factor):  7.2 e⁻

Nevertheless, in video stream processing the computational speed of the compiled algorithm is essential. The functions of the OpenCV computer vision library [19] are faster than many of the picamera namespace functions. The OpenCV library is open source; as part of this work, the software was assembled with a cross-platform compilation control system using CMake configuration files. However, the OpenCV library cannot be used in pure form with the Raspberry Pi camera module: the VideoCapture class, whose functions capture frames from a connected camera or file, is incompatible with the Raspberry Pi camera module.
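The paper's own capture code is not published; a common bridging pattern (an assumption here, not the authors' implementation) is to pull frames from the picamera API into NumPy arrays that OpenCV functions accept directly. The sketch below uses picamera's PiCamera/PiRGBArray classes when the package is present and falls back to synthetic noise frames otherwise, so the downstream processing can be exercised without the hardware:

```python
import numpy as np

def frame_source(width=640, height=480, n_frames=5):
    """Yield BGR frames as NumPy arrays suitable for OpenCV.

    On a Raspberry Pi this wraps picamera's capture_continuous() with a
    PiRGBArray buffer; without the hardware/package, synthetic noisy
    night-sky frames are generated instead (hypothetical stand-in).
    """
    try:
        from picamera import PiCamera
        from picamera.array import PiRGBArray
        with PiCamera(resolution=(width, height), framerate=32) as camera:
            buf = PiRGBArray(camera, size=(width, height))
            for i, _ in enumerate(camera.capture_continuous(
                    buf, format="bgr", use_video_port=True)):
                yield buf.array
                buf.truncate(0)  # reset the buffer for the next frame
                if i + 1 >= n_frames:
                    break
    except ImportError:
        rng = np.random.default_rng(0)
        for _ in range(n_frames):
            # near-black frame with faint Gaussian sensor noise
            yield rng.normal(10, 3, (height, width, 3)).clip(0, 255).astype(np.uint8)

frames = list(frame_source())
```

Each yielded array can be fed straight to OpenCV functions without an intermediate VideoCapture object.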

The main steps of the object motion detection algorithm are described below. The OpenCV library was used primarily, but the visual interface used the picamera namespaces directly. An SSH connection provides remote control of the Raspberry Pi. The general diagram of the autonomous optical surveillance system elements is illustrated in Figure 1.

Figure 1 - Autonomous optical surveillance system

Video stream processing

A simple frame difference method was used to implement the space object detection algorithm in video mode [20]. Under nighttime observation conditions, the background stars remain static over short time intervals, and the noise is random. Therefore, in the absence of jitter of the optical system, only objects move in the frame during the shooting.

The basic algorithm model is described by the following iterative stages: initial filtering, background initialization, frame differencing, determination of the detection threshold, and extraction of the morphological parameters of the object boundary contours (Figure 2).

The computational steps of video stream processing were performed in lightweight grayscale and binary representations.

Filtering

In image processing, particular attention is given to Gaussian random noise. The processes of accumulating and reading data from the camera sensor are sources of Gaussian noise [21]: dark noise and read noise, respectively. Temperature fluctuations during sensor operation and lighting conditions give rise to these noise components. To remove Gaussian noise, a filter with a 21×21 smoothing kernel was used; the kernel coefficients were determined by a standard deviation value of 3.5.

Background initialization

The physical conditions can be considered constant over short observation phases. Therefore, the first frame can be assigned as the background. The constant background and frame differencing set the detection threshold.

Setting the threshold

The absolute difference between the values of two pixel arrays is calculated. This removes the background component and returns the active areas as the difference. The thresholding function takes this difference as input and binarizes the original array. The detection algorithm used automatic determination of the threshold value by Otsu's method, based on algorithmic analysis of histograms [22].

Figure 2 - Motion detection by frame differentiation

Morphological patterns

To extract the mathematical pattern of a moving object (segment) and its morphological characteristics, standard OpenCV models for determining contours are used, based on the threshold value set at the previous processing stage. Contours are the boundaries of shapes with the same intensity. The properties, or moments, of contours include the geometric center, total intensity (contour area), and orientation [22]. The morphological characteristics were extracted in the form of the center coordinates (centroid) and radius.

The concept of space object detection

For space objects, it is necessary to determine their position in the field of view of the pointed optical system with an accurate time reference. For further calculation of orbital parameters, several positions must be fixed over a current time interval. Therefore, the result of video stream processing was organized as follows. Assume an object has been successfully detected in each successive frame. The centroid coordinates and the timestamps in UTC format are determined for several positions in the 2D plane with a defined frequency. The centroid position, date, and time are written to a text file. Additionally, frames with each recorded centroid are saved for visual checking. The frequency is set by the "minimum time of the last download" parameter, which refers to a particular object crossing the field of view. The output concept of the autonomous detection system operation described above is algorithmically illustrated in Figure 3.
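The recording logic can be sketched as below. The field layout, function names, and sample coordinates are illustrative (the paper does not publish its source); in the real system the timestamp comes from the system clock in UTC and the stream is a text file:

```python
import io
from datetime import datetime, timedelta, timezone

MIN_SAVE_INTERVAL = 0.2  # s; the "minimum time of the last download" parameter

def record_detection(log, cx, cy, now, last_saved):
    """Append 'date time x y' to the log if the interval has elapsed.

    Returns the time of the last accepted record; `log` is any writable
    text stream.
    """
    if last_saved is None or (now - last_saved).total_seconds() >= MIN_SAVE_INTERVAL:
        log.write(f"{now:%m/%d/%Y %H:%M:%S} {cx:.0f} {cy:.0f}\n")
        return now
    return last_saved

# Simulated detections arriving every 0.1 s: only every second one is recorded.
log = io.StringIO()
t0 = datetime(2021, 7, 17, 0, 54, 14, tzinfo=timezone.utc)
last = None
for k in range(6):
    last = record_detection(log, 863 - 45 * k, 5 + 80 * k,
                            t0 + timedelta(milliseconds=100 * k), last)
lines = log.getvalue().splitlines()
```

The gate on `last_saved` is what turns a 32 fps detection stream into the sparser 0.2 s record described in the text.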


Figure 3 - Timestamps of the detected object

Surveillance and photography of the International Space Station pass

Principal filming of space objects implies a preliminary prediction of their pass trajectory over the observation site. There are special programs that calculate the satellite position at a given point in time for such purposes; one of them is the Previsat program. The filming of the International Space Station over the observation site was predicted by Previsat in order to experimentally check the accuracy of its detection by the autonomous remote system. Table 2 shows the following pass parameters: the space object's pass time above the local horizon (Start date - Finish date), the elevation angle (Max Elevation), and the visual magnitude (Magn).

However, for more accurate pointing of the optical system, maximizing the chance of aiming the target object into the field of view, detailed parameters of a suitable pass are used, as Table 3 shows.

Table 2
The International Space Station pass parameters

Location of site: 025°41'49" East, 53°52'36" North, 133 m
Timezone: UTC+03:00

Satellite  Start date           Finish date          Max Elevation  Magn  Sun Elevation
ISS        2021/07/17 00:50:00  2021/07/17 00:55:00  45°35'52"      -1.6  -14°42'25"

Table 3
The International Space Station pass parameters

Date        Hour      Sat Azimuth  Sat Elev   Ra Sat     Decl Sat    Magn
2021/07/17  00:53:00  138°09'03"   49°51'27"  21h04m13s  +19°32'03"  -1.7
2021/07/17  00:53:30  120°54'09"   37°24'34"  22h15m05s  +14°29'27"  -1.2
2021/07/17  00:54:00  112°45'39"   27°52'05"  23h00m14s  +10°08'01"  -0.6
2021/07/17  00:54:30  108°13'37"   21°00'03"  23h30m04s  +06°44'19"   0.0
2021/07/17  00:55:00  105°23'16"   15°54'13"  23h51m06s  +04°03'56"  +2.6

The recorded output of the detection run - centroid coordinates (x, y) in the image plane with timestamps:

Date        Time     x    y
07/17/2021  0:54:14  863    5
07/17/2021  0:54:16  825   81
07/17/2021  0:54:17  779  168
07/17/2021  0:54:18  733  254
07/17/2021  0:54:19  690  334
07/17/2021  0:54:20  648  414
07/17/2021  0:54:21  605  494
07/17/2021  0:54:22  563  574
07/17/2021  0:54:23  518  659

According to the calculated pass parameters (Table 3), the optical system was pointed (by elevation and azimuth) at the SO for interval video shooting (at 32 fps). The autonomous optical surveillance system automatically detected the object, according to the algorithm described above, by video processing in tracking mode. For an informative demonstration of the results, the saved frames with the corresponding timestamps and centroid positions of the detected object were combined into one image, shown in Figure 4.

Figure 4 - The International Space Station frame-by-frame detection result

It is worth noting that each frame timestamp at the output takes on a value offset by the computation time of one processing iteration (Figure 4); essentially, it lags by the time required to process one frame. This can be called the timing mark error. The code processing time was estimated as the ratio of the total number of ticks spent on all iterations to the tick frequency per second (the internal OpenCV functions cv.getTickCount and cv.getTickFrequency, respectively). The timing mark error was 6 seconds and was taken into account in the source code when recording the timestamps.

The space object's pass duration (in the experimental video, the International Space Station) through the field of view was about 2 seconds from the moment of appearance until disappearance. According to one of the conditions, the pixel center position of the detected object in the frame was fixed every 0.2 s. In the end, the resulting text file included 9 tags with the corresponding data.

Conclusion

An approach to automating space object detection using video stream processing has been investigated. The implementation of a low-cost autonomous optical detection system with remote control was proposed, based on a Raspberry Pi 4 single-board computer and a modular camera. The underlying real-time detection algorithm architecture provides raw data extraction for numerical methods of initial orbit determination.

The space object detection concept was tested on the example of filming a pre-predicted pass of the International Space Station. The result showed sufficient accuracy in determining the space object's centroid position in the image plane, with the appropriate time reference, for solving the problems of initial orbit determination. This approach is notable for its speed and autonomy of execution in the format of remotely receiving output data for further conversion into the angular coordinates of the observed space object. It excludes manual processing of space object track images.

Direct-focus integration of a telescope with the proposed embedded system would allow autonomous real-time detection of space objects based on video stream processing. This system can also be used for initial space object detection in a mobile hardware-software unit for optical observations of space objects.

References

1. Walker C., Hall J. (eds.) Impact of Satellite Constellations on Optical Astronomy and Recommendations Toward Mitigations. [Electronic resource]: https://web.archive.org/web/20201129021356/https://aas.org/sites/default/files/2020-08/SATCON1-Report.pdf (accessed: 30.09.2021).

2. Gallozzi S., Scardia M., Roma M.M., Brera I.A., Trieste I. Concerns about ground based astronomical observations: a step to safeguard the astronomical sky. arXiv: Instrumentation and Methods for Astrophysics, 2020, 16 p.

3. Villela T., Costa C.A., Brandao A.M., Bueno F.T., Leonardi R. Towards the Thousandth CubeSat: A Statistical Overview. International Journal of Aerospace Engineering, 2019, vol. 2019, pp. 1-13.

DOI: 10.1155/2019/5063145

4. Satellite Box Score. Orbital Debris Quarterly News. NASA, 2021, vol. 25, no. 3, 12 p.

5. Shakun L., Koshkin N., Korobeynikova E., Kozhukhov D., Kozhukhov O., Strakhova S. Comparative analysis of global optical observability of satellites in LEO. Advances in Space Research, 2020, vol. 67(1), pp. 1743-1760.

DOI: 10.1016/j.asr.2020.12.021

6. Woods D.F., Shah R.Y., Johnson J.A., Szabo A., Pearce E.C., Lambour R.L., Faccenda W.J. Space Surveillance Telescope: focus and alignment of a three mirror telescope. Optical Engineering, 2013, vol. 52, no. 5, pp. 053604-1 - 053604-11.

DOI: 10.1117/1.OE.52.5.053604

7. Masias M., Freixenet J., Llado X., Peracaula M. A review of source detection approaches in astronomical images. Monthly Notices of the Royal Astronomical Society, 2012, vol. 422, iss. 2, pp. 1674-1689.

DOI: 10.1111/j.1365-2966.2012.20742.x

8. Schildknecht T., Hinze A., Schlatter P., Silha J., Peltonen J., Santti T., Flohrer T. Improved Space Object Observation Techniques using CMOS Detectors. Proceedings of 6th European Conference on Space Debris, Darmstadt, Germany, 2013.

9. Danescu R., Ciurte A., Turcu V. A Low Cost Automatic Detection and Ranging System for Space Surveillance in the Medium Earth Orbit Region and Beyond. Sensors, 2014, vol. 14, no. 2, pp. 2703-2731.

DOI: 10.3390/s140202703

10. Vallado D. Fundamentals of Astrodynamics and Applications. Hawthorne: Microcosm Press, 2013, 1106 p.

11. Spiridonov A.A., Saetchnikov V.A., Ushakov D.V., Cherny V.E., Kezik A.G. Small Satellite Orbit Determination Methods Based on the Doppler Measurements by Belarusian State University Ground Station. IEEE Journal on Miniaturization for Air and Space Systems, 2021, vol. 2, no. 2, pp. 59-66. DOI: 10.1109/JMASS.2020.3047456

12. Bonnarel F., Fernique P., Bienaymé O., Egret D., Genova F., Louys M., Ochsenbein F., Wenger M., Bartlett J.G. The ALADIN interactive sky atlas - A reference tool for identification of astronomical sources. Astron. Astrophys. Suppl. Ser., 2000, vol. 143(1), pp. 33-40. DOI: 10.1051/aas:2000331

13. Yonghui Xu, Zhang Jihui. Real-time Detection Algorithm for Small Space Targets Based on Max-median Filter. The Journal of Information and Computational Science, 2014, no. 11, pp. 1047-1055.

DOI: 10.12733/JICS20102961

14. Zhang Xueyang, Xiang Junhua, Zhang Yulin. Space Object Detection in Video Satellite Images Using Motion Information. International Journal of Aerospace Engineering, 2017, vol. 2017, 9 p.

DOI: 10.1155/2017/1024529

15. Diprima F., Santoni F., Piergentili F., Fortunato V., Abbattista C., Amoruso L., Cardona T. An efficient and automatic debris detection framework based on GPU technology, 2017. [Electronic resource]: https://api.semanticscholar.org/CorpusID:55225562 (accessed: 26.09.2021).

16. Masias Moyset M. Automatic source detection in astronomical images, 2014. CorpusID: 129994570. [Electronic resource]: https://www.semanticscholar.org/paper/Automatic-source-detection-in-astronomical-images-Moyset/cc5ab6eab73284687713782dfdaa377ac777b9b9 (accessed: 26.09.2021).

17. Monk S. Raspberry Pi Cookbook. Sebastopol, CA: O'Reilly Media, Inc., 1005 Gravenstein Highway North, 2016.

18. Beroiz M., Cabral J.B., Sanchez B. Astroalign: A Python module for astronomical image registration. Astronomy and Computing, 2020, vol. 32, p. 100384. DOI: 10.1016/j.ascom.2020.100384

19. Pulli Kari, Baksheev Anatoly, Kornyakov Kirill, Eruhimov Victor. Real-Time Computer Vision with OpenCV. Communications of the ACM, 2012, vol. 55, no. 6, pp. 61-69. DOI: 10.1145/2184319.2184337

20. Singh P., Deepak B.B.V.L., Sethi T., Mur-thy M.D.P. Real-time object detection and Tracking using color feature and motion. International Conference on Communications and Signal Processing (ICCSP), 2015, pp. 1236-1241. DOI: 10.1109/ICCSP.2015.7322705

21. Zhu H.J., Han B.C., Qiu B. Survey of Astronomical Image Processing Methods. In: Zhang YJ. (eds). Image and Graphics. Lecture Notes in Computer Science, Springer, Cham, 2015, vol. 9219, pp. 420-429.

DOI: 10.1007/978-3-319-21969-1_37

22. Zou Y., Zhao J., Wu Y., Wang B. Segmenting Star Images with Complex Backgrounds Based on Correlation between Objects and 1D Gaussian Morphology. Appl. Sci., 2021, vol. 11(9), p. 3763.

DOI: 10.3390/app11093763
