Algorithm for correcting camera pan and tilt on aircraft based on recorded video
- Authors: Nikiforov D.L., Efimov S.N.
- Affiliation: Reshetnev Siberian State University of Science and Technology
- Issue: Vol. 25, No. 4 (2024)
- Pages: 433-439
- Section: Section 1. Computer Science, Computer Engineering and Management
- Published: 15.12.2024
- URL: https://journals.eco-vector.com/2712-8970/article/view/646541
- DOI: https://doi.org/10.31772/2712-8970-2024-25-4-433-439
- ID: 646541
Abstract
Due to the difficulties that currently arise when using satellite navigation systems at airfields, and the insufficient accuracy of inertial navigation systems, optical measuring systems have again come into use for trajectory measurements. However, existing measuring systems have a number of disadvantages. The purpose of this work is to describe a method for increasing the accuracy of trajectory measurements obtained by the goniometric method. The article reviews the main algorithms currently used in trajectory measurements and their shortcomings. An algorithm for frame-by-frame post-flight processing of video recorded by the cameras of an optical-electronic measuring complex is proposed. A description of the implementation of this algorithm is given, taking into account the specifics of graphical software interfaces for processing the algorithm user's input. The proposed algorithm makes it possible, after trajectory measurements have been carried out and without time restrictions, to correct the pan and tilt of the platform at each moment in time. This increases the accuracy of trajectory measurements in aircraft tests, both those already carried out and future ones. The proposed algorithm can also be used to obtain the required pan and tilt angles of the camera when implementing a goniometric direction-finding complex based on fixed wide-angle optical cameras, for example, when measuring the radiation patterns of an aircraft's onboard antennas with a quadcopter used as a measuring instrument, whose position in space must be determined at each moment in time. The article also presents the main advantages and disadvantages of the algorithm, makes proposals for its improvement, and suggests possible areas of its application.
Full text
Introduction
The main method of determining the position of an aircraft in space is the use of satellite navigation systems [1]. Currently, their use for trajectory measurements on the territory of airfields is impossible for a number of reasons [2]. Inertial navigation systems, currently used instead of satellite ones, accumulate significant errors during long flights [3]. In addition, this type of measurement requires installing additional equipment directly on the aircraft, which is not always possible for legal or technical reasons, for example, when using small-sized unmanned aerial vehicles with a limited payload [4].
One of the options to solve this problem is the use of optical systems [5–7], such as cine-theodolites [8], to calculate the position of the aircraft based on azimuth and elevation angles from two measuring points [9] (two-point direction finding method, shown in Fig. 1).
Fig. 1. The goniometric (direction-finding) method of determining coordinates
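For illustration only (not part of the article), the sketch below shows one simple way to implement the two-point method in Python: the horizontal position is recovered by intersecting the azimuth bearings from the two measuring points, and the altitude is then estimated from the elevation angle at the first point. All names and the coordinate convention are assumptions.

```python
import math

def locate_aircraft(p1, az1, p2, az2, el1):
    """Sketch of the two-point direction finding method: intersect the
    horizontal bearing rays from measuring points p1 and p2 (azimuths in
    radians, clockwise from north), then estimate the altitude from the
    elevation angle measured at p1. Coordinates: x = east, y = north."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.sin(az1), math.cos(az1))  # direction of the ray from p1
    d2 = (math.sin(az2), math.cos(az2))  # direction of the ray from p2
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel, position is undefined")
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 (Cramer's rule).
    t1 = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    x = x1 + t1 * d1[0]
    y = y1 + t1 * d1[1]
    # Altitude above p1 from its elevation angle and the horizontal range.
    z = math.hypot(x - x1, y - y1) * math.tan(el1)
    return x, y, z

# Two measuring points 1 km apart; the bearings cross at (500, 500), altitude ~125 m.
print(locate_aircraft((0.0, 0.0), math.radians(45), (1000.0, 0.0), math.radians(315), math.radians(10)))
```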
However, accurately tracking the movement of an aircraft is practically impossible. In manual tracking, the result is affected mainly by operator errors: an operator cannot keep a fast-moving object perfectly in the center of the frame because of the relatively high human reaction time, and may also simply make mistakes due to the human factor.
Using computer vision algorithms simplifies the task somewhat, but does not solve the problem completely. Brightness- and contrast-based computer vision algorithms are extremely sensitive to weather conditions and extraneous objects in the frame [10]. More advanced algorithms based on machine learning can solve this problem quite effectively, but they require more computing power and a large training sample for each individual aircraft model to be tracked. Developing such algorithms for small-scale tests is therefore economically unprofitable.
In addition, even with an error-free computer vision algorithm, it is fundamentally impossible to overcome the limited rotation speed of the pan-tilt mechanism. Moreover, operating at the maximum rotation speed is not always feasible even if the pan-tilt mechanism allows it: the camera installed on it can be quite heavy [11], and abrupt speed changes can cause the pan-tilt mechanism to fail.
A solution could be a frame-by-frame post-flight processing algorithm, in which the operator, not limited by time constraints or the physical characteristics of the measuring complex, can indicate the position of the aircraft with pixel accuracy [12] and specify the pan and tilt angles of the camera relative to the center of the frame [13].
Algorithm description
The essence of post-flight processing is as follows.
A post-flight processing operator reviews the recorded flight video. The operator moves to the frame (Fig. 2) corresponding to the required moment in time, points the cursor at the specific point on the aircraft chosen for trajectory measurements, and, by pressing the mouse button, starts the algorithm, which corrects the aircraft sight angles.
Fig. 2. Approximate frame layout during post-flight processing
Since the operator's monitor resolution, frame display area size, and aspect ratio may differ from those of the original video and may change during post-flight processing, the algorithm must not be tied to the pixel resolution; instead, it works relative to the horizontal and vertical viewing angles.
The APIs that provide the current position of the mouse cursor return it relative to the upper-left corner of the display area [14]. But since the target sighting angles for each frame are set relative to its center, it is necessary to transform the coordinates of the mouse cursor. The position of the mouse cursor relative to the center of the frame can be calculated using the following formula:
$$x_c = x - \frac{w}{2}, \qquad y_c = y - \frac{h}{2},$$

where $x$ and $y$ are the position of the mouse pointer relative to the upper left corner of the frame, and $w$ and $h$ are the total number of pixels in the frame in width and height, respectively.
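As a minimal sketch of this click-handling step (assuming the Qt for Python API cited in [14]; the widget and program structure are illustrative, not the article's software), the event position is indeed reported relative to the widget's top-left corner:

```python
import sys
from PySide6.QtWidgets import QApplication, QLabel

class FrameView(QLabel):
    """Widget standing in for the frame display area of a post-flight
    processing tool; all names here are illustrative only."""
    def mousePressEvent(self, event):
        pos = event.position()  # QPointF relative to the widget's top-left corner [14]
        print(f"clicked at x = {pos.x():.1f}, y = {pos.y():.1f} px")
        super().mousePressEvent(event)

if __name__ == "__main__":
    app = QApplication(sys.argv)
    view = FrameView("Click where the aircraft is")
    view.resize(640, 360)
    view.show()
    sys.exit(app.exec())
```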
Now we need to move from a linear representation in pixels to an angular representation in degrees or radians. To do this, we divide the frame into quadrants and take the axes denoting the width and height of the frame to be equal to one aperture angle horizontally and vertically, respectively. Consequently, points along these axes have coordinates in the range from –0.5 to +0.5, which eliminates the need to know the resolution of the original video in order to express a pixel value in angular measure. The coefficients of linear displacement of the target relative to the frame center can be calculated as follows:

$$k_x = \frac{x_c}{w}, \qquad k_y = \frac{y_c}{h}.$$
The angular displacement of the target relative to the center of the frame can be obtained by multiplying the lens aperture angles by the corresponding coefficients obtained earlier:
$$\Delta\alpha = \alpha_h \cdot k_x, \qquad \Delta\beta = \alpha_v \cdot k_y,$$

where $\alpha_h$ and $\alpha_v$ are the horizontal and vertical lens aperture angles, respectively.
Now, to obtain more accurate pan and tilt angles of the camera toward the aircraft, it is necessary to add the obtained angular offsets to the pan and tilt angles of the frame center.
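A minimal Python sketch of the whole conversion described above; the function signature and parameter names (`aperture_h`, `frame_pan`, and so on) are illustrative rather than taken from the article, and the tilt sign convention is an assumption:

```python
from dataclasses import dataclass

@dataclass
class SightAngles:
    pan: float   # degrees
    tilt: float  # degrees

def correct_sight_angles(cursor_x: float, cursor_y: float,
                         frame_w: int, frame_h: int,
                         aperture_h: float, aperture_v: float,
                         frame_pan: float, frame_tilt: float) -> SightAngles:
    """Convert a mouse click (pixels from the frame's top-left corner) into
    corrected pan and tilt angles of the camera toward the aircraft."""
    # Cursor position relative to the frame center, in pixels.
    x_c = cursor_x - frame_w / 2
    y_c = cursor_y - frame_h / 2
    # Linear displacement coefficients in the range [-0.5, +0.5];
    # this removes the dependence on the original video resolution.
    k_x = x_c / frame_w
    k_y = y_c / frame_h
    # Angular displacement of the target relative to the frame center.
    d_pan = aperture_h * k_x
    d_tilt = aperture_v * k_y  # sign depends on the chosen tilt axis convention
    # Corrected angles: frame-center angles plus the angular offsets.
    return SightAngles(pan=frame_pan + d_pan, tilt=frame_tilt + d_tilt)

# Illustrative call: full-HD frame, 60x40-degree aperture, frame center at pan 12.0, tilt 3.5 degrees.
print(correct_sight_angles(1500, 300, 1920, 1080, 60.0, 40.0, 12.0, 3.5))
```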
Conclusion
The considered algorithm can be used in trajectory measurements to improve the accuracy [15] of aircraft position calculations. Figure 3 presents an example of a post-flight processing program that uses this algorithm. Moreover, the algorithm can be used in systems with fixed wide-angle cameras as the main method of determining the viewing angles of an aircraft. Such a system can be used to measure the radiation patterns of aircraft onboard antennas. However, when using wide-angle cameras, lens distortion must be taken into account.
Fig. 3. Post-flight processing of trajectory measurements of the Yakovlev MC-21
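One possible way to account for wide-angle lens distortion (an assumption, not the method described in the article) is to undistort the clicked pixel using a pre-computed OpenCV calibration before applying the pixel-to-angle conversion; the calibration values below are placeholders:

```python
import numpy as np
import cv2  # OpenCV is an assumption; the article does not name a library

# Camera matrix and distortion coefficients obtained beforehand from a standard
# chessboard calibration; the values below are placeholders only.
camera_matrix = np.array([[1000.0, 0.0, 960.0],
                          [0.0, 1000.0, 540.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

def undistort_click(u: float, v: float) -> tuple:
    """Map a clicked pixel to its undistorted position so that the
    pixel-to-angle conversion above can be applied to wide-angle footage."""
    pts = np.array([[[u, v]]], dtype=np.float64)
    out = cv2.undistortPoints(pts, camera_matrix, dist_coeffs, P=camera_matrix)
    return float(out[0, 0, 0]), float(out[0, 0, 1])

print(undistort_click(1800.0, 1000.0))  # illustrative click near the frame edge
```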
The main disadvantage of this algorithm is the need to manually specify the position of the aircraft in each frame, which makes its use for long flights labor-intensive. One way to mitigate this is to sample only key frames at a certain time interval (for example, twice per second), depending on the requirements of the technical specifications. Alternatively, more resource-intensive computer vision algorithms can be used to automate frame-by-frame processing: they cannot be used in real time, but they are suitable when there are no time constraints on processing a single frame, since post-flight processing of the recorded video is involved.
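A sketch of such key-frame sampling, assuming the recorded video is read with OpenCV (the library choice, file name, and the 2 Hz default are illustrative):

```python
import cv2  # OpenCV video reading is an assumption; the article does not name a library

def key_frames(video_path: str, samples_per_second: float = 2.0):
    """Yield (frame_index, frame) pairs sampled at the requested rate, so the
    operator marks only key frames instead of every frame of a long flight."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if FPS metadata is missing
    step = max(1, round(fps / samples_per_second))
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            yield index, frame
        index += 1
    cap.release()

# for i, frame in key_frames("flight_record.avi"):  # hypothetical file name
#     ...  # show the frame to the operator for marking
```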
The main advantage of this algorithm is the ability to obtain the highest possible accuracy when conducting trajectory measurements with cine-theodolites, since frame-by-frame processing makes it possible to consistently obtain angles of the same point on the aircraft.
About the authors
Danil Nikiforov
Reshetnev Siberian State University of Science and Technology
Email: nikiforov-danil1997@yandex.ru
ORCID iD: 0009-0004-8238-3427
Postgraduate student
Russian Federation, 31, Krasnoyarskii rabochii prospekt, Krasnoyarsk, 660037
Sergey Efimov
Reshetnev Siberian State University of Science and Technology
Author responsible for correspondence.
Email: efimov@bk.ru
ORCID iD: 0000-0002-4506-3510
Cand. Sc., assistant professor, Department of Information and Control Systems
Russian Federation, 31, Krasnoyarskii rabochii prospekt, Krasnoyarsk, 660037
References
1. Hein G. W. Status, perspectives and trends of satellite navigation. Satellite Navigation. 2020, Vol. 1, No. 1, P. 22.
2. Gundorov K. V., Sulejmanov V. N., Medvedkov D. A. [Analysis of the satellite system operation in the conditions of disconnection of the Russian Federation from the navigation systems of unfriendly countries]. Vestnik voennogo innovacionnogo tehnopolisa “JeRA”. 2023, Vol. 4, No. 2, P. 175–183 (In Russ.).
3. Litvin M. A., Maljugina A. A., Miller A. B. et al. [Types of errors in inertial navigation systems and methods of their approximation]. Informacionnye processy. 2014, Vol. 14, No. 4, P. 326–339 (In Russ.).
4. Artjushin A. A., Kurbanov R. K., Marchenko L. A. et al. [Choosing a standard-sized range of unmanned aerial vehicles and payloads for monitoring agricultural fields]. Elektrotehnologii i elektrooborudovanie v APK. 2019, No. 4 (37), P. 36–43 (In Russ.).
5. Dodonov A. G., Putjatin V. G. [Ground optical, optoelectronic and laser-television means of trajectory measurements]. Matematicheskie mashiny i sistemy. 2017, No. 4, P. 30–56 (In Russ.).
6. Mussabayev R. R., Kalimoldayev M. N., Amirgaliyev Ye. N. et al. Calculation of 3D coordinates of a point on the basis of a stereoscopic system. Open Engineering. 2018, Vol. 8, No. 1, P. 109–117.
7. Enaleev S. F. Traektornye izmereniya [Trajectory measurements: a practical guide]. Moscow, Vologda, Infra-Inzheneriya Publ., 2021, 124 p. (In Russ.).
8. Gusev M. V. [The history of the development of trajectory optical measuring instruments]. Matrica nauchnogo poznaniya. 2023, No. 1-1, P. 57–65 (In Russ.).
9. Iskorkin D. V., Shishkov S. V., Terjoshin A. V. et al. Patent No. RU 2 645 549 C2. Sposob opredeleniya koordinat letatel'nyh apparatov s ispol'zovaniem odnogo direkcionnogo ugla i dvuh uglov mesta [A method for determining the coordinates of aircraft using one directional angle and two elevation angles]. No. 2015114888, 2018.
10. Hager G. D., Belhumeur P. N. Real-time tracking of image regions with changes in geometry and illumination. Proceedings CVPR IEEE Computer Society Conference on Computer Vision and Pattern Recognition. 1996, P. 403–410.
11. Rysenkov K. N., Vojchenko O. S., Zobov I. S. et al. Patent No. RU 2 758 860 C1. Sposob korrekcii uglov vizirovanija na tochku [A method for correcting the angles of sight to a point]. No. 2020133299, 2021.
12. Ko J.-H., Kim E.-S. Stereoscopic video surveillance system for detection of target's 3D location coordinates and moving trajectories. Optics Communications. 2006, Vol. 266, No. 1, P. 67–79.
13. Matsubara H., Tsukada T., Ito H. et al. A three-dimensional position measurement method using two pan-tilt cameras. R&D Review of Toyota CRDL. 2003, Vol. 38, No. 2, P. 43–49.
14. Synopsis – Qt for Python. Available at: https://doc.qt.io/qtforpython-6/PySide6/QtGui/QSinglePointEvent.html#PySide6.QtGui.QSinglePointEvent.position (accessed: 07.05.2024).
15. Guzevich S. N. Patent No. RU 2 533 348 C1. Opticheskiy sposob izmereniya razmerov i polozhenija obekta i dal'nomer-pelengator [Optical method for measuring the size and position of an object and a range finder]. No. 2013130715/28, 2014.