Application of a Computer Vision Guidance System for a Small Unmanned Loitering Attack Munition

Authors

  • Saowakon Pummalee, Graduate School, Navaminda Kasatriyadhiraj Royal Air Force Academy
  • Nattagorn Pramona, Department of Aeronautical Engineering, Navaminda Kasatriyadhiraj Royal Air Force Academy
  • Prasatporn Wongkamchang, Graduate School, Navaminda Kasatriyadhiraj Royal Air Force Academy
  • Janewit Kampoon, Graduate School, Navaminda Kasatriyadhiraj Royal Air Force Academy

Keywords:

image processing, guidance, navigation, loitering munition

Abstract

The goal of this research is to develop a visual guidance system for a small unmanned loitering attack munition designed to attack ground targets. Artificial intelligence in computer vision is applied to calculate the location of the target and to compute a trajectory for controlling the movement of the multi-rotor unmanned aerial vehicle toward it. Six flight tests with different visual orientations were performed, in which the vehicle flew at a height of 80 meters and then reduced altitude to attack a stationary ground target marked by a 3 m × 3 m color-strip circle. Comparing the performance of YOLO v5 configurations and making appropriate adjustments, the model with a conf-thres value of 0.001 performed best, achieving higher Precision, Recall, and F1 score than the other models. The trajectory is then calculated from the difference between the center of the target and the center of each frame at different heights, and this trajectory will be used in the design of the control system of an unmanned aircraft to attack targets in the future.
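To make the trajectory step concrete, the sketch below is a minimal Python illustration of how the offset between a detected target's center and the frame center can be converted into an approximate ground-plane offset at a given altitude, assuming a nadir-pointing camera and a simple pinhole model. The bounding box is assumed to come from a YOLOv5-style detector run at a low confidence threshold; the frame size, field-of-view values, and example detection are placeholder assumptions, not values reported in the paper.

    import math

    def pixel_offset_to_ground_offset(box, frame_w, frame_h, altitude_m,
                                      hfov_deg, vfov_deg):
        """Convert the pixel offset between a detected target's center and the
        image center into an approximate ground-plane offset in meters,
        assuming a downward-pointing camera at the given altitude.

        box: (x1, y1, x2, y2) bounding box in pixels, e.g. from a YOLOv5 detection.
        hfov_deg, vfov_deg: assumed horizontal/vertical fields of view of the camera.
        """
        # Center of the detected target and of the frame (pixels)
        target_cx = (box[0] + box[2]) / 2.0
        target_cy = (box[1] + box[3]) / 2.0
        dx_px = target_cx - frame_w / 2.0
        dy_px = target_cy - frame_h / 2.0

        # Ground footprint of the image at this altitude (pinhole model)
        ground_w = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
        ground_h = 2.0 * altitude_m * math.tan(math.radians(vfov_deg) / 2.0)

        # Pixel offset -> meters, using meters per pixel at this altitude
        dx_m = dx_px * ground_w / frame_w
        dy_m = dy_px * ground_h / frame_h
        return dx_m, dy_m

    if __name__ == "__main__":
        # Hypothetical detection of the 3 m x 3 m ground marker seen from 80 m
        example_box = (860, 500, 940, 580)  # placeholder detector output
        print(pixel_offset_to_ground_offset(example_box, 1920, 1080,
                                            altitude_m=80.0,
                                            hfov_deg=62.2, vfov_deg=48.8))

Repeating this calculation at successive altitudes during the descent would yield the sequence of offsets from which a trajectory, and ultimately the vehicle's control commands, can be derived.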

References

Altug, E., Ostrowski, J. P., & Taylor, C. J. (2005). Control of a quadrotor helicopter using dual camera visual feedback. The International Journal of Robotics Research, 24(5), 329-341. https://d1.amobbs.com/bbs_upload782111/files_29/ourdev_559047.pdf

Arafat, M. Y., Alam, M. M., & Moh, S. (2023). Vision-based navigation techniques for unmanned aerial vehicles: Review and challenges. Drones, 7(2), 89. https://doi.org/10.3390/drones7020089

Beard, R. W., & McLain, T. W. (2012). Small unmanned aircraft: Theory and practice. New Jersey: Princeton University Press

Bochkovskiy, A., Wang, C., & Liao, H. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. ArXiv, 2004, 10934. https://doi.org/10.48550/arXiv.2004.10934

Li, C., Li, L., Jiang, H., Weng, K., Geng, Y., Li, L., Ke, Z., Li, Q., Cheng, M., Nie, W., Li, Y., Zhang, B., Liang, Y., Zhou, L., Xu, X., Chu, X., Wei, X., & Wei, X. (2022). YOLOv6: A single-stage object detection framework for industrial applications. ArXiv, 2209, 02976. https://doi.org/10.48550/arXiv.2209.02976

Conte, G., & Doherty, P. (2008). An integrated UAV navigation system based on aerial image matching. 2008 IEEE Aerospace Conference (pp. 1-10). USA: IEEE

Heikkila, J., & Silvén, O. (1997). A four-step camera calibration procedure with implicit image correction. In Proceedings of 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (pp. 1106-1112). Washington DC: IEEE

Hosseinpoor, H. R., Samadzadegan, F., & DadrasJavan, F. (2017). Pricise target geolocation and tracking based on UAV video imagery. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLI-B6, 2016 XXIII ISPRS Congress, 12–19 July 2016 (pp. 243–249). Prague, Czech Republic: ISPRS. https://doi.org/10.5194/isprs-archives-XLI-B6-243-2016

Jocher, G., Chaurasia, A., & Qiu, J. (2023). Ultralytics YOLOv8. Retrieved from https://github.com/ultralytics/ultralytics

Koo, T. J., Hoffmann, F., Sinopoli, B., & Sastry, S. (1998). Hybrid control of an autonomous helicopter. Proceedings of IFAC Workshop on Motion Control, Grenoble (pp. 1-6). France: IFAC. https://bit.ly/4fltZgb

Kunertova, D. (2023). The war in Ukraine shows the game-changing effect of drones depends on the game. Bulletin of the Atomic Scientists, 79(2), 95–102. https://doi.org/10.1080/00963402.2023.2178180

Liu, X., Yang, Y., Ma, C., Li, J., & Zhang, S. (2020). Real-time visual tracking of moving targets using a low-cost unmanned aerial vehicle with a 3-axis stabilized gimbal system. Applied Sciences, 10(15), 5064. https://doi.org/10.3390/app10155064

Lu, Y., Xue, Z., Xia, G. S., & Zhang, L. (2018). A survey on vision-based UAV navigation. Geo-Spatial Information Science, 21(1), 21–32. https://doi.org/10.1080/10095020.2017.1420509

Luo, P., & Pei, H. (2007). An Autonomous Helicopter with Vision Based Navigation. IEEE International Conference on Control and Automation (pp. 2595-2599). Guangzhou, China: IEEE

Nepal, U., & Eslamiat, H. (2022). Comparing YOLOv3, YOLOv4 and YOLOv5 for autonomous landing spot detection in faulty UAVs. Sensors (Basel, Switzerland), 22(2), 464. https://doi.org/10.3390/s22020464

Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 779-788). Las Vegas, USA: IEEE

Redmon, J., & Farhadi, A. (2018). YOLOv3: An incremental improvement. ArXiv, 1804, 02767.

Voskuijl, M. (2022). Performance analysis and design of loitering munitions: A comprehensive technical survey of recent developments. Defence Technology, 18(3), 325-343. https://doi.org/10.1016/j.dt.2021.08.010

Wang, C. Y., Bochkovskiy, A., & Liao, H. Y. M. (2022). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. ArXiv, 2207, 02696. https://doi.org/10.48550/arXiv.2207.02696

Wang, X., Li, W., Guo, W., & Cao, K. (2021). SPB-YOLO: An efficient real-time detector for unmanned aerial vehicle images. Proceedings of the 2021 international conference on artificial intelligence in information and communication (ICAIIC) (pp. 99-104). Jeju Island: IEEE.

Zhang, J., Wan, G., Jiang, M., Lu, G., Tao, X., & Huang, Z. (2023). Small object detection in UAV image based on improved YOLOv5. Systems Science & Control Engineering, 11(1), 2247082. https://doi.org/10.1080/21642583.2023.2247082

Published

2024-12-12

How to Cite

Pummalee, S., Pramona, N., Wongkamchang, P., & Kampoon, J. (2024). Application of a Computer Vision Guidance System for a Small Unmanned Loitering Attack Munition. EAU Heritage Journal Science and Technology (Online), 18(3), 150–164. Retrieved from https://he01.tci-thaijo.org/index.php/EAUHJSci/article/view/271720

Section

Research Articles