Real-Time Detection and Tracking of Military and Civilian Vehicles Using YOLOv9 and DeepSORT
Dr. B M Sagar
Department of Information Science & Engineering, RV College of Engineering, Bangalore, Karnataka, India
sagarbm@rvce.edu.in
Aishwarya
Department of Information Science & Engineering, RV College of Engineering, Bangalore, Karnataka, India
aishuborole12@gmail.com
Abstract— In response to the growing demands of national defense and border security, this paper presents a real-time, intelligent surveillance system for the detection, classification, and tracking of military and civilian vehicles. The system leverages two advanced deep learning architectures, YOLOv8 and YOLOv9, for high-speed object detection, trained on a publicly available labeled dataset containing diverse vehicle types. In a comparative analysis using precision, recall, F1-score, and mAP metrics, YOLOv9 emerged as the superior model and was integrated with the DeepSORT tracking algorithm to maintain consistent object identities across video frames. The final system is implemented in Python and supports real-time performance on both GPU-enabled platforms and edge devices such as the NVIDIA Jetson. In extensive testing, YOLOv9 achieved a mean Average Precision (mAP@0.5) of 76.8% and an inference speed of 52 FPS, making it suitable for real-time deployment in defense scenarios, and the system accurately distinguished between military and civilian vehicles under varied conditions, offering a scalable, robust solution for defense surveillance. When integrated with DeepSORT, the system maintained over 90% tracking consistency, even in the presence of occlusion and fast motion. This work lays the groundwork for future developments such as behavioral anomaly detection, automated alerts, and multi-camera integration, thereby enhancing situational awareness and decision-making in sensitive environments.
Keywords—Object Detection, YOLOv9, Real-Time Vehicle Tracking, Military Surveillance, DeepSORT, Streamlit, Deep Learning.
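As a concrete illustration of the pipeline summarized in the abstract, the following is a minimal Python sketch of a per-frame YOLOv9 detection loop feeding DeepSORT for identity-consistent tracking. It assumes the open-source ultralytics and deep-sort-realtime packages, a pretrained yolov9c.pt checkpoint, and a hypothetical input video patrol_feed.mp4; the paper's actual implementation (including its Streamlit interface and custom-trained weights) may differ.

    # Minimal sketch: YOLOv9 detection + DeepSORT tracking on a video stream.
    # Assumes the `ultralytics` and `deep-sort-realtime` packages; the paper's
    # exact implementation and trained weights are not specified here.
    import cv2
    from ultralytics import YOLO
    from deep_sort_realtime.deepsort_tracker import DeepSort

    model = YOLO("yolov9c.pt")       # pretrained checkpoint (assumed name)
    tracker = DeepSort(max_age=30)   # drop tracks unseen for 30 frames

    cap = cv2.VideoCapture("patrol_feed.mp4")  # hypothetical input video
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # Run YOLOv9 detection on the current frame.
        result = model(frame, verbose=False)[0]
        detections = []
        for box in result.boxes:
            x1, y1, x2, y2 = box.xyxy[0].tolist()
            conf = float(box.conf[0])
            cls = int(box.cls[0])
            # DeepSORT expects ([left, top, width, height], confidence, class).
            detections.append(([x1, y1, x2 - x1, y2 - y1], conf, cls))

        # Update DeepSORT: it associates detections across frames
        # and assigns persistent track IDs.
        tracks = tracker.update_tracks(detections, frame=frame)
        for track in tracks:
            if not track.is_confirmed():
                continue
            l, t, r, b = map(int, track.to_ltrb())
            cv2.rectangle(frame, (l, t), (r, b), (0, 255, 0), 2)
            cv2.putText(frame, f"ID {track.track_id}", (l, t - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

The key design point this sketch reflects is the loose coupling the abstract describes: the detector can be swapped (e.g., YOLOv8 for YOLOv9) without touching the tracker, since DeepSORT consumes only bounding boxes, confidences, and class labels.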