Hybrid Aerial-Vehicular Data Integration for Enhanced Road Assessment and Adaptive Maneuvering in GPS-Denied Spaces
Abstract
The integration of hybrid aerial and vehicular systems has emerged as a promising solution for enhanced road assessment and adaptive maneuvering in GPS-denied environments. Traditional road monitoring systems struggle in dynamic environments where reliable geospatial data is unavailable, motivating advanced methods for data collection and analysis. This research proposes a novel framework that pairs unmanned aerial vehicles (UAVs) with ground vehicles to combine complementary sensor capabilities, improve situational awareness, and enable adaptive navigation. UAVs are equipped with LiDAR, high-resolution cameras, and inertial measurement units (IMUs), while ground vehicles contribute radar, ultrasonic sensors, and odometry for complementary data collection. A data fusion algorithm integrates the aerial and vehicular sensor streams, using simultaneous localization and mapping (SLAM) to construct precise road models. Key challenges addressed include robust real-time data synchronization, multi-modal sensor fusion, and environmental noise filtering. The proposed framework is evaluated in GPS-denied settings, including urban canyons and densely forested areas, demonstrating its ability to detect road anomalies, optimize vehicular trajectories, and ensure safe navigation. Results show that hybrid data integration improves road feature recognition by 28% and reduces maneuvering errors by 35% compared to vehicle-only systems. This study highlights the potential of hybrid systems to redefine road assessment and maneuvering strategies in complex environments, advancing the state of autonomous navigation.
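The abstract does not specify the fusion method in detail; as an illustrative sketch only, the core idea of merging an aerial and a vehicular estimate of the same road feature can be expressed as an inverse-variance (Kalman-style) weighted combination. The function name, sensor variances, and example coordinates below are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_estimates(z_uav, var_uav, z_ugv, var_ugv):
    """Inverse-variance fusion of two independent position estimates
    of the same road feature (e.g. a pothole's [x, y] location).

    z_uav, z_ugv : np.ndarray -- position estimates from each platform
    var_uav, var_ugv : float  -- scalar measurement variances

    Returns the fused estimate and its (reduced) variance.
    """
    w_uav = 1.0 / var_uav          # weight = inverse variance
    w_ugv = 1.0 / var_ugv
    fused = (w_uav * z_uav + w_ugv * z_ugv) / (w_uav + w_ugv)
    fused_var = 1.0 / (w_uav + w_ugv)   # always below either input variance
    return fused, fused_var

# Hypothetical example: aerial LiDAR fix (low noise) vs. ground radar fix
fused, var = fuse_estimates(np.array([10.0, 5.0]), 0.04,
                            np.array([10.6, 5.3]), 0.16)
```

The fused variance is smaller than either input variance, which is the quantitative sense in which combining aerial and vehicular streams can outperform a vehicle-only estimate.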