The rapid development of mobile robotics and the increasing autonomy of these devices bring with them a demand for precise and robust localization and navigation systems. The navigation systems currently used in the vast majority of unmanned robotic vehicles are based on GNSS and do not provide accurate and robust position information in environments with reduced satellite navigation signal quality.
The goal of the project was to integrate various localization methods into a hybrid navigation system for autonomous vehicles, intended primarily for environments with low GNSS signal availability and impaired visibility. Realizing such a system requires hardware sensors of various types (e.g. cameras, radars, LiDARs) and algorithms for fusing the measurements received from these sensors. The system developed in this project has a compact form that makes it applicable as a self-contained localization sensor for navigation support of autonomous vehicles of various types, both ground-based and airborne.
The developed system combines localization methods based on a stereoscopic depth camera, radar, and LiDAR. Each of these sensors has drawbacks, and none of them is suitable in all situations.
Sensors operating in the visible spectrum require an environment with a sufficient number of distinctive feature points. They are also affected by degraded observation conditions, whether insufficient scene lighting or, for example, dust or water particles in the air.
The LiDAR sensor is completely independent of the quality of the scene lighting and its diversity in the visible spectrum. On the other hand, it is also affected by particles in the air and depends strongly on the geometric diversity of the environment.
The radar-based sensor is very robust to environmental influences, but it is the least accurate of the three. It provides only a small amount of data, and the robustness of the resulting location estimate depends strongly on the geometric variability of the environment and on surface reflectivity.
Since the suitability of each sensor depends on the environment in which it is used, it is advantageous to combine the data from the different sensors into a single estimate using data fusion based on a Kalman filter.
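The fusion principle can be sketched with a minimal linear Kalman filter that sequentially incorporates position fixes from several sensors, each weighted by its own measurement covariance. This is an illustrative sketch, not the project's actual implementation: the constant-velocity motion model, the noise values, and the class name are assumptions made for the example.

```python
import numpy as np

class KalmanFusion:
    """Minimal linear Kalman filter fusing 2-D position fixes.

    Hypothetical sketch: motion model and noise values are assumed,
    not taken from the described system.
    """

    def __init__(self, dt=0.1):
        self.x = np.zeros(4)               # state: [px, py, vx, vy]
        self.P = np.eye(4)                 # state covariance
        self.F = np.eye(4)                 # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = np.eye(4) * 0.01          # process noise
        self.H = np.array([[1., 0., 0., 0.],   # every sensor here measures
                           [0., 1., 0., 0.]])  # position only

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, R):
        """Fuse one position measurement z with covariance R."""
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + R        # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

kf = KalmanFusion()
kf.predict()
kf.update(np.array([1.0, 2.0]), np.eye(2) * 0.05)  # e.g. an accurate LiDAR fix
kf.update(np.array([1.2, 2.1]), np.eye(2) * 1.00)  # e.g. a coarse radar fix
```

Because each measurement carries its own covariance, the accurate LiDAR fix dominates the estimate while the coarse radar fix contributes only a small correction, which is exactly the weighting behaviour the fusion relies on.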
The hybrid navigation system for unmanned aerial vehicles has been designed as a modular system: optional sensor-platform modules complement the mandatory system core. This modularity allows the end user to easily adjust the system configuration to the required accuracy of the location estimate and to the application scenario. The individual sensor platforms and their corresponding data-processing software differ in their suitability for different environments and applications. On one hand, a fully equipped system with all optional modules excels in the quality and robustness of the obtained location estimates; on the other, its higher weight and consumption of on-board resources must be taken into account in the case of unmanned aerial vehicles. The higher weight reduces flight time, and thus also the efficiency of the application scenarios themselves. For these reasons, it is advisable to equip the system for each application scenario with only the minimum necessary set of sensor platforms corresponding to the expected properties of the environment.
Our solution is a hybrid system based on switching individual operating modes of the system according to the current situation. In practice, this means that if we have information indicating that a sensor is highly inaccurate, the system automatically excludes that sensor's data from the solution entirely. This detects faulty measurements faster than simply encoding them through large covariances in the filter, eliminates erroneous values from the mathematical solution more decisively, and lets the estimate converge faster to the real state. The computational complexity is also reduced, as the filter works with less data. Last but not least, this principle allows prediction during planning: the relevant sensor can be disabled before it starts to provide inaccurate measurements. For example, if an aerial vehicle is about to fly from an outdoor environment into a building, the GNSS receiver can be excluded from the solution before it introduces an error and causes a temporary deviation of the position estimate, rather than waiting for the filter to detect the false GNSS measurements from their covariance.
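The hard-exclusion idea can be illustrated with a small standalone fusion step: a sensor flagged as unreliable (here GNSS, just before entering a building) is dropped from the computation entirely instead of being down-weighted through an inflated covariance. The function, sensor names, and noise values are illustrative assumptions, not the project's interface.

```python
import numpy as np

def fuse_position(fixes, disabled=frozenset()):
    """Inverse-covariance-weighted fusion of 2-D position fixes.

    fixes: dict sensor_name -> (z, R), with z a 2-vector and R a 2x2
    measurement covariance. Sensors listed in `disabled` are skipped
    entirely, so their values never bias the estimate, and the
    computation shrinks accordingly. Illustrative sketch only.
    """
    info = np.zeros((2, 2))   # accumulated information matrix (sum of R^-1)
    vec = np.zeros(2)         # accumulated information vector (sum of R^-1 z)
    for name, (z, R) in fixes.items():
        if name in disabled:
            continue          # hard exclusion of the flagged sensor
        R_inv = np.linalg.inv(R)
        info += R_inv
        vec += R_inv @ z
    return np.linalg.solve(info, vec)

fixes = {
    "lidar": (np.array([1.0, 2.0]), np.eye(2) * 0.05),
    "gnss":  (np.array([6.0, 9.0]), np.eye(2) * 0.10),  # biased indoors
}
outdoor = fuse_position(fixes)                      # GNSS still trusted
indoor = fuse_position(fixes, disabled={"gnss"})    # GNSS switched off early
```

With GNSS merely down-weighted, the biased fix would still pull the estimate toward it for a while; with the switch, the indoor estimate follows the LiDAR fix immediately, which mirrors the faster convergence described above.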
We performed intensive testing of different sensor configurations in various environments and designed a complete unit in the form of a functional sample. The system works both in natural conditions, such as caves or dense vegetation, and in buildings and other structures. Thanks to the modularity of the sensors and their ability to complement one another with measurements based on different physical principles, the system can operate in environments with reduced visibility or increased dust.
The use of this navigation system opens up a number of new application scenarios for unmanned aerial vehicles in areas such as automated inspection of industrial infrastructure, inventory tasks, reconnaissance missions, and operations in dense urban areas. These are commercially attractive areas with considerable current demand.