The growing interest in compact cooperative flights of drones motivates the ongoing pursuit of efficient, embeddable onboard sources of mutual relative localization.
Mutual relative localization of flying robots is indispensable in many real-world applications that require the deployment of multiple Unmanned Aerial Vehicles (UAVs) sharing the same workspace at small mutual distances. Using compact multi-UAV systems presents numerous benefits, including cooperative task completion, extension of the reach of a single robot, and distribution of capabilities among independent members. Moreover, several tasks exist that cannot be solved by a single robot, and some of them have already been successfully solved by teams of UAVs.
F4F and the MRS Group at CTU aim to design a system enabling autonomous flights close to obstacles with multiple robots cooperating and interacting with the environment. In order to be reliable even in such circumstances, the units need to be able to independently avoid damage and complete their mission. If the UAVs are flying in a formation, they should be able to preserve it or keep their mutual distances within safe ranges.
In most current systems of compact UAV swarms, the localization of individual UAVs is based on data obtained from motion capture systems for indoor experiments, or on precise RTK-GNSS data outdoors. Such external infrastructure is unavailable in most real multi-UAV applications and often cannot be pre-installed. To account for such situations, as well as to make the system more autonomous, relying on onboard sensors only is desirable.
We proposed a new methodology for outdoor mutual relative localization of UAVs equipped with active ultraviolet markers and a suitable camera with specialized bandpass filters. Ultraviolet light is less common in nature than visible light or infrared radiation (especially at high intensities), so an artificial UV source is easy to detect. Additionally, common camera sensors are sensitive to UV light, making the addition of a filter the only necessary modification and keeping the platform low-cost. This also allows small-sized markers to be sufficient, without burdening the processing resources. Thus, the proposed system aspires to be an enabling technology for the deployment of large swarms of possibly micro-scale aerial vehicles in real-world conditions and without any dependency on external infrastructure.
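Because the bandpass filter suppresses almost all ambient light, the markers appear as near-saturated spots on a dark background, so detection can be as simple as thresholding followed by blob grouping. The following is a minimal sketch of such a detector (a hypothetical simplification, not the actual UVDAR pipeline, and real implementations would use an optimized connected-components routine):

```python
import numpy as np

def detect_markers(image, threshold=200):
    """Find centroids of bright blobs in a bandpass-filtered grayscale frame.

    With a UV bandpass filter, marker LEDs are essentially the only bright
    pixels, so thresholding plus 4-connected flood fill suffices. This is an
    illustrative sketch, not the actual UVDAR detector.
    """
    mask = image >= threshold
    visited = np.zeros_like(mask, dtype=bool)
    centers = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # flood fill to collect all pixels of one blob
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centers.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centers  # list of (x, y) centroids

# synthetic frame: dark background with two bright 2x2 marker spots
img = np.zeros((20, 20), dtype=np.uint8)
img[4:6, 4:6] = 255
img[14:16, 10:12] = 255
print(detect_markers(img))  # two centroids: (4.5, 4.5) and (10.5, 14.5)
```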
Our fully-equipped research platform, built for LAAS/CNRS in Toulouse, was based on a DJI F550 hexacopter frame with DJI E310 motors, a Pixhawk 4 flight controller, an Intel NUC-i7 PC, a high-resolution Mobius ActionCam camera, a fast mvBlueFOX camera from MatrixVision with a DSL215 fisheye lens and a BP365-R6 bandpass filter (which allows it to observe and localize ultraviolet LED-based markers), a Garmin lidar rangefinder, and a PRECIS-BX305 GNSS RTK receiver. 395 nm UV LEDs with a Lambertian radiation pattern were used as markers.
The UVDAR system can be used for many different purposes. Active markers can be leveraged to encode additional information via blinking patterns. Additionally, several marker IDs can be used to determine the relative orientation of UAVs. Our approach has been tested in active UAV deployment both indoors and outdoors. Results from outdoor experiments show excellent detection reliability against different backgrounds, such as sky, trees, or even buildings, while still being able to decode the blinking signal.
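To illustrate the idea of identity encoding via blinking, the sketch below matches an observed sequence of per-frame on/off marker states against a set of known periodic patterns, allowing for an arbitrary sampling phase. The pattern set and matching rule are illustrative assumptions, not the actual UVDAR signal coding:

```python
# Hypothetical marker-ID decoder: each marker blinks with a known periodic
# on/off sequence, and the observed frame-by-frame states are compared with
# all cyclic rotations of every known pattern (the camera may start sampling
# at any phase of the blink cycle).

KNOWN_PATTERNS = {
    0: (1, 0, 1, 0, 1, 0),  # fast blink
    1: (1, 1, 0, 0, 1, 1),
    2: (1, 1, 1, 0, 0, 0),  # slow blink
}

def identify(observed):
    """Return the marker ID whose pattern matches `observed` under some
    cyclic shift, or None if no known pattern matches."""
    n = len(observed)
    for marker_id, pattern in KNOWN_PATTERNS.items():
        if len(pattern) != n:
            continue
        for shift in range(n):
            rotated = pattern[shift:] + pattern[:shift]
            if rotated == tuple(observed):
                return marker_id
    return None

print(identify([0, 0, 1, 1, 1, 0]))  # pattern 2, phase-shifted -> 2
```

In practice the decoding must also tolerate missed detections and frame drops, but the cyclic-matching core stays the same.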
A notable application of UVDAR is leader-follower formation flight. The formation control exploits the relative leader pose obtained by the UVDAR sensor, comprising position and yaw, to steer the follower to a target pose with respect to the body of a moving leader, while also preserving the conditions for continued observation by this vision system. The cooperative combination of UVDAR with a specialized control algorithm was shown to maintain the desired following behavior without direct communication between the two UAVs.
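The control idea above can be sketched with a simple proportional law: given the relative leader pose (position and yaw) as UVDAR provides it, drive the follower toward a point at a fixed distance behind the leader while turning to keep the leader centered in the camera's field of view. The gains, the desired offset, and the control law here are illustrative assumptions, not the published controller:

```python
import math

def follower_command(rel_x, rel_y, rel_yaw,
                     desired_dist=3.0, kp_pos=0.8, kp_yaw=1.0):
    """Compute a (vx, vy, yaw_rate) command in the follower body frame.

    rel_x, rel_y : leader position relative to the follower [m]
    rel_yaw      : leader heading relative to the follower [rad]

    The target point sits `desired_dist` behind the leader along the
    leader's heading; the yaw rate turns the follower toward the leader
    so the UV markers remain observable. Illustrative sketch only.
    """
    # target position: desired_dist behind the leader, along its heading
    tx = rel_x - desired_dist * math.cos(rel_yaw)
    ty = rel_y - desired_dist * math.sin(rel_yaw)
    vx, vy = kp_pos * tx, kp_pos * ty

    # keep the leader centered in the image: regulate its bearing to zero
    bearing = math.atan2(rel_y, rel_x)
    yaw_rate = kp_yaw * bearing
    return vx, vy, yaw_rate

# leader 5 m straight ahead with the same heading: the follower should
# close the 2 m gap to the 3 m offset and needs no turning
print(follower_command(5.0, 0.0, 0.0))  # -> (1.6, 0.0, 0.0)
```

Coupling the yaw command to the measured bearing is what preserves the observation condition mentioned above: the follower never lets the leader drift out of the camera's field of view.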