









The main goal of Demonstrator 1.4 is to provide a flight-ready drone demonstration capable of performing autonomous functionalities during flight. The drone should be able to execute a simple point-to-point flight in optimal conditions with a planned landing, assuming no obstacles on the route and full visibility. Secondly, the demo aims to facilitate the acquisition of synchronized multimodal sensor data from different devices and to provide the backbone for the development of the multisensory avionic architecture delivered in Demo 5.1.2.

The demonstrator consists of two sub-demonstrations: one targeting outdoor scenarios and one targeting indoor environments. In the outdoor demonstrator, the partially automated flight features are showcased through HITL (hardware-in-the-loop) simulations involving the Pixhawk 4 flight controller and a simulation framework implemented by IFAG in the context of SC5. The simulation environment supports the acquisition of multimodal synthetic data in multiple scenarios (forestry, city, open field). Real data are acquired with an instrumented cycling helmet, on which a 3D-printed support hosts the sensor setup. This data collection takes place mostly at the Infineon headquarters in Munich (Germany), which offer a combination of forest, building and open-field areas. The multimodal dataset includes information from radar, ToF (time-of-flight) camera, stereo camera, IMU and GPS with RTK (Real-Time Kinematics) correction. The data are recorded in the same format as the ASL EuRoC MAV dataset. The produced dataset is intended to serve as a benchmark for the sensor fusion algorithms developed by the partners within SC2. These algorithms address diverse tasks in autonomous drone flight, including target detection and SLAM (Simultaneous Localization and Mapping). Different static and moving obstacles are introduced in the environments, including trees, people and bikes.
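As a rough illustration of the ASL EuRoC MAV-style recording layout mentioned above, a recorder might write one `data.csv` per sensor with nanosecond timestamps. The directory name, column headers and sample values below are assumptions based on the publicly documented EuRoC MAV dataset, not the project's actual tooling:

```python
import csv
import os

def write_imu_csv(root, samples):
    """Write IMU samples in a EuRoC-style data.csv.

    samples: iterable of (timestamp_ns, wx, wy, wz, ax, ay, az),
    i.e. angular velocity [rad/s] followed by acceleration [m/s^2].
    """
    sensor_dir = os.path.join(root, "imu0")          # one folder per sensor
    os.makedirs(sensor_dir, exist_ok=True)
    path = os.path.join(sensor_dir, "data.csv")
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        # Header row mirrors the EuRoC MAV column naming convention.
        w.writerow(["#timestamp [ns]",
                    "w_RS_S_x [rad s^-1]", "w_RS_S_y [rad s^-1]", "w_RS_S_z [rad s^-1]",
                    "a_RS_S_x [m s^-2]", "a_RS_S_y [m s^-2]", "a_RS_S_z [m s^-2]"])
        for row in samples:
            w.writerow(row)
    return path

# Example: two synthetic IMU samples 5 ms apart (200 Hz rate).
path = write_imu_csv("demo_dataset", [
    (1403636579758555392, 0.01, -0.02, 0.00, 0.10, 9.81, 0.0),
    (1403636579763555392, 0.01, -0.02, 0.01, 0.10, 9.80, 0.0),
])
```

Camera streams would follow the same pattern, with `data.csv` mapping each timestamp to an image file stored alongside it.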
To better test the robustness of the developed algorithms, effort is put into diversifying the data recording conditions (e.g., type of environment, weather conditions, time of day, speed).

The indoor demonstrator is based on a DJI Flame Wheel F450 flight frame. It features a Pixhawk Cube Orange flight controller and a front-facing mount for the environmental sensing module, which consists of a time-of-flight camera, a stereo camera and a 60 GHz radar sensor. The drone system is completed by an AURIX ShieldBuddy TC375 board and an NVIDIA Jetson Nano, which perform the data acquisition and sensor fusion demonstrated in Demo 5.1.2.
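Benchmarking fusion algorithms on the recorded streams requires associating samples across sensors that run at different rates. A minimal sketch of nearest-timestamp matching between camera frames and IMU samples follows; the function name, rates and skew tolerance are illustrative assumptions, not part of the project code:

```python
import bisect

def match_nearest(frame_ts, imu_ts, max_skew_ns=5_000_000):
    """For each camera frame timestamp, return the index of the closest
    IMU timestamp, or None if the gap exceeds max_skew_ns (default 5 ms).
    Both input lists must be sorted ascending, timestamps in nanoseconds."""
    matches = []
    for t in frame_ts:
        i = bisect.bisect_left(imu_ts, t)
        # The nearest sample is either at the insertion point or just before it.
        candidates = []
        if i < len(imu_ts):
            candidates.append(i)
        if i > 0:
            candidates.append(i - 1)
        best = min(candidates, key=lambda j: abs(imu_ts[j] - t))
        matches.append(best if abs(imu_ts[best] - t) <= max_skew_ns else None)
    return matches

# Camera at 20 Hz (50 ms spacing), IMU at 200 Hz (5 ms spacing),
# with a 2 ms offset between the two clocks.
imu = list(range(0, 1_000_000_000, 5_000_000))
frames = list(range(2_000_000, 1_000_000_000, 50_000_000))
pairs = match_nearest(frames, imu)
```

With an IMU rate well above the camera rate, every frame finds a sample within the tolerance; dropped or delayed samples simply yield `None` and can be skipped downstream.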




ADACORSA has received funding from the ECSEL Joint Undertaking (JU) under grant agreement No 876019.
The JU receives support from the European Union’s Horizon 2020 research and innovation programme and Germany, Netherlands, Austria, France, Sweden, Cyprus, Greece, Lithuania, Portugal, Italy, Finland, Turkey.