SPECIAL SESSION #17
Technological Foundations and Advanced Applications of Sensor Fusion Systems in UAVs
ORGANIZED BY
JesĂșs GarcĂ­a Herrero
Carlos III University of Madrid
David MartĂ­n GĂłmez
Carlos III University of Madrid
SPECIAL SESSION DESCRIPTION
Unmanned Aerial Systems (UAS) have evolved rapidly in recent years thanks to advances in navigation, perception, information fusion, and Artificial Intelligence (AI) technologies, enabling increasingly autonomous operations and evolving applications in both defence and civil domains. For example, one of the most popular uses of drone systems is the aerial inspection of industrial infrastructure (oil refineries, telecommunications and power towers, wind turbines, solar plants, etc.). The technologies supporting these applications are still evolving: key enablers of UAS development include autonomous navigation, machine vision and machine learning, and swarm coordination to achieve coherent behaviour and meet mission objectives. Defence applications, in turn, require systems to detect, track, and monitor UAS and UAS swarms, an evolving area of concern for protection against a variety of possible threats. This session will cover technological advancements and advanced applications of UAS, including critical functions such as navigation, perception, and coordination, from the point of view of sensor fusion systems. The goal of the proposed session is therefore to discuss approaches to the technological foundations of sensor fusion and its advanced applications, covering the design and development of information fusion systems that combine data from multiple sensors. Moreover, the development of sensor fusion systems that incorporate contextual factors and information offers an opportunity to improve the quality of the fused output.
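As a minimal illustration of the kind of sensor data combination the session addresses (an illustrative sketch only, not a method prescribed by the session; the sensor pairing and noise values below are hypothetical), the snippet fuses two independent altitude estimates, e.g., GPS and barometric, by inverse-variance weighting, producing a fused output with lower uncertainty than either sensor alone:

```python
import numpy as np

# Illustrative only: inverse-variance (maximum-likelihood) fusion of two
# independent scalar estimates of the same quantity. Sensor names and noise
# values are hypothetical.
def fuse_estimates(estimates, variances):
    """Return the fused estimate and its (reduced) variance."""
    w = 1.0 / np.asarray(variances, dtype=float)            # weight each sensor by 1/sigma^2
    fused = float(np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w))
    return fused, float(1.0 / np.sum(w))

# Hypothetical UAV example: GPS altitude (noisier) and barometric altitude.
alt, var = fuse_estimates([102.3, 100.8], [4.0, 1.0])       # metres, metres^2
print(f"fused altitude = {alt:.2f} m, variance = {var:.2f} m^2")
```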
TOPICS
Topics include but are not limited to:
- Sensor fusion systems in UAVs based on combining data from multiple sensors (like cameras, LiDAR, radar, GPS, accelerometers) to create a more accurate, complete, and reliable understanding of an environment or situation than any single sensor could provide alone;
- Algorithms that fuse raw data from different sources (e.g., a camera's image, a radar's range measurement); a minimal sketch of such data-level fusion appears after this list;
- Real-time sensor fusion: combining camera, radar, LiDAR, and ultrasonic data for safe navigation, object detection, and traffic management;
- Methodologies at different levels of fusion: (i) Data-Level: fusing raw data directly; (ii) Feature-Level: fusing extracted features (e.g., edges, shapes); and (iii) Decision-Level: fusing the final outputs or classifications;
- Sensor fusion for more precise measurements and object identification, better performance in diverse conditions (e.g., low light, bad weather), and redundancy offering safe operation if one sensor fails;
- Injection of a priori knowledge to improve the performance of fusion systems;
- Augmentation of tracking, classification, recognition, reasoning, situation analysis, etc., algorithms with contextual information;
- Adaptation techniques that allow the system to respond not only to the target's changing state but also to the surrounding environment;
- Application examples include surveillance systems (security/defence), traffic management, autonomous navigation, etc.
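To make the scope of these topics concrete, the sketch below (illustrative only; the matrices, noise values, and sensor roles are hypothetical assumptions, not a framework required for submissions) shows data-level fusion of raw measurements from two heterogeneous sensors: a 1-D Kalman filter in which an accelerometer drives the prediction step and GPS position fixes drive the update step.

```python
import numpy as np

# Minimal 1-D Kalman filter fusing an accelerometer (prediction) with GPS
# position fixes (update). All numbers are hypothetical and chosen only to
# make the sketch runnable.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition for [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])       # control matrix for the acceleration input
H = np.array([[1.0, 0.0]])                # GPS observes position only
Q = np.diag([0.02, 0.05])                 # process noise (accelerometer drift)
R = np.array([[4.0]])                     # GPS measurement noise (m^2)

def step(x, P, accel, gps_pos=None):
    # Predict with the accelerometer reading.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Correct with a GPS fix when one is available.
    if gps_pos is not None:
        y = np.array([[gps_pos]]) - H @ x          # innovation
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros((2, 1)), np.eye(2)                 # initial state and covariance
x, P = step(x, P, accel=0.3, gps_pos=1.2)          # one fused prediction/update step
print(x.ravel())
```

Feature-level and decision-level fusion follow the same pattern but operate on extracted features or on per-sensor decisions instead of raw measurements.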
ABOUT THE ORGANIZERS
JesĂșs GarcĂ­a Herrero's research interests are artificial intelligence, data and information fusion, computer vision, and autonomous vehicles. Within these areas, covering both theoretical and applied aspects, he is co-author of more than 80 articles in indexed journals and 200 conference communications. He is an active reviewer of academic publications and serves on the editorial boards of several journals in his research field, such as Information Fusion (Elsevier) and Perspectives on Information Fusion (ISIF). He is coordinator of the Applied Artificial Intelligence Group at Universidad Carlos III de Madrid, and has been Deputy Director of the Polytechnic School at the Colmenarejo campus (2005-2008), Director of the Residencia de Estudiantes Antonio Machado (2008-2010), Deputy Vice-Chancellor of the Colmenarejo campus (2019-2023), and Director of the Postgraduate School in Engineering and Basic Science since 2023.
He has served on several committees of international organizations: Chair of the IEEE Spanish Chapter on Electronics and Aerospace Technologies (2013-2018), Spanish representative in working groups of the NATO-STO partnership since 2011, and member of the steering committee of the International Society for Information Fusion (2014-2017).
Prof. David MartĂ­n GĂłmez graduated in Industrial Physics (Automation) from the UNED in 2002 and holds a PhD in Computer Science from the CSIC and the UNED (2008), having been a predoctoral fellow at the CSIC from 2002 to 2006. He was also a researcher at the European Laboratory for Particle Physics (CERN, Switzerland, 2006-2008) and a postdoctoral researcher in Robotics at the CSIC (2008-2011). He is currently a Full Professor at the Universidad Carlos III de Madrid (UC3M) and has been a member of the Intelligent Systems Laboratory (LSI) since 2011. His research lines are perception systems, computer vision, sensor fusion, intelligent transportation systems, advanced driver assistance systems, autonomous ground vehicles, unmanned aerial vehicles, positioning and autonomous navigation of ground and aerial vehicles, and self-awareness, reasoning, and decision-making under uncertainty in autonomous vehicles.