HAL Id: tel-02106751
https://tel.archives-ouvertes.fr/tel-02106751
Submitted on 23 Apr 2019
HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Communicating multi-UAV system for cooperative SLAM-based exploration
Nesrine Mahdoui Chedly
To cite this version:
Nesrine Mahdoui Chedly. Communicating multi-UAV system for cooperative SLAM-based exploration. Other [cs.OH]. Université de Technologie de Compiègne, 2018. English. ⟨NNT : 2018COMP2447⟩. ⟨tel-02106751⟩
By Nesrine MAHDOUI CHEDLY
Thesis presented to obtain the degree of Doctor of the Université de Technologie de Compiègne (UTC)
Communicating multi-UAV system for cooperative SLAM-based exploration
Defended on December 7, 2018
Specialty: Technologies de l'Information et des Systèmes. Research unit: Heudiasyc (UMR-7253)
D2447
Communicating Multi-UAV System for cooperative SLAM-based Exploration
Nesrine MAHDOUI CHEDLY
Defended on December 7th, 2018. Thesis Committee:
Reviewers:
Olivier SIMONIN, Professor, INSA Lyon
Ouiddad LABBANI-IGBIDA, Professor, Université de Limoges
Examiners:
Véronique CHERFAOUI, Professor, Université de Technologie de Compiègne
Isabelle FANTONI, CNRS Research Director, LS2N, Centrale Nantes
Supervisors:
Vincent FRÉMONT, Professor, Centrale Nantes
Enrico NATALIZIO, Professor, Université de Lorraine
Université de Technologie de Compiègne, Heudiasyc Laboratory, UMR CNRS 7253
Contents
Contents i
List of figures vii
List of tables xiii
Acknowledgment xv
List of symbols xvii
Acronyms xix
Résumé xxi
Abstract xxiii
Introduction 1
1 Background . . . . 1
2 Motivation . . . . 2
3 Problem statement/Approach . . . . 3
4 Contributions and publications . . . . 4
5 Thesis pipeline . . . . 5
1 Multi-robot System Model 7
1 Introduction . . . . 7
2 Related works . . . . 7
2.1 System of Systems . . . . 7
2.2 Multi-robot system classification . . . . 8
2.2.1 Degrees of cooperation . . . . 9
2.2.2 Degrees of coordination . . . . 9
2.2.3 Robot types . . . . 9
2.2.4 Inter-robot communication . . . . 9
2.2.5 System architecture . . . . 9
3 Applications . . . 11
3.1 Search and rescue applications . . . 11
3.2 Reconnaissance and surveillance . . . 12
3.3 Infrastructure inspection . . . 13
3.4 Exploration . . . 13
3.5 Discussions . . . 15
4 Advantages of multiple UAVs deployment . . . 15
5 Multi-robot system overview . . . 16
5.1 Architecture block diagram . . . 16
5.2 System coordinate frames . . . 18
5.3 Roles and team hierarchy . . . 19
6 Conclusion . . . 21
2 Simultaneous Localization And Mapping 23
1 Introduction . . . 23
2 Pose estimation . . . 24
2.1 Visual Odometry . . . 24
2.1.1 Overview . . . 24
2.1.2 Related work . . . 26
2.2 Simultaneous Localization And Mapping (SLAM) . . . 28
2.2.1 Overview . . . 28
2.2.2 Related work . . . 30
3 Metric map representation . . . 32
3.1 Point cloud representation . . . 32
3.2 Occupancy grid representation . . . 33
4 Monocular SLAM . . . 36
4.1 Monocular sensor . . . 36
4.2 Visual SLAM . . . 37
4.2.1 Problem formulation . . . 38
4.2.2 Incremental Smoothing and Mapping 2 (iSAM2) . . . 39
4.3 Results and discussions . . . 40
5 RGB-D SLAM . . . 41
5.1 RGB-D sensor . . . 42
5.2 Proposed approach . . . 43
5.2.1 SLAM module . . . 44
5.2.2 Grid-based mapping module . . . 44
5.3 Results and discussions . . . 47
5.3.1 Implementation details . . . 47
5.3.2 Simulation results . . . 48
5.3.2.1 Parameters setting . . . 48
5.3.2.2 SLAM performances using one UAV . . . 48
5.3.2.3 SLAM performances using two UAVs . . . 48
6 Conclusion . . . 51
3 Coordinated exploration 53
1 Introduction . . . 53
2 Related work . . . 53
2.1 Robot-to-target assignment . . . 54
2.2 Utility function . . . 55
3 Proposed exploration strategy . . . 55
3.1 Overview . . . 55
3.2 Frontier points selection . . . 57
3.3 Information gain computation . . . 58
3.4 Frontier points processing . . . 59
3.4.1 First approach: Convex shape map . . . 59
3.4.2 Second approach: Concave shape map . . . 59
3.5 Utility function . . . 65
3.6 Goal assignment process . . . 66
3.6.1 Stop condition . . . 67
3.6.2 Loop rate . . . 67
3.6.3 Scheduling the information gain . . . 68
3.7 Path planning and control . . . 68
4 Results and discussions . . . 69
4.1 Parameters tuning . . . 70
4.1.1 Trade-off parameter λ . . . 70
4.1.2 Loop rate r . . . 70
4.1.3 Common parameters . . . 71
4.2 Exploration strategy performances . . . 72
4.2.1 Maps evolution during the mission . . . 72
4.2.2 Frontier points evolution during the mission . . . 72
4.2.3 UAVs’ trajectories during the mission . . . 72
4.2.4 Goal assignment evaluation: Distribution of the robots in the environment . . . 77
4.2.5 Explored space rate evaluation . . . 79
4.2.6 Overlap rate evaluation . . . 79
4.2.7 Traveled distance evaluation . . . 79
4.2.8 Exploration time evaluation . . . 82
4.3 Exploration mission using relative localization algorithm . . . 83
5 Conclusion . . . 84
4 Inter-robot communication 85
1 Introduction . . . 85
2 Network classification . . . 85
2.1 Infrastructure versus infrastructureless mode . . . 85
2.2 Ad Hoc network classification . . . 86
2.3 Network topology classification . . . 88
3 Network typology . . . 88
3.1 Related work . . . 89
3.2 Network standards and protocols . . . 95
3.3 Results and discussion . . . 96
4 Network topology and strategy for MRS robustness . . . 97
4.1 Related work . . . 97
4.2 Inter-robot communication approach . . . 98
4.2.1 Multi-UAV interaction and data exchange . . . 98
4.2.2 Exploration strategy to face communication loss . . . 99
4.2.3 Data exchange strategy discussion . . . 99
4.3 Results and discussion . . . 100
4.3.1 Network setting: From one to multiple machines . . . 100
4.3.2 Exchanged data size evaluation . . . 103
4.3.3 Exchanged data average time evaluation . . . 103
4.3.4 Network interruption evaluation . . . 105
4.3.5 Global map evaluation . . . 106
5 Conclusion . . . 106
Conclusions and future work 109
1 Summary . . . 109
2 Future research . . . 110
A Development tools 113
1 Robot Operating System . . . 113
1.1 Overview . . . 113
1.2 Framework . . . 113
1.3 ROS development tools . . . 115
1.3.1 Visualization tool: Rviz . . . 115
1.3.2 Graph Qt-based framework: Rqt_graph . . . 116
1.3.3 Transform library: Tf . . . 118
1.3.4 Bag tools . . . 120
1.4 Important ROS packages . . . 120
1.4.1 Move base package . . . 120
1.4.2 Fkie package . . . 121
1.4.3 Ardrone autonomy package . . . 122
2 Gazebo simulator . . . 123
Bibliography 125
List of figures
1 Drone prototype 1 . . . . 1
1.1 Search and rescue 2 . . . 12
1.2 Reconnaissance and surveillance 3 . . . 12
1.3 Infrastructure inspection 4 . . . 13
1.4 Exploration 5 . . . 14
1.5 NASA’s rover called Sojourner for Pathfinder mission 6 . . . 14
1.6 Architecture block diagram. . . 17
1.7 Apriltags: Visual Fiducial System mounted on robots. . . 17
1.8 Multi-robot coordinate system. . . 19
1.9 Fleet of UAVs containing four clusters. . . 20
1.10 Role selection process. . . 20
2.1 The VO working principle. Images from [Schöps et al., 2014]. . . 24
2.2 Illustration of the visual odometry problem. Case of a stereo camera (At time k, two boxes representing two image frames are available). Image from [Scaramuzza and Fraundorfer, 2011]. . . 25
2.3 Feature-based versus direct methods. Image from [Schöps et al., 2014]. . . 26
2.4 SLAM problem formulation. . . 29
2.5 Examples of a map represented in different structures. (a) Point cloud map. (b) Elevation map. (c) Multi-level surface map. (d) Occupancy grid map based on an octree. Image from [Hornung et al., 2013]. . . 32
2.6 Examples of an occupancy grid map with resolutions of 0.08m, 0.64m , and 1.28m, respectively. Image from [Hornung et al., 2013]. . . 33
2.7 2D occupancy grid example. . . 34
2.8 Map matching of M 1 and M 2 . . . 34
2.10 … camera. α_t and ω_t are, respectively, the acceleration and angular measurements from the IMU sensor. p_c and p_IMU are the estimated states from the camera and the IMU sensors, respectively. F is the generated 2D factor graph. . . 37
2.11 Factor graph for monocular SLAM. F_0(p_0) and m_i are the variable nodes. F(0) is the prior density factor. F_i(p_{i+1}, p_i, b_i) is the odometry factor between the poses p_i and p_{i+1}, and F_ij(m_j, p_i, z_ij) is the measurement likelihood model between the pose p_i and its landmark m_j. . . 38
2.12 iSAM2 graphical models. . . 39
2.13 The iSAM2 steps. . . 40
2.14 Trajectory results (2D projection) using IMU, libviso2, iSAM2 and GPS. . . 41
2.15 Examples of RGB-D sensors. . . 42
2.16 Localization and Mapping layers outline. . . 44
2.17 ORB-SLAM2 threads and modules. Image from [Mur-Artal and Tardós, 2017a]. . . 45
2.18 An example of environment representation using the Octomap mapping framework (resolution = 0.05 m). . . 45
2.19 Map structure evolution during the SLAM process: from point cloud in the environment to 3D voxels O to 2D cells L. . . 46
2.20 Example of one UAV's map structure evolution. . . 47
2.21 Implementation details of multiple UAVs on a single Linux machine. . . 47
2.22 One UAV exploration using the SLAM algorithm. The Kinect's maximum range is 4 m, the UAV's linear velocity v_i ∈ [0.1, 0.2] m.s^{-1} and angular velocity ω_i = 0.1 rad.s^{-1}. RMSE(x) = 0.2968 and RMSE(y) = 0.2944. . . 49
2.23 Two UAVs' cooperative exploration, each performing its own SLAM algorithm. The Kinect's maximum range is 4 m, the UAV's linear velocity v_i ∈ [0.1, 0.2] m.s^{-1} and angular velocity ω_i = 0.1 rad.s^{-1}. For UAV_1, RMSE(x) = 0.2419 and RMSE(y) = 0.1415; for UAV_2, RMSE(x) = 0.3649 and RMSE(y) = 0.1614. . . 50
3.1 Proposed exploration process pipeline. . . 56
3.2 Map structure evolution during the exploration process: from 2D cells L to 2D frontier cells F to candidate frontier cells/points G (candidate targets) to 2D target cells/points T. . . 57
3.3 Frontier cells/points selection of a 2D occupancy grid map. . . 57
3.4 The information gain computation of a frontier point x (in red). The hatched cells represent the 48 surrounding cells of the frontier point x. The information gain of x is I(x) = 25. . . 58
3.5 Frontier points processing example: (a) and (b) represent the frontier points of UAV_1 and UAV_2, respectively. (c) represents the candidate targets. The process consists in removing frontier points that belong to overlapped areas, and those that are adjacent to obstacles. . . 60
3.6 Frontier points processing while assuming two convex map shapes in the first line and three in the second line. (a) and (d) represent two and three convex map shapes, respectively; (b) and (e) represent the frontier points in overlap (intersection); and (c) and (f) represent the obtained frontier points after processing (final shape). . . 62
3.7 Frontier points processing while assuming two concave map shapes in the first line and three in the second line. (a) and (d) represent two and three concave map shapes, respectively; (b) and (e) represent the frontier points in overlap (intersection); and (c) and (f) represent the obtained frontier points after processing (final shape). . . 64
3.8 Utility function behavior: I(t_j) = 25, n_c = 3, d_tot = Σ_{k=1, k≠i}^{n} c(d_min(P_k, t_j)). The average distance of the other UAVs d_tot has a minimum value different from zero since n_c is different from zero too. . . 66
3.9 Example of the shape function to schedule the information gain (I(t_k)_{t-1} = 29, (t_g(x), t_g(y)) = (1.125, −0.275), σ_x = σ_y = 3). . . 68
3.10 From 2D (left) to 3D (right) navigation stack. The navigation z is a ROS process developed to provide a control command on the z axis. . . 69
3.11 The impact of varying the trade-off parameter λ over exploration time. . . 70
3.12 Exploration time while varying the loop rate r. . . 71
3.13 Ratio of explored space over time for one UAV. . . 73
3.14 Coordinated exploration using two robots. Columns 1, 2 and 3 show the evolution of the local map of UAV_1, the local map of UAV_2 and the global map over time, respectively. . . 74
3.15 The evolution of candidate and final frontier point numbers during cooperative exploration. . . 75
3.16 Projected 2D map in a coordinated exploration with a team of one, two and three UAVs. Green, blue and red markers and arrows define, respectively, the initial position and the trajectory of UAV_1, UAV_2 and UAV_3. . . 76
3.17 … with their respective information gain. (b), (c), and (d) represent, respectively, the targets assignment when: no further process is performed for the remaining candidate targets; UAV_1's target is removed from the remaining candidate targets; and the information gain of the remaining candidate targets is scheduled. . . 78
3.19 Explored and overlapped area rate using two cooperative UAVs. . . 79
3.18 Explored space rate with one, two and three UAVs. . . 80
3.20 Traveled distance by each UAV with one, two and three cooperative UAVs. . . 81
3.21 Traveled distance evaluation. . . 82
3.22 Average exploration time. . . 82
3.23 Two UAVs' navigation using ORB-SLAM2. The Rviz image (left) shows, for each UAV, the constructed 2D occupancy grid map, its estimated trajectory and the corresponding ground truth. The point cloud (middle) represents the sparse reconstruction of the environment made by each UAV. The green markers (right) represent the features computed by each UAV to perform localization. . . 83
3.24 Explored area rate evolution during an exploration mission with one and two UAVs while each UAV performs ORB-SLAM2. . . 84
4.1 Infrastructure (left) and infrastructureless (right) mode. . . 86
4.2 Centralized (left), decentralized (middle) and distributed (right) networks. . . 88
4.3 Mesh network illustration between three laptops and one drone. . . 96
4.4 Broadcast testbed throughput result in Ad Hoc and BATMAN mesh network protocol. . . 97
4.5 Data flow between two robots. FP and GA stand, respectively, for frontier processing and goal assignment. . . 99
4.6 Role evolution in limited communication range. . . 100
4.7 Ad Hoc network illustration during an exploration mission. . . 101
4.8 Two ROS configurations in the multiple-machine case. Image from [Andre et al., 2014]. . . 102
4.9 Data size when UAVs exchange a whole copy of their local grid map versus its frontier points. . . 103
4.10 Network topology evolution during a loop with three cooperative UAVs. . . 104
4.11 Two cooperative robots' exploration along with network connectivity. . . 105
4.12 Global reconstructed 2D occupancy grid map during an exploration mission with two cooperative UAVs. . . 106
A.1 Frontier points vector message definition. . . 114
A.2 Illustration of ROS middleware components interaction. . . 115
A.3 Visualization of two UAVs navigation using Rviz. . . 115
A.4 ROS graph diagram showing topics involved in two cooperative UAVs ex- ploration mission using ORB-SLAM2 approach. Topics related to: UAV 1 are within the Green box (top), UAV 2 are within the Blue box (bottom), the ex- changed data are within the Red box (middle right), and Gazebo environment are within the Yellow box (middle left). . . 117
A.5 ROS TF tree showing the relative frames of two UAVs w.r.t. the world frame. The figure contains frames related to the global localization in the red box (left), frames related to the relative localization computed by ORB-SLAM2 in th two blue boxes (the second box starting from the left and the box on the right), and frames related to the local maps in the green box (the third box starting from the left). . . 119
A.6 move base package framework 8 . . . 121
A.7 Multimaster fkie overview 9 . . . 122
A.8 AR-Drone model. . . 122
A.9 Gazebo environment. . . 123
List of tables
1.1 Multi-robot system architecture comparison. . . 10
2.1 RGB-D sensors comparison. . . 43
2.2 SLAM behavior while modifying linear and angular velocity. T.L: Tracking Loss; R.L: Re-Localization. . . 48
3.1 Common parameters. . . 71
4.1 Infrastructure versus Infrastructureless mode. . . 86
4.2 Comparison between MANET, VANET, FANET and our model's network expectation. . . 87
4.3 Examples of some standards used in MRS. . . 91
4.4 Communication module timings. . . 104
Acknowledgment
I am so grateful to my supervisors Prof. Vincent Frémont and Prof. Enrico Natalizio for their patience, their continuous encouragement and their constructive advice. They gave me precious guidance so that this work could come into existence.
I am also grateful to Prof. Ouiddad LABBANI-IGBIDA and Prof. Olivier SIMONIN for agreeing to review this thesis. I would also like to thank Prof. Véronique CHERFAOUI and Prof. Isabelle FANTONI for agreeing to be part of my thesis committee.
I would like to thank all Heudiasyc members, and the SyRI team, for their support and kindness. I am also thankful to my colleagues, especially Asma Ben Said and Arbia Sfar, who have become true friends.
Moreover, I thank the Labex for their financial support and for providing everything necessary for the smooth running of this thesis.
My gratitude goes to my family, without whom this thesis would not have been completed in good conditions. They provided me with continuous support, motivation and positive vibes. My special and heartfelt thanks go to my husband, who encourages me to always go further.
This thesis is heartily dedicated to my father, who took the lead to heaven before the completion of this work.
List of symbols
UAV_i  UAV of index i.
p_i, v_i, ω_i  Pose, linear velocity, and angular velocity of UAV_i.
0_W  Global reference frame.
^W F_i  UAV_i's local reference frame w.r.t. 0_W.
W h
R t i F
i