INSPEX: Integrated portable multi-sensor obstacle detection device. Application to navigation for visually impaired people


(1)

HAL Id: cea-01863959

https://hal-cea.archives-ouvertes.fr/cea-01863959

Submitted on 29 Aug 2018


INSPEX: Integrated portable multi-sensor obstacle detection device. Application to navigation for visually impaired people

O. Debicki, N. Mareau, L. Ouvry, J. Foucault, Suzanne Lesecq, Gabriela Dudnik, Marc Correvon

To cite this version:

O. Debicki, N. Mareau, L. Ouvry, J. Foucault, Suzanne Lesecq, et al. INSPEX: Integrated portable multi-sensor obstacle detection device. Application to navigation for visually impaired people. Design Automation Conference (DAC 2018), Jun 2018, San Francisco, United States. cea-01863959

(2)

This work has been partly funded by the European Union's Horizon 2020 Research and Innovation programme under grant agreement no. 730953. It was also supported in part by the Swiss Secretariat for Education, Research and Innovation (SERI) under grant 16.0136 (project 730953).

INSPEX: Integrated portable multi-sensor obstacle detection device

Application to navigation for visually impaired people

O. Debicki², N. Mareau¹, L. Ouvry¹, J. Foucault¹, S. Lesecq¹, G. Dudnik³, M. Correvon³
¹Univ. Grenoble Alpes, CEA, LETI, F-38000 Grenoble
²Univ. Grenoble Alpes, CEA, LIST, F-38000 Grenoble
³CSEM SA, CH-2002 Neuchâtel

(3)

H2020 INSPEX project: Augmenting capabilities of the white cane

• Our motivation: help visually impaired persons on their daily commute.

◦ According to the World Health Organization's definition of visual impairment, 1.6 million people in the EU suffer from blindness, and only 5% of them are fully autonomous in their daily mobility. [WHO12]

◦ 40% of the visually impaired suffer a head-level accident at least once a month, and 30% suffer a fall accident at least once a month. [MK11]

(Image: OptekSystems sonar glasses, an example of a wearable ETA)

• Existing Electronic Travel Aids (ETAs):

◦ Some wearable solutions (sunglasses, gloves…) are sometimes perceived as an extra prosthesis: cumbersome and stigmatizing.

◦ They rely on a single sensor technology, ultrasound, which is sensitive to multi-echo and thus prone to wrong detections.

◦ Perception is limited to range sensing of the nearest target; current experiments on richer environmental perception require non-autonomous, non-embedded devices.

(4)

Context and objectives

• Context:

◦ Obstacle detection is available in high-end vehicles.

◦ Multi-sensor (LiDAR, radar, ultrasonic, vision) solutions compensate the drawbacks of each sensor technology.

◦ Data fusion is highly demanding in terms of memory and computation to construct a representation of the vehicle's surroundings.

• Objectives: integrate obstacle detection capabilities in a portable device to offer a virtual safety cocoon to its "user". This requires:

◦ Integration of several sensor technologies to ensure detection of various obstacles (material, size, shape, colour) in 3D, at different heights and ranges, in various environmental conditions;

◦ Low-power, light-weight, small-size range sensors;

◦ A highly efficient fusion technique that can be embedded on a small computing platform (typically a microcontroller);

◦ A modular system to tackle different application domains that might benefit from such functionality to improve the "user's" navigation experience and safety (e.g. drones).

(Figure: occupancy grid)

(5)

A portable multi-sensor obstacle detection device

(Figure: white cane device with radar, LiDAR, and ultrasound fields of view)

• Integration of a single portable device in charge of environment perception:

◦ various weather conditions,

◦ weight and size constraints,

◦ connection to the smart environment,

◦ detection of obstacles at head and waist level.

• Different range sensing technologies (LiDAR, radar, ultrasound), each one compensating the drawbacks of another.


(8)

System requirements

• Portable device (200 g, 100 cm³, 500 mW)

• Environment sensors for context awareness:

◦ Ultrasound sensors (range 2 m),

◦ Radar (range 10 m, FoV 60°),

◦ Short/long-range LiDARs (3–5 m / 10 m),

◦ Inertial measurement unit

• Overlapping sensor fields of view

• Autonomous operation over a 10-hour lifetime

• Detect obstacles up to a distance of 4 m, moving at a relative speed of ~1.4 m/s in the worst case
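A quick consistency check (our arithmetic, combining this worst case with the 20 Hz grid refresh and 10 cm cell size quoted on the fusion-hub slide):

\[
\frac{v_{\text{worst}}}{f_{\text{grid}}} = \frac{1.4\ \text{m/s}}{20\ \text{Hz}} = 0.07\ \text{m per update} < 0.10\ \text{m (one grid cell)},
\]

so even a worst-case obstacle advances less than one cell between two successive grid updates.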


(9)

Conceptual architecture

(Figure: conceptual view of the white cane — the IMU and range sensors feed a measurement front-end (asynchronous acquisition, data pre-treatment), which feeds a fusion hub (SigmaFusion®: data fusion, model of the environment) producing an occupancy grid.)

• Adapt or develop new sensing technologies:

◦ Development of new sensors (LR LiDAR, UWB radar, ultrasound piezo);

◦ Mechanical and electronic integration of existing sensors (SR LiDARs);

◦ Development of innovative post-treatment for obstacle detection and sensor positioning.

• Transfer data fusion techniques from the automotive domain into the white cane:

◦ 3D data fusion based on a 3D occupancy map of the device's surroundings, thanks to the SigmaFusion® algorithm (Bayesian fusion).
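To make the front-end-to-hub interface concrete, here is a hypothetical C sketch of the kind of record the measurement front-end could forward to the fusion hub; the field names and widths are our assumptions, only the sensor set and the asynchronous, IMU-assisted design come from the slides:

```c
#include <stdint.h>

/* Hypothetical range record forwarded from the measurement front-end to the
 * fusion hub; field names and widths are illustrative assumptions. */
typedef enum { SENSOR_LIDAR, SENSOR_RADAR, SENSOR_ULTRASOUND } sensor_kind_t;

typedef struct {
    uint32_t      timestamp_ms;  /* acquisition time, needed for asynchronous fusion */
    sensor_kind_t kind;          /* which range technology produced the reading      */
    uint8_t       sensor_id;     /* instance index (e.g. one of the two SR LiDARs)   */
    uint16_t      range_mm;      /* distance to the nearest return                   */
    int16_t       pitch_cdeg;    /* cane attitude from the IMU (centi-degrees),      */
    int16_t       roll_cdeg;     /*   used to place the beam in the 3D grid          */
} range_sample_t;
```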

(10)

Detailed architecture of the proof-of-concept demonstrator (HW)

(Figure: HW block diagram. Sensors: 2× Terabee Uno (IR ToF), 1× Garmin LiDAR Lite V3, 1× Bosch BNO055 IMU, 1× CEA UWB RF radar; sensor links into the measurement front-end: RS232 @300 sps, RS232 @100 sps, I2C @100 sps, and SPI; the measurement front-end feeds the fusion hub, and results leave over BLE.)

• Acquisition subsystem: based on a NUCLEO-F767ZI; Cortex-M7 @216 MHz, 1 MB of flash memory, 500 KB of SRAM; expansion connectors: ST Zio (including Arduino™) and ST morpho.

• Fusion subsystem: based on a Discovery-32F746G; Cortex-M7 @216 MHz, 8 MB of SDRAM, 1 MB of flash memory, 340 KB of SRAM.

• Communication subsystem: based on a Microchip RN4020; uses a proprietary serial-over-BLE protocol.

• A basic architecture with a scalable ecosystem based on two high-end MCUs (Cortex-M7).

• The split into two parts is appropriate for separation of concerns (i.e. sensor providers do not interfere with the SigmaFusion® algorithm/IP).

(11)

Detailed architecture of the proof-of-concept demonstrator (software)

(Figure: SW block diagram. The acquisition subsystem (measurement front-end) and the fusion subsystem (fusion hub) both run the same stack — FreeRTOS, LWIP (TCP/IP), MODBUS — and exchange depth maps; the sensors (2× Terabee Uno IR ToF, Garmin LiDAR Lite V3, Bosch BNO055 IMU, CEA UWB RF radar) feed the front-end, and results leave through the communication subsystem.)

• The software architecture has been designed in compliance with the system requirements, i.e. ensuring modularity and software reuse.


• The same runtime runs on both subsystems, to ease future evolution of the hardware platform toward integrating the whole software on a single computing platform (provided it offers enough memory and computational capability).
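A minimal sketch of that shared-runtime pattern, assuming FreeRTOS (xTaskCreate and vTaskStartScheduler are standard FreeRTOS API; the task bodies, rates, and the ACQUISITION_SUBSYSTEM compile-time switch are illustrative assumptions, not the project's actual code):

```c
/* Both boards link the same stack (FreeRTOS + LWIP + MODBUS);
 * only the task set differs, selected at compile time. */
#include "FreeRTOS.h"
#include "task.h"

static void vAcquisitionTask(void *pvParameters)   /* NUCLEO-F767ZI side */
{
    (void)pvParameters;
    for (;;) {
        /* poll sensors, pre-treat, publish depth maps over MODBUS ... */
        vTaskDelay(pdMS_TO_TICKS(10));             /* ~100 sps pacing */
    }
}

static void vFusionTask(void *pvParameters)        /* Discovery-32F746G side */
{
    (void)pvParameters;
    for (;;) {
        /* pull depth maps, update the 3D occupancy grid ... */
        vTaskDelay(pdMS_TO_TICKS(50));             /* 20 Hz grid refresh */
    }
}

int main(void)
{
#ifdef ACQUISITION_SUBSYSTEM
    xTaskCreate(vAcquisitionTask, "acq", 512, NULL, 2, NULL);
#else
    xTaskCreate(vFusionTask, "fusion", 1024, NULL, 2, NULL);
#endif
    vTaskStartScheduler();                         /* never returns */
    for (;;) {}
}
```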

(13)

Focus on the measurement front-end

• Traditionally, sensors and post-treatment are paired: there is no access to the raw data.

• In the white cane architecture, the post-treatment is done in the measurement front-end MCU:

◦ It enables access to the sensors' raw data;

◦ A post-treatment module can also use information from other sensors.

• This enables the development of more efficient algorithms based on the whole sensing ecosystem, as in the sketch after the figure.

(Figure: radar, IMU, ultrasound, and LiDAR raw data feed position estimation, object detection, and distance filtering modules in the measurement front-end before reaching the fusion hub; combining the raw streams yields more information than the sum of the isolated channels, N > Σn.)
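As an illustration of what cross-sensor post-treatment buys (a hypothetical sketch; this ground-echo filter and its thresholds are not from the project), a distance filter on the ultrasound channel can use the cane pitch from the IMU-based position estimation to reject returns that are merely the ground:

```c
#include <math.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical cross-sensor post-treatment: reject ultrasound returns that
 * are geometrically consistent with the ground plane, using the cane pitch
 * estimated from the IMU.  Thresholds are illustrative. */
bool ultrasound_is_obstacle(uint16_t range_mm, float pitch_rad,
                            float sensor_height_m)
{
    float sin_p = sinf(pitch_rad);        /* pitch below the horizon */
    if (sin_p <= 0.0f)
        return true;                      /* aiming level or upward: keep it */

    /* expected slant range to the ground for the current pitch */
    float ground_mm = 1000.0f * sensor_height_m / sin_p;

    /* keep the return only if it is clearly shorter than a ground echo */
    return (float)range_mm < ground_mm - 150.0f;
}
```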

(14)

Focus on the fusion hub

• The model of the environment is based on a 3D occupancy grid.

• SigmaFusion™ is based on Bayesian sensor fusion:

◦ SigmaFusion™ 2D has been extended to 3D;

◦ SigmaFusion™ has been extended to take odometry and filtering over time into account.

• A safety cocoon of 8000 cells is scanned at 20 Hz. The information is compressed and sent to the user's smartphone for rendering.
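The Bayesian update behind such a grid is conventionally done in log-odds form, where fusing a measurement reduces to an addition; the SigmaFusion® reference below ([ICRA'16], Rakotovao et al.) carries this out in integer arithmetic. A minimal sketch of the textbook integer log-odds update (not SigmaFusion® itself, whose formulation is proprietary; the increment values are illustrative):

```c
#include <stdint.h>

/* Standard Bayesian occupancy update in integer log-odds.  Each cell holds
 * a log-odds ratio scaled to int16_t; p = 0.5 corresponds to 0. */
#define LO_HIT   ( 30)    /* increment for a cell seen occupied          */
#define LO_MISS  (-10)    /* decrement for a cell traversed by the beam  */
#define LO_MIN   (-3000)
#define LO_MAX   ( 3000)

static int16_t grid[8000];          /* the 8k-cell safety cocoon */

static inline void cell_update(uint16_t idx, int16_t lo_meas)
{
    /* Bayes' rule becomes a sum in log-odds space, then clamping. */
    int32_t lo = (int32_t)grid[idx] + lo_meas;
    if (lo < LO_MIN) lo = LO_MIN;
    if (lo > LO_MAX) lo = LO_MAX;
    grid[idx] = (int16_t)lo;
}

/* A range return at cell `hit` updates every cell along the beam: cells
 * before the hit are evidence of free space, the hit cell of occupancy. */
void beam_update(const uint16_t *beam_cells, uint16_t n_cells, uint16_t hit)
{
    for (uint16_t i = 0; i < n_cells; i++)
        cell_update(beam_cells[i], beam_cells[i] == hit ? LO_HIT : LO_MISS);
}
```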

                  BMW              Mercedes         CEA (2016)      CEA (2017)
                  F. Homm [IV'10]  D. Nuss [IV'15]  [DAC'16]        [ITS'17]
Performance       1                ×0.39            ×1              ×1 (<50% CPU)
Power             204 W            80 W             0.5 W           1.5 W
HW                Nvidia GeForce   Desktop          µC Cortex M7    µC Aurix TriCore
                  268GTX                            @200 MHz        @300 MHz
ISO 26262 ASIL-D  No               No               No              Possible

(Figure: 3D occupancy grid of the safety cocoon — 4 m range, 10 cm floor cells, 8k cells; colour scale: occupancy probability from 0 to 1.)

(15)

An incremental approach up to the final integration

• The early prototype is used in sensor experiments and firmware development. It is based on standard development kits and off-the-shelf sensors. It has enabled embedded software development during the specification of the mechanical and electrical constraints.

• The intermediate prototype allows connectivity with the new sensors developed in the project. It embeds the two subsystems, but one microcontroller can be bypassed to go one step further in the software integration.

• The final prototype is a smart, miniaturized device derived from the users' needs. The specification task therefore started at the beginning of the project, and preliminary investigations have been used as guidelines for the module developments.

(Figure: GEP_P1 top-layer board view — BLE, SDRAM, acquisition µC, application µC, Ethernet, power, AHRS, ambient light, temperature, and relative-humidity sensors.)

(16)

Conclusions

• The objective of the INSPEX project is to transpose automotive technologies for environment perception to portable devices.

• New system requirements imply the development of innovative sensor technologies and algorithms, while supporting their integration.

• A basic architecture based on two high-end MCUs allows separation of concerns and helps mitigate the risks all along the project.

(Figure: V-model development cycle — requirements, system architecture, subsystem architecture, subsystem integration, system integration, system validation.)

• Integrating and sharing sensor post-treatment on the same CPU increases the entropy of the data manipulated.

• 2D map fusion has been extended to 3D without additional computing resources.

• A roadmap of demonstrators allows the results of early prototype validation to be confronted with the requirements of the final demonstration.

(17)

References

• [WHO12] WHO, "Visual impairment and blindness," 2012. http://www.who.int/mediacentre/factsheets/fs282/en/

• [MK11] R. Manduchi and S. Kurniawan, "Mobility-Related Accidents Experienced by People with Visual Impairment," Research and Practice in Visual Impairment and Blindness, 2011.

• [ICRA'16] (SigmaFusion®) T. Rakotovao et al., "Multi-Sensor Fusion of Occupancy Grids based on Integer Arithmetic," IEEE Int. Conf. on Robotics and Automation (ICRA), 2016.

• [IV'10] F. Homm, N. Kaempchen, J. Ota, and D. Burschka, "Efficient occupancy grid computation on the GPU with lidar and radar for road boundary detection," IEEE IV, pp. 1006–1013, June 2010.

• [IV'15] D. Nuss, T. Yuan, G. Krehl, M. Stuebler, S. Reuter, and K. Dietmayer, "Fusion of laser and radar sensor data with a sequential Monte Carlo Bayesian occupancy filter," IEEE IV, 2015.
