
HAL Id: hal-02161588

https://hal.archives-ouvertes.fr/hal-02161588

Submitted on 4 Jul 2019

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.


A New Haptic Interaction Paradigm

Aakar Gupta, Thomas Pietrzak

To cite this version:

Aakar Gupta, Thomas Pietrzak. A New Haptic Interaction Paradigm. Adjunct Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2016), Workshop on Mid-Air Haptics and Displays: Systems for Un-instrumented Mid-Air Interactions, May 2016, San Jose, United States.

⟨hal-02161588⟩


A New Haptic Interaction Paradigm

Aakar Gupta University of Toronto Toronto, ON, Canada aakar@dgp.toronto.edu

Thomas Pietrzak Univ. Lille Lille, France

thomas.pietrzak@univ-lille1.fr


Abstract

Existing tactile experiences are limited to providing feedback or conveying pattern information to the user. In this position paper, we discuss how mid-air haptics are uniquely suited to explore wholly independent haptic interaction paradigms which allow the user to interact with a computer completely using her tactile and kinesthetic sense without the presence of a visual interface.

Author Keywords

Haptics; Tactile; Interaction

ACM Classification Keywords

H.5.2 [Information interfaces and presentation]: User Interfaces - Haptic I/O

Introduction

Current haptic technologies have enabled advanced user experiences. However, these experiences almost always function as haptic patterns to convey information to the user or as assistive feedback to a visual interface. Engaging with visual displays is not always feasible or desired. At this stage of haptics, an exploration is needed into the possibilities of a wholly independent haptic interaction paradigm which allows the user to interact with a computer completely using her tactile and kinesthetic sense without the presence of a visual interface. While one could argue that


such independent interactions exist in bits and pieces, an interaction paradigm akin to direct manipulation in visual displays is missing in haptic displays. Owing to its unique affordances, mid-air ultrasound haptics is well suited to explore such interaction paradigms.

Exploration into a Haptic Interaction Paradigm

The concept of direct manipulation underlies almost all of the visual interfaces we use today. These could be indirect pointing interfaces that utilize a pointer to manipulate objects, or direct pointing interfaces that allow the user to manipulate these objects directly via touch or other means.

A notion analogous to direct manipulation is absent in haptic displays. One of our upcoming papers at CHI 2016 [2] raises this issue and introduces the concept of direct manipulation for tactile displays that are instrumented on the skin. We describe how concepts analogous to a visual display can be adapted into a tactile context, such as a tactile screen, tactile pixel, tactile pointer, and tactile targets. However, many open questions remain.
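To make the analogy concrete, the tactile counterparts of screen, pixel, pointer, and target can be sketched as a small object model. This is purely illustrative and not from the cited paper; all class and field names (`TactilePixel`, `TactileScreen`, etc.) are hypothetical, and it assumes a simple 2D grid of actuation sites ("taxels") with a single pointer:

```python
from dataclasses import dataclass, field

@dataclass
class TactilePixel:
    """One addressable tactile actuation site (a 'taxel')."""
    row: int
    col: int
    intensity: float = 0.0  # normalized actuation amplitude, 0..1

@dataclass
class TactileTarget:
    """A region of taxels acting like a selectable widget."""
    name: str
    rows: range
    cols: range

    def contains(self, row: int, col: int) -> bool:
        return row in self.rows and col in self.cols

@dataclass
class TactileScreen:
    """A grid of taxels with a single tactile pointer on it."""
    height: int
    width: int
    pointer: tuple = (0, 0)  # (row, col) of the tactile pointer
    targets: list = field(default_factory=list)

    def move_pointer(self, drow: int, dcol: int) -> None:
        # Clamp the pointer to the screen bounds, as a visual
        # pointer is clamped to the edges of a visual display.
        r = min(max(self.pointer[0] + drow, 0), self.height - 1)
        c = min(max(self.pointer[1] + dcol, 0), self.width - 1)
        self.pointer = (r, c)

    def hovered_target(self):
        """Return the target under the pointer, if any."""
        r, c = self.pointer
        return next((t for t in self.targets if t.contains(r, c)), None)
```

Under this model, "pointing" becomes moving the actuated pointer site across the skin until it enters a target region, mirroring hover-and-select in a visual interface.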

There are multiple facets to such interaction paradigms that need exploration: tactile vs haptic displays, 1D vs 2D vs 3D displays, direct pointing vs indirect pointing vs something completely novel, and different body and skin regions and their unique constraints. Given its demonstrated capabilities, mid-air haptics can potentially address many of these questions. It actuates multi-point mid-air haptic sensations, with users being able to distinguish different properties of points at small separations [1]. Such multiplicity of points will be useful when defining a haptic interface model. Further, perception studies have shown high-resolution localization and apparent motion capabilities [6], and continuous 3D and volumetric haptic shapes have been rendered in mid-air using ultrasound [3]. All these works, together with interaction work demonstrating ultrasonic widgets [5] and mid-air haptic interaction with floating screens [4], encourage us to think that an end-to-end haptic interaction paradigm would be not only feasible, but also readily implementable and usable with mid-air haptics.
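The mid-air focal points referenced above are produced by a phased array of ultrasound transducers: each transducer's phase is advanced by its propagation delay so that all wavefronts arrive in phase at the focal point, creating a localized pressure maximum. A minimal sketch of that phase computation, assuming a 40 kHz array and point-source transducers (the function name and inputs are illustrative, not any specific device's API):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQ = 40_000.0         # Hz, a typical airborne-ultrasound frequency

def focus_phases(transducer_positions, focal_point):
    """Per-transducer phase offsets (radians) so that all emitted
    waves arrive in phase at focal_point, forming a focal point.

    transducer_positions: list of (x, y, z) positions in meters.
    focal_point: (x, y, z) position of the desired focus in meters.
    """
    wavelength = SPEED_OF_SOUND / FREQ  # about 8.6 mm at 40 kHz
    phases = []
    for pos in transducer_positions:
        d = math.dist(pos, focal_point)  # propagation distance
        # Phase advance proportional to travel distance, wrapped
        # into [0, 2*pi): farther transducers fire "earlier".
        phases.append((2 * math.pi * d / wavelength) % (2 * math.pi))
    return phases
```

Multi-point sensations as in UltraHaptics [1] then amount to solving for transducer phases and amplitudes that create several such foci simultaneously; the single-focus case above is the building block.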

References

[1] Tom Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, and Sriram Subramanian. 2013. UltraHaptics: Multi-point Mid-air Haptic Feedback for Touch Surfaces. In UIST '13. ACM, New York, NY, USA, 505–514. DOI: http://dx.doi.org/10.1145/2501988.2502018

[2] Aakar Gupta, Thomas Pietrzak, Nicolas Roussel, and Ravin Balakrishnan. 2016. Direct Manipulation in Tactile Displays. In CHI '16. ACM, New York, NY, USA. DOI: http://dx.doi.org/10.1145/2858036.2858161

[3] Benjamin Long, Sue Ann Seah, Tom Carter, and Sriram Subramanian. 2014. Rendering Volumetric Haptic Shapes in Mid-air Using Ultrasound. ACM Trans. Graph. 33, 6, Article 181 (Nov. 2014), 10 pages. DOI: http://dx.doi.org/10.1145/2661229.2661257

[4] Yasuaki Monnai, Keisuke Hasegawa, Masahiro Fujiwara, Kazuma Yoshino, Seki Inoue, and Hiroyuki Shinoda. 2014. HaptoMime: Mid-air Haptic Interaction with a Floating Virtual Screen. In UIST '14. ACM, New York, NY, USA, 663–667. DOI: http://dx.doi.org/10.1145/2642918.2647407

[5] Dong-Bach Vo and S. A. Brewster. 2015. Touching the Invisible: Localizing Ultrasonic Haptic Cues. In World Haptics Conference (WHC), 2015 IEEE. 368–373. DOI: http://dx.doi.org/10.1109/WHC.2015.7177740

[6] Graham Wilson, Thomas Carter, Sriram Subramanian, and Stephen A. Brewster. 2014. Perception of Ultrasonic Haptic Feedback on the Hand: Localisation and Apparent Motion. In CHI '14. ACM, New York, NY, USA, 1133–1142. DOI: http://dx.doi.org/10.1145/2556288.2557033
