HAID 2019
International Workshop on Haptic and Audio Interaction Design
13-15 March 2019 — Lille, France

The proceedings are accessible online.

Wednesday 13
9:00 - 10:30    Welcome & Keynote: Stephen Brewster
10:30 - 11:00   Coffee break
11:00 - 12:30   Industry pitches
12:30 - 14:00   Lunch break
14:00 - 15:00   Industry pitches
15:00 - 16:00   Coffee break & Stands
16:00 - 17:00   Industry pitches
17:00 - 18:00   Cocktail
After 18:00     Concert

Thursday 14
9:00 - 10:00    Paper session 1
10:00 - 11:00   Coffee break & Stands
11:00 - 12:00   Research demos
12:00 - 14:00   Lunch break & Stands
14:00 - 15:00   Paper session 2
15:00 - 16:00   Coffee break & Stands
16:00 - 17:00   Workshop session 1

Friday 15
9:00 - 10:00    Paper session 3
10:00 - 11:00   Coffee break & Stands
11:00 - 12:00   Community townhall
12:00 - 14:00   Lunch break & Stands
14:00 - 15:30   Workshop session 2
15:30 - 16:00   Coffee break
16:00 - 17:30   Closing

Keynote

Stephen A. Brewster
Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow, where he leads the Multimodal Interaction Group. His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly audio, haptics and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term focus has been on mobile interaction and how we can design better user interfaces for users who are on the move. Other areas of interest include accessibility, wearable devices and in-car interaction. He pioneered the study of non-speech audio and haptic interaction for mobile devices with work starting in the 1990s. He is a Member of the ACM SIGCHI Academy, an ACM Distinguished Speaker and a Fellow of the Royal Society of Edinburgh.
HAID History and some new ideas
In this talk I will look back at a bit of the history of the Haptic and Audio Interaction Design conference and how it got started. I will then present some new ideas that we are working on at Glasgow, using ultrasound for haptics, audio and levitation in future interactive displays.

Industry pitches

Session 1: Interaction in Virtual Reality

Chair: Stefano Papetti
Eric Vezzoli, Malak Benchekroun, Go Touch VR
Handheld and Wearable Haptics for XR
Wearable and handheld haptic devices have the powerful ability to recreate natural interaction and increase the usability of immersive-reality content. However, there are several points of friction in the adoption of such technologies, both in the design and in the rendering pipeline. Go Touch VR is addressing this challenge by developing a generic four-dimensional design and rendering pipeline for wearable and handheld haptics. The framework is based on a restriction to four major tactile sensations perceived during manipulation and interaction, and on the maximum perceptual capabilities of the human body. Go Touch VR commercialises XR haptics solutions based on this framework, delivered through its proprietary and third-party haptic devices.
Massimiliano Di Luca, Facebook Reality Labs
Hand-based VR interaction
William Dulot, Thérémix
Virtual reality for music creation and performance
Thérémix Tech presents Modulia Studio, a virtual-reality app that allows artists to play and create music in a whole new way: launch music sequences, play notes in many ways, mix in a three-dimensional space, assign sound parameters to gestures… Modulia Studio offers an organic, visual and expressive way to interact with music in the studio or on stage.

Session 2: Haptics

Chair: James Leonard
Cédrick Chappaz, hap2U
Haptics in Touch Interfaces: Challenges & Opportunities
The massive diffusion of screens and tactile interfaces in our daily life has highlighted the limits of these interaction tools. Whether through a screen in a car dashboard or on a piece of industrial equipment, users face many situations where relying on the sense of vision is problematic. Starting from this observation, haptics integrated into tactile interfaces proves to be an intuitive and complementary communication tool.
Orestis Georgiou, Ultrahaptics
Touching the invisible: the emerging field of ultrasonic mid-air haptics
Gesture-based control interfaces are becoming ubiquitous in our daily lives. We use gestures to interact with our appliances, smartphones, electronic devices and, more recently, virtual and augmented environments. The availability of these solutions has been made possible by a new generation of camera tracking devices. One drawback of gesture-based interfaces is their lack of physicality, feedback, and sense of agency. Recent developments in ultrasound technologies have bridged this gap and made it possible to combine gesture interfaces with haptic feedback. In this presentation, I will discuss how this is achieved, describe some of the research taking place in this space with particular focus on UX and multimodal UI design, and also the commercial applications that are on the horizon.

Session 3: Music

Chair: Stephen Sinclair
David Speith, Native Instruments
The Haptic Drive Jog Wheel - a new interface for digital DJs
Vianney Apreleff, Bleass
BLEASS develops audio software for mobile devices
Mark Zadel, Ableton
A brief introduction to Ableton, Live, and Push
Ludovic Potier, Jonathan Aceituno, Aodyo Instruments
New interactions for musical expression, between tradition and innovation
The Sylphyo is an electronic wind instrument that can be played like a traditional instrument while offering alternative gestures for controlling additional sound parameters. These new interactions take advantage of unused dimensions of traditional practice, such as instrument movements and finger slides, and are designed to be complementary. The sound is also designed and mapped to respond to the multiple dimensions of the controls.

Stands

Industrial stands

Thomas Rouvillain, Next Sound Lab
Orestis Georgiou, Ultrahaptics
David Speith, Benjamin Weiss, Matthieu Ranc, Native Instruments
William Dulot, Thérémix
Ludovic Potier, Jonathan Aceituno, Laurent Pouillard, Aodyo Instruments
Vianney Apreleff, Alexis Zbik, Bleass
Cédrick Chappaz, Corentin Lefebvre, Tom Rouillard, Maxime Harazi, hap2U
Julian Vogels, Soundbrenner

Academic stands

Corentin Bernard, Michael Wiertlewski, Sølvi Ystad
Sound and Texture Synthesizer
Music production on traditional instruments creates vibrations that are perceived via hearing and touch. This is particularly true on a violin, where the vibrations transmitted to the chin help the musician hone the tone quality. In contrast, modern music synthesizers create high-quality sounds without including relevant haptic content. We present the implementation of a haptic sound synthesizer which creates sounds and vibration patterns in real time, rendered both acoustically through headphones and haptically through an ultrasonic surface display. The aim of our device is to further investigate the interactions between synthetic haptic textures and sounds.
Christian Frisson, Colin Gallacher, Marcelo Wanderley
Haptic techniques for browsing sound maps organized by similarity
This demo paper showcases haptic techniques for browsing sound maps organized by content-based similarity. Our system, Freesound Tracker, forks the Freesound Explorer application inside a web browser augmented with haptic capabilities using the Chai3d library, which we modified to support Haply haptic devices, ours configured with a 2-DoF pantograph mechanism. A first technique enhances the exploratory navigation of sound maps by adding physical effects, such as magnetizing the pointer to its closest sound item or rendering viscosity when hovering over items. A second technique assists loop-based musical creation by having user-defined sound paths followed by the force-feedback pointing device.
Mengyu Chen, Jing Yan, Yin Yu
Biometric Perception Interface
Soft robots are primarily composed of soft materials with low moduli, close to those of biological materials. This unique feature gives them the potential to be used on the human body as wearable devices that stay in direct contact with the user’s skin, further expanding the application scenario of robots as haptic agents of interpersonal communication. With the possibility of conveying intimate messages while maintaining a desired physical distance, skin-contact-based remote communication can create a new form of intimate relationship between people. We present Biometric Perception Interface, a wearable perception-extension interface that measures the pulse, converts it into haptic actuation, and allows users to record and play back pulse data from and to their bodies. We demonstrate three interconnected components of Biometric Perception Interface: Choker, Antenna, and Memorizer. Additionally, Biometric Perception Interface challenges the common practices of visual memory and quantified abstraction of biological phenomena, and proposes an alternative interpersonal intimate communication mediated by soft robotics.
James Leonard, Jérôme Villeneuve
Fast audio-haptic prototyping with mass-interaction physics
This paper presents ongoing work on the topic of physical modelling and force-feedback interaction. Specifically, it proposes a framework for rapidly prototyping virtual objects and scenes by means of mass-interaction models, and coupling the user and these objects via an affordable multi-DoF haptic device. The modelled objects can be computed at the rate of the haptic loop, but can also operate at a higher audio-rate, producing sound. The open-source design and overall simplicity of the proposed system makes it an interesting solution for introducing both physical simulations and force-feedback interaction, and also for applications in artistic creation. This first implementation prefigures current work conducted on the development of modular open-source mass-interaction physics tools for the design of haptic and multisensory applications.
Charlotte Magnusson, Héctor Caltenco and Steinunn Arnars Ólafsdóttir
Designing activity games for stroke survivors
In this paper we present work carried out in the EU project STARR and the NordForsk project ActivAbles. We report on the design and iterative development of an outdoor activity game for stroke survivors, discuss design choices and experiences from the iterative testing, and outline potential future developments.

Papers

Session 1: Perception & Psychophysics

Chair: Sølvi Ystad
Sebastian Merchel, Mehmet Ercan Altinsoy
Psychophysical Comparison of the Auditory and Vibrotactile Perception - Absolute Sensitivity
In this paper, the psychophysical abilities and limitations of the auditory and vibrotactile modalities are discussed. A direct comparison reveals similarities and differences, knowledge of which is important for the design of audio-haptic systems, multimodal music applications and perceptually optimized human-machine interfaces. Literature data and our own results for psychophysical characteristics are discussed. The paper focuses on the absolute perception thresholds of both modalities and on the main factors that influence them: age, energy integration, masking and adaptation.
Mengyu Chen, Jing Yan, Yin Yu
Biometric Perception Interface
Soft robots are primarily composed of soft materials with low moduli, close to those of biological materials. This unique feature gives them the potential to be used on the human body as wearable devices that stay in direct contact with the user’s skin, further expanding the application scenario of robots as haptic agents of interpersonal communication. With the possibility of conveying intimate messages while maintaining a desired physical distance, skin-contact-based remote communication can create a new form of intimate relationship between people. We present Biometric Perception Interface, a wearable perception-extension interface that measures the pulse, converts it into haptic actuation, and allows users to record and play back pulse data from and to their bodies. We demonstrate three interconnected components of Biometric Perception Interface: Choker, Antenna, and Memorizer. Additionally, Biometric Perception Interface challenges the common practices of visual memory and quantified abstraction of biological phenomena, and proposes an alternative interpersonal intimate communication mediated by soft robotics.
Yuri De Pra, Federico Fontana, Hanna Järveläinen, Stefano Papetti, Michele Simonato, Riccardo Furlanetto
Auditory and tactile recognition of resonant material vibrations in a passive task of bouncing perception
Besides vision and audition, everyday materials can also be explored passively through touch if they provide tactile feedback to users, for instance when an external force excites their natural resonances. While such resonances are known to provide informative auditory cues about material, their role when recognition is made through touch is debatable. Even more questionable is material recognition from reproductions of these resonances: if it occurs, such reproductions could be used to enrich existing touch-screen interactions with ecological auditory and haptic feedback, furthermore requiring only inexpensive actuation. With this goal in mind, two experiments are proposed evaluating users' ability to classify wooden, plastic, and metallic surfaces using auditory and haptic cues respectively. Although the literature reports successful auditory classification of everyday material simulations, the passive recognition of such material reproductions by holding a finger on a vibrating glass surface has never been tested. By separately reproducing the sound and vibration of a ping-pong ball bouncing on wood, plastic and metal surfaces, our tests report not only auditory, but also tactile recognition of the same materials significantly above chance. Discrepancies between our results and previously reported ones are discussed.

Session 2: Tools & Technologies

Chair: Géry Casiez
Tamara Fiedler, Yasemin Vardar
A Novel Texture Rendering Approach for Electrostatic Displays
Generating realistic texture sensations on tactile displays using data-driven methods has attracted a lot of interest in the last decade. However, the need for large data storage and transmission rates complicates the use of these methods in future commercial displays. In this paper, we propose a new texture rendering approach which can significantly compress the texture data for electrostatic displays. Using three sample surfaces, we first explain how to record, analyze and compress the texture data, and render it on a touchscreen. Then, through psychophysical experiments conducted with nineteen participants, we show that the textures can be reproduced with significantly fewer frequency components than in the original signal without inducing perceptual degradation. Moreover, our results indicate that the possible degree of compression is affected by the surface properties.
James Leonard, Jérôme Villeneuve
Fast audio-haptic prototyping with mass-interaction physics
This paper presents ongoing work on the topic of physical modelling and force-feedback interaction. Specifically, it proposes a framework for rapidly prototyping virtual objects and scenes by means of mass-interaction models, and coupling the user and these objects via an affordable multi-DoF haptic device. The modelled objects can be computed at the rate of the haptic loop, but can also operate at a higher audio-rate, producing sound. The open-source design and overall simplicity of the proposed system makes it an interesting solution for introducing both physical simulations and force-feedback interaction, and also for applications in artistic creation. This first implementation prefigures current work conducted on the development of modular open-source mass-interaction physics tools for the design of haptic and multisensory applications.
Aditya Tirumala Bukkapatnam, Philippe Depalle, Marcelo Wanderley
Autoregressive Parameter Estimation for Equalizing Vibrotactile Systems
Haptic feedback has become a common feature in many digital devices, particularly in the form of vibration, often known as vibrotactile feedback. Vibrotactile systems largely rely on ERMs (eccentric rotating masses) or LRAs (linear resonant actuators), which are inexpensive but exhibit limitations such as limited frequency bandwidth or coupled control over the amplitude and frequency of the vibration. Due to the highly developed physical and cognitive expertise involved in musical performance, advanced haptic feedback systems in musical applications need independent control of the amplitude and frequency of the vibration over a wide frequency bandwidth. Some of the systems proposed to address this requirement have benefited from the characterization of various technical aspects of the amplifiers and actuators involved, as well as from the equalization of their overall frequency response characteristics. This equalization is typically implemented with manually configured parametric equalizers that counter the system's intrinsic frequency characteristics. In this paper, we propose an autoregressive method that automatically estimates stable and minimum-phase filter parameters which, by design, remain stable upon inversion. We demonstrate this method with an example implementation and present concluding remarks on the degree of equalization it achieves.

Session 3: Human-Computer Interaction & Multimodality

Chair: Hasti Seifi
Wanjoo Park, Muhammad Hassan Jamil, Mohamad Eid
Prefrontal and Interhemispheric Functional EEG Connectivity in Presence and Absence of Tactile Stimulation
Developing quantitative means to measure the effects of tactile stimulation on user experience has gained increasing attention over the past decade. This paper strives to find quantitative evidence, based on brain wave analysis, of the relationship between tactile stimulation and cognitive processes. In this study, participants performed an active touch task consisting of stroking the strings of a virtual guitar displayed on a touch-screen device while their 64-channel EEG signal was measured. For the analysis, phase locking values from the alpha, beta, and gamma frequency bands were extracted and compared in the absence and presence of tactile stimulation. Results demonstrated an increase in the connectivity of the beta and gamma bands in the prefrontal cortex in the presence of tactile stimulation. This increase is associated with the development of the Bereitschaftspotential, which reflects the intention, planning, and execution of precise voluntary movements. Another interesting finding was increased interhemispheric connectivity in the absence of tactile stimulation, which is associated with motor impairments. These findings suggest that tactile stimulation not only triggers sensation, but also activates cognitive processing associated with motor skills.
Charlotte Magnusson, Héctor Caltenco, Steinunn Arnars Ólafsdóttir
Designing activity games for stroke survivors
In this paper we present work carried out in the EU project STARR and the NordForsk project ActivAbles. We report on the design and iterative development of an outdoor activity game for stroke survivors, discuss design choices and experiences from the iterative testing, and outline potential future developments.
Eliott Audry, Jérémie Garcia
Towards congruent cross-modal audio-visual alarms for supervision tasks
Operators in surveillance activities face cognitive overload due to the fragmentation of information on several screens, the dynamic nature of the task and the multiple visual or audible alarms. This paper presents our ongoing efforts to design efficient audio-visual alarms for surveillance activities such as traffic management or air traffic control. We motivate the use of congruent cross-modal animations to design alarms and describe audio-visual mappings based on this paradigm. We conclude with the design of a study to validate our designs as well as future research directions.

Demos

Stefano Papetti, Martin Fröhlich
The TouchBox: open-source haptic device for finger-based interaction
The TouchBox is a low-cost DIY human-computer interface providing fast, wide-band and high-dynamics vibrotactile feedback, tracking the position of up to two finger pads in contact with its top surface, and measuring their contact areas as well as the applied normal and lateral (3D) forces. Finger-pressing is a common gesture when interacting with everyday objects, including digital devices. Also, while one or more fingers are in contact with an object (as the result of finger-pressing), another common gesture is pushing or sliding the finger(s) over the touched surface, giving rise to shear strain. The proposed interface addresses a simplified version of such a scenario. Applications range from using the interface as a calibrated measurement device to advanced human-machine interaction.
Corentin Bernard, Michael Wiertlewski, Sølvi Ystad
Sound and Texture Synthesizer
Music production on traditional instruments creates vibrations that are perceived via hearing and touch. This is particularly true on a violin, where the vibrations transmitted to the chin help the musician hone the tone quality. In contrast, modern music synthesizers create high-quality sounds without including relevant haptic content. We present the implementation of a haptic sound synthesizer which creates sounds and vibration patterns in real time, rendered both acoustically through headphones and haptically through an ultrasonic surface display. The aim of our device is to further investigate the interactions between synthetic haptic textures and sounds.
Christian Frisson, Colin Gallacher, Marcelo Wanderley
Haptic techniques for browsing sound maps organized by similarity
This demo paper showcases haptic techniques for browsing sound maps organized by content-based similarity. Our system, Freesound Tracker, forks the Freesound Explorer application inside a web browser augmented with haptic capabilities using the Chai3d library, which we modified to support Haply haptic devices, ours configured with a 2-DoF pantograph mechanism. A first technique enhances the exploratory navigation of sound maps by adding physical effects, such as magnetizing the pointer to its closest sound item or rendering viscosity when hovering over items. A second technique assists loop-based musical creation by having user-defined sound paths followed by the force-feedback pointing device.
Stephen Sinclair and Bret Battey
DIMPLE links 3D interactive haptics with audiovisual design software
DIMPLE is a software system providing a 3D "blank canvas" in which users can create 3D interactive scenes for haptic audio-visual interaction from any media creation system that supports Open Sound Control messaging. This update modernizes DIMPLE for the current generation of haptic devices and operating systems.
Mengyu Chen, Jing Yan, Yin Yu
Biometric Perception Interface (paper)
Soft robots are primarily composed of soft materials with low moduli, close to those of biological materials. This unique feature gives them the potential to be used on the human body as wearable devices that stay in direct contact with the user’s skin, further expanding the application scenario of robots as haptic agents of interpersonal communication. With the possibility of conveying intimate messages while maintaining a desired physical distance, skin-contact-based remote communication can create a new form of intimate relationship between people. We present Biometric Perception Interface, a wearable perception-extension interface that measures the pulse, converts it into haptic actuation, and allows users to record and play back pulse data from and to their bodies. We demonstrate three interconnected components of Biometric Perception Interface: Choker, Antenna, and Memorizer. Additionally, Biometric Perception Interface challenges the common practices of visual memory and quantified abstraction of biological phenomena, and proposes an alternative interpersonal intimate communication mediated by soft robotics.
James Leonard, Jérôme Villeneuve
Fast audio-haptic prototyping with mass-interaction physics (paper)
This paper presents ongoing work on the topic of physical modelling and force-feedback interaction. Specifically, it proposes a framework for rapidly prototyping virtual objects and scenes by means of mass-interaction models, and coupling the user and these objects via an affordable multi-DoF haptic device. The modelled objects can be computed at the rate of the haptic loop, but can also operate at a higher audio-rate, producing sound. The open-source design and overall simplicity of the proposed system makes it an interesting solution for introducing both physical simulations and force-feedback interaction, and also for applications in artistic creation. This first implementation prefigures current work conducted on the development of modular open-source mass-interaction physics tools for the design of haptic and multisensory applications.
Charlotte Magnusson, Héctor Caltenco and Steinunn Arnars Ólafsdóttir
Designing activity games for stroke survivors (paper)
In this paper we present work carried out in the EU project STARR and the NordForsk project ActivAbles. We report on the design and iterative development of an outdoor activity game for stroke survivors, discuss design choices and experiences from the iterative testing, and outline potential future developments.

Workshops

Stephen Sinclair, James Leonard, Jérôme Villeneuve, Christian Frisson
Open technologies for force-feedback in artistic creation: challenges and opportunities

This workshop proposes a discussion and collective reflection on the topic of accessible (open-source and/or open-hardware) haptic technologies for applications in artistic contexts, such as music and interactive or immersive digital arts.

This field raises several challenges, such as access to modular force-feedback devices, access to frameworks that enable non-expert users to design multimodal virtual objects and define cross-modal mapping strategies, and the relative lack of haptic libraries for recent programming languages (web-based ones in particular). However, it also offers promising perspectives as open haptic and software technologies advance, making the incorporation of haptics into any artistic process easier than ever before.

The workshop will conclude with hands-on experimentation with a series of devices and tools that allow users to haptically interact with modular audio-visual processes (DIMPLE), directly manipulate sound-producing virtual objects or instruments (miPhysics), or get assistance in a variety of sound-related HCI tasks (Freesound Tracker).

Annie Luciani, Nicolas Castagné, Claude Cadoz
Enactive Learning in Digital Creativity

In this workshop, we propose to explain and discuss how tangibility, supported by haptic and multisensory interactions, can contribute to better learning of dexterous gestural tasks (musical and artistic creation, medical applications, industrial tasks, fundamental physics).

The explanations and discussions will be supported by demos from ACROE and its partners in the Grenoble IDEX project on enactive learning and in the European Art-Science-Technology for Digital Creativity project.

“Enactive Learning” is a project supported by an IDEX Grenoble-Alpes University educational consortium.

“European Art-Science-Technology for Digital Creativity” is a large-scale European project supported by the EACEA (Education, Audiovisual and Culture Executive Agency), in which 14 European institutions (academic and artistic) are involved.

Collaborative musical performance, by means of two coupled ACROE 12-DoF haptic devices, allows two performers to play together on the same virtual instrument while feeling each other’s gestures.

Concert

Peter Orins - "Having Never Written A Note For Percussion" (James Tenney)
Peter Orins explores the various sound textures one can produce with drums, searching for the ambiguity of timbres and of how they are produced, for the saturation of sounds, and for the harmonics of each percussion instrument. He seeks to highlight both the "microscopic" sounds of prepared drumheads and metals brushed and struck at very low levels, and the acoustic pressure of drums played at a loud volume. In that spirit, he will perform a piece by James Tenney entitled "Having Never Written A Note For Percussion", from the "Postal Pieces" collection. The piece demonstrates the natural yet intense relation between haptics and sound as the percussionist explores the richness of sounds and responses from a cymbal.
Peter Orins and Ivann Cruz - Vibrating shapes
Peter Orins and Ivann Cruz have both been performing improvised music for more than 15 years, in various configurations and genres. In this structured improvisation, they explore the intersection of virtual and physical spaces with visual, sonic and haptic feedback. 3D shapes placed around their acoustic instruments (guitar and drums) are revealed both visually, as video projections of their intersections with the physical bodies and instruments, and sonically, as vibrations of surface speakers propagated to sounding objects.