HAID 2019
International Workshop on Haptic and Audio Interaction Design
13-15 March 2019 — Lille, France

This is a preliminary program. Session details will be posted soon.

The proceedings are available online.

| Time          | Wednesday 13                        | Thursday 14        | Friday 15          |
|---------------|-------------------------------------|--------------------|--------------------|
| 9:00 - 10:30  | Welcome; Keynote: Stephen Brewster  | Paper session 1    | Paper session 3    |
| 10:30 - 11:00 | Coffee break                        | Coffee break       | Coffee break       |
| 11:00 - 12:30 | Pitches / Demos                     | Research Demos     | Workshop session 2 |
| 12:30 - 14:00 | Lunch break                         | Lunch break        | Lunch break        |
| 14:00 - 15:30 | Pitches / Demos                     | Paper session 2    | Workshop session 3 |
| 15:30 - 16:00 | Coffee break                        | Coffee break       | Coffee break       |
| 16:00 - 17:30 | Pitches / Demos                     | Workshop session 1 | Community townhall |
| 17:30 - 18:00 |                                     |                    | Closing            |
| After 18:00   | Welcome Reception                   | Concert            |                    |

Keynote

Stephen Brewster
Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow, where he leads the Multimodal Interaction Group. His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly audio, haptics and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term focus has been on mobile interaction and how we can design better user interfaces for users who are on the move. Other areas of interest include accessibility, wearable devices and in-car interaction. He pioneered the study of non-speech audio and haptic interaction for mobile devices with work starting in the 1990s. He is a Member of the ACM SIGCHI Academy, an ACM Distinguished Speaker and a Fellow of the Royal Society of Edinburgh.

Industry pitches

Session 1

Eric Vezzoli, Go Touch VR
Cédrick Chappaz, hap2U
Haptic in Touch Interfaces: Challenges & Opportunities

Session 2

Massimiliano Di Luca, Facebook Reality Labs

Session 3

William Dulot, Thérémix
Virtual reality for music creation and performance
Thérémix Tech presents Modulia Studio, a virtual reality app that allows artists to play and create music in a whole new way: launch music sequences, play notes in many ways, mix in a three-dimensional space, assign sound parameters to gestures… Modulia Studio offers an organic, visual and expressive way to interact with music in the studio or on stage.
Orestis Georgiou, Ultrahaptics
Touching the invisible: the emerging field of ultrasonic mid-air haptics
Gesture-based control interfaces are becoming ubiquitous in our daily lives. We use gestures to interact with our appliances, smartphones, electronic devices and, more recently, virtual and augmented environments. The availability of these solutions has been made possible by a new generation of camera tracking devices. One drawback of gesture-based interfaces is their lack of physicality, feedback, and sense of agency. Recent developments in ultrasound technologies have bridged this gap and made it possible to combine gesture interfaces with haptic feedback. In this presentation, I will discuss how this is achieved, describe some of the research taking place in this space with particular focus on UX and multimodal UI design, and also the commercial applications that are on the horizon.

Papers

Session 1: Perception & Psychophysics

Sebastian Merchel, Mehmet Ercan Altinsoy
Psychophysical Comparison of the Auditory and Vibrotactile Perception - Absolute Sensitivity
In this paper, the psychophysical abilities and limitations of the auditory and vibrotactile modalities are discussed. A direct comparison reveals similarities and differences; knowledge of these is important for the design of audio-haptic systems, multimodal music applications and perceptually optimized human-machine interfaces. Literature data and our own results for psychophysical characteristics are discussed. This paper focuses on the absolute perception thresholds of both modalities. The main factors which influence these thresholds are discussed: age, energy integration, masking and adaptation.
Mengyu Chen, Jing Yan, Yin Yu
Biometric Perception Interface
Soft robots are primarily composed of soft materials with low moduli, close to those of biological materials. This unique feature gives them the potential to be worn on the human body as devices that stay in direct contact with the user's skin, further expanding the application scenario of the robot as a haptic agent of interpersonal communication. With the possibility of conveying intimate messages while maintaining a desired physical distance, skin-contact-based remote communication can create a new form of intimate relationship between people. We present Biometric Perception Interface, a wearable perception extension interface that measures and converts pulse into haptic actuation, and allows users to record and play back pulse data from and to their bodies. We demonstrate three inter-connected components of Biometric Perception Interface: Choker, Antenna, and Memorizer. Additionally, Biometric Perception Interface challenges the common practices of visual memory and quantified abstraction of biological phenomena, and proposes an alternative interpersonal intimate communication mediated by soft robotics.
Yuri De Pra, Federico Fontana, Hanna Järveläinen, Stefano Papetti, Michele Simonato, Riccardo Furlanetto
Auditory and tactile recognition of resonant material vibrations in a passive task of bouncing perception
Besides vision and audition, everyday materials can also be passively explored using touch if they provide tactile feedback to users, for instance as a consequence of an external force exciting their natural resonances. While such resonances are known to provide informative auditory cues about material, their role when recognition is made through touch is debatable. Even more questionable is material recognition from reproductions of these resonances: if possible, they could be used to enrich existing touch-screen interactions with ecological auditory and haptic feedback while requiring only inexpensive actuation. With this goal in mind, two experiments are proposed evaluating users' ability to classify wooden, plastic, and metallic surfaces using auditory and haptic cues respectively. Although the literature reports successful auditory classification of everyday material simulations, the passive recognition of such material reproductions by holding a finger on a vibrating glass surface has never been tested. By separately reproducing the sound and vibration of a ping-pong ball bouncing on wood, plastic and metal surfaces, our tests show not only auditory but also tactile recognition of the same materials significantly above chance. Discrepancies between our results and previously reported ones are discussed.

Session 2: Tools & Technologies

Tamara Fiedler, Yasemin Vardar
A Novel Texture Rendering Approach for Electrostatic Displays
Generating realistic texture sensations on tactile displays using data-driven methods has attracted a lot of interest in the last decade. However, the need for large data storage and high transmission rates complicates the use of these methods in future commercial displays. In this paper, we propose a new texture rendering approach which can significantly compress the texture data for electrostatic displays. Using three sample surfaces, we first explain how to record, analyze and compress the texture data, and render them on a touchscreen. Then, through psychophysical experiments conducted with nineteen participants, we show that the textures can be reproduced with significantly fewer frequency components than in the original signal without inducing perceptual degradation. Moreover, our results indicate that the achievable degree of compression is affected by the surface properties.
James Leonard, Jérôme Villeneuve
Fast audio-haptic prototyping with mass-interaction physics
This paper presents ongoing work on the topic of physical modelling and force-feedback interaction. Specifically, it proposes a framework for rapidly prototyping virtual objects and scenes by means of mass-interaction models, and for coupling the user to these objects via an affordable multi-DoF haptic device. The modelled objects can be computed at the rate of the haptic loop, but can also operate at a higher audio rate, producing sound. The open-source design and overall simplicity of the proposed system make it an interesting solution for introducing both physical simulations and force-feedback interaction, and also for applications in artistic creation. This first implementation prefigures current work on the development of modular open-source mass-interaction physics tools for the design of haptic and multisensory applications.
Aditya Tirumala Bukkapatnam, Philippe Depalle, Marcelo Wanderley
Autoregressive Parameter Estimation for Equalizing Vibrotactile Systems
Haptic feedback has become a common feature in many digital devices, particularly in the form of vibration, often known as vibrotactile feedback. Vibrotactile systems largely rely on ERMs (Eccentric Rotating Masses) or LRAs (Linear Resonant Actuators), which are inexpensive but exhibit limitations such as limited frequency bandwidth or coupled control of the amplitude and frequency of the vibration. Due to the highly developed physical and cognitive expertise involved in musical performance, advanced haptic feedback systems for musical applications require independent control of the amplitude and frequency of the vibration over a wide frequency bandwidth. Some of the systems proposed to address this requirement have benefited from the characterization of various technical aspects of the amplifiers and actuators involved, as well as from the equalization of their overall frequency response characteristics. This equalization is typically implemented with manually configured parametric equalizers that counter the system's intrinsic frequency characteristics. In this paper, we propose an autoregressive method that automatically estimates stable and minimum-phase filter parameters, which, by design, remain stable upon inversion. We demonstrate this method with an example implementation and present concluding remarks on the degree of equalization it achieves.

Session 3: Human-Computer Interaction & Multimodality

Wanjoo Park, Muhammad Hassan Jamil, Mohamad Eid
Prefrontal and Interhemispheric Functional EEG Connectivity in Presence and Absence of Tactile Stimulation
Developing quantitative means to measure the effects of tactile stimulation on user experience has gained increasing attention over the past decade. This paper strives to find quantitative evidence, based on brain wave analysis, of the relationship between tactile stimulation and cognitive processes. In this study, participants performed an active touch task comprising stroking the strings of a virtual guitar displayed on a touch-screen device while the 64-channel EEG signal was measured. For the analysis, phase locking values from the alpha, beta, and gamma frequency bands are extracted and compared in the absence and presence of tactile stimulation. Results demonstrated an increase in the connectivity of the beta and gamma bands in the prefrontal cortex in the presence of tactile stimulation. Such an increase is associated with the development of the Bereitschaftspotential, which reflects the intention, planning, and execution of precise voluntary movements. Another interesting finding was an increased interhemispheric connectivity in the absence of tactile stimulation, which is associated with motor impairments. These findings suggest that tactile stimulation not only triggers sensation, but also activates cognitive processing associated with motor skills.
Charlotte Magnusson, Héctor Caltenco, Steinunn Arnars Ólafsdóttir
Designing activity games for stroke survivors
In this paper we present work carried out in the EU project STARR and the NordForsk project ActivAbles. We report on the design and iterative development of an outdoor activity game for stroke survivors, discuss design choices and experiences from the iterative testing, and outline potential future developments.
Eliott Audry, Jérémie Garcia
Towards congruent cross-modal audio-visual alarms for supervision tasks
Operators in surveillance activities face cognitive overload due to the fragmentation of information on several screens, the dynamic nature of the task and the multiple visual or audible alarms. This paper presents our ongoing efforts to design efficient audio-visual alarms for surveillance activities such as traffic management or air traffic control. We motivate the use of congruent cross-modal animations to design alarms and describe audio-visual mappings based on this paradigm. We conclude with the design of a study to validate our designs as well as future research directions.

Demos

Stefano Papetti, Martin Fröhlich
The TouchBox: open-source haptic device for finger-based interaction
The TouchBox is a low-cost DIY human-computer interface providing fast, wide-band and high-dynamics vibrotactile feedback, tracking the position of up to two finger pads in contact with its top surface, and measuring their contact areas as well as the applied normal and lateral (3D) forces. Finger-pressing is a common gesture when interacting with everyday objects, including digital devices. While one or more fingers are in contact with an object (as the result of finger-pressing), another common gesture is pushing or sliding the finger(s) over the touched surface, giving rise to shear strain. The proposed interface addresses a simplified version of such a scenario. Applications range from using the interface as a calibrated measurement device to advanced human-machine interaction.
Corentin Bernard, Michael Wiertlewski, Sølvi Ystad
Sound and Texture Synthesizer
Music production on traditional instruments creates vibrations that are perceived via hearing and touch. This is particularly true when playing the violin, where the vibrations transmitted to the chin help the musician hone the tone quality. In contrast, modern music synthesizers create high-quality sounds without including relevant haptic content. We present the implementation of a haptic sound synthesizer which creates sounds and vibration patterns in real time, rendered both acoustically through headphones and haptically through an ultrasonic surface display. The aim of our device is to further investigate the interactions between synthetic haptic textures and sounds.
Christian Frisson, Colin Gallacher, Marcelo Wanderley
Haptic techniques for browsing sound maps organized by similarity
This demo paper showcases haptic techniques for browsing sound maps organized by content-based similarity. Our system forks the Freesound Explorer application inside a barebones web browser augmented with haptic capabilities using the Chai3d library, which we modified to support Haply haptic devices, ours configured with a 2-DoF pantograph mechanism. The first technique enhances the exploratory navigation of sound maps by adding physical effects, such as magnetizing the pointer to the closest sound item or feeling viscosity when hovering over items. The second technique assists loop-based musical creation by having the force-feedback pointing device follow user-defined sound paths.
Stephen Sinclair and Bret Battey
DIMPLE links 3D interactive haptics with audiovisual design software
DIMPLE is a software system providing a 3D "blank canvas" in which users can create 3D interactive scenes for haptic audio-visual interaction from any media creation system that supports Open Sound Control messaging. This update modernizes DIMPLE for the current generation of haptic devices and operating systems.
Mengyu Chen, Jing Yan, Yin Yu
Biometric Perception Interface (paper)
Soft robots are primarily composed of soft materials with low moduli, close to those of biological materials. This unique feature gives them the potential to be worn on the human body as devices that stay in direct contact with the user's skin, further expanding the application scenario of the robot as a haptic agent of interpersonal communication. With the possibility of conveying intimate messages while maintaining a desired physical distance, skin-contact-based remote communication can create a new form of intimate relationship between people. We present Biometric Perception Interface, a wearable perception extension interface that measures and converts pulse into haptic actuation, and allows users to record and play back pulse data from and to their bodies. We demonstrate three inter-connected components of Biometric Perception Interface: Choker, Antenna, and Memorizer. Additionally, Biometric Perception Interface challenges the common practices of visual memory and quantified abstraction of biological phenomena, and proposes an alternative interpersonal intimate communication mediated by soft robotics.
James Leonard, Jérôme Villeneuve
Fast audio-haptic prototyping with mass-interaction physics (paper)
This paper presents ongoing work on the topic of physical modelling and force-feedback interaction. Specifically, it proposes a framework for rapidly prototyping virtual objects and scenes by means of mass-interaction models, and for coupling the user to these objects via an affordable multi-DoF haptic device. The modelled objects can be computed at the rate of the haptic loop, but can also operate at a higher audio rate, producing sound. The open-source design and overall simplicity of the proposed system make it an interesting solution for introducing both physical simulations and force-feedback interaction, and also for applications in artistic creation. This first implementation prefigures current work on the development of modular open-source mass-interaction physics tools for the design of haptic and multisensory applications.
Charlotte Magnusson, Héctor Caltenco and Steinunn Arnars Ólafsdóttir
Designing activity games for stroke survivors (paper)
In this paper we present work carried out in the EU project STARR and the NordForsk project ActivAbles. We report on the design and iterative development of an outdoor activity game for stroke survivors, discuss design choices and experiences from the iterative testing, and outline potential future developments.

Workshops

Stephen Sinclair, James Leonard, Jérôme Villeneuve, Christian Frisson
Force feedback with open source platforms
Charlotte Magnusson, Gerhard Weber, Marcelo Wanderley
ISO standards

Concert

Peter Orins - "Having Never Written A Note For Percussion" (James Tenney)
Peter Orins explores the various sound textures one can produce with drums, searching for the ambiguity of timbres and how they are produced, for the saturation of sounds and the harmonics of each percussion instrument. He seeks to highlight both the "microscopic" sounds from prepared heads and metals brushed and struck at very low levels, and the acoustic pressure from drums played at a loud volume. In that spirit, he will perform a piece by James Tenney entitled "Having Never Written A Note For Percussion", from the "Postal Pieces" collection. This piece demonstrates the natural yet intense relation between haptics and sound as the percussionist explores the richness of sounds and responses from a cymbal.
Peter Orins and Ivann Cruz - Vibrating shapes
Peter Orins and Ivann Cruz have both been performing improvised music for more than 15 years, in various configurations and genres. In this structured improvisation, they explore the intersection of virtual and physical spaces with visual, sonic and haptic feedback. 3D shapes placed around their acoustic instruments (guitar and drums) are revealed both visually, as video projections of the intersections with the physical bodies and instruments, and sonically, as vibrations of surface speakers propagated to sounding objects.