Cheek Haptics and Other Weird Computer Interfaces from CHI 2017

All the weirdest computer interfaces from CHI 2017


The ACM CHI Conference on Human Factors in Computing Systems is taking place in Denver this week, and just like last year, it’s host to some amazing, incredible, and utterly bizarre technology demos. This year’s theme is “Explore, Innovate, Inspire,” which, as far as we can tell, has no specific meaning and therefore does nothing to constrain the weirdness that CHI is so well known for. We’ve gone through hundreds of 30-second video clips to find the most interesting and craziest stuff, and we can promise you won’t be disappointed. Today, we’re bringing you some interesting ways of interfacing with technology. Along with the videos showing off these breakthroughs, the researchers behind them describe their brainchildren in their own words. We’ll have even more videos on virtual and augmented reality and 3D printing later this week.

  1. Project Telepathy
  2. DeskWave
  3. Air Vortex Rings
  4. Hey, Wake Up!
  5. Cito
  6. Sparkle
  7. BreathScreen
  8. IllumiPaper
  9. Emotion Actuator

Project Telepathy: Targeted Verbal Communication using 3D Beamforming Speakers and Facial Electromyography

Thanks to the miracle of technology, you can now target individual people and then scream at them without anyone else hearing you, and all it takes is a minimal amount of facial electronification:

Anne-Claire Bourland, Peter Gorman, Jess McIntosh, Asier Marzo, University of Bristol, Bristol, United Kingdom

Speech is our innate way of communication. However, it has limitations. Because it’s a broadcast process, it has limited reach, and it works only in air. Here, we explore a combination of technologies that turns verbal communication on its ear. The results could be a boon for coordinating tasks or holding private conversations. For detecting words, we measure the bioelectric signals produced by facial muscles during speech. An electromyographic system composed of four surface electrodes had an accuracy of 80 percent when discriminating between 10 words. More importantly, the system was equally effective in discriminating spoken and silently mouthed words. For transferring the words, we used the sound-through-ultrasound phenomenon to generate audible sound within a narrow beam. We built a phased array of ultrasonic emitters, capable of emitting sound that can be steered electronically without physically moving the array. Two prototypes that combine detection and transfer of words are presented and their limitations analysed.
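
To make the word-detection half of the idea concrete, here’s a minimal sketch of how a four-channel EMG word classifier could be trained. The windowed RMS/zero-crossing features, the SVM, and the placeholder data are our own illustrative assumptions, not the Bristol team’s actual pipeline.

```python
# Hypothetical sketch: classifying silently mouthed words from 4-channel facial EMG.
# Assumes windowed recordings shaped (n_samples, n_channels, n_timesteps); the
# RMS/zero-crossing features and the SVM are illustrative, not the authors' pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def emg_features(windows: np.ndarray) -> np.ndarray:
    """Per-channel RMS and zero-crossing count for each recording window."""
    rms = np.sqrt((windows ** 2).mean(axis=-1))                          # (n_samples, 4)
    zero_crossings = (np.diff(np.sign(windows), axis=-1) != 0).sum(axis=-1)
    return np.hstack([rms, zero_crossings])                              # (n_samples, 8)

# Placeholder data: 200 windows of 4-channel EMG, labels for 10 candidate words.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 4, 512))
labels = rng.integers(0, 10, size=200)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, emg_features(windows), labels, cv=5)
print(f"cross-validated accuracy on placeholder data: {scores.mean():.2f}")
```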

DeskWave: Desktop Interactions using Low-cost Microwave Doppler Arrays

Microwave arrays can imbue any microwave-transparent surface with fancy multi-touch capabilities. The low-cost array can be installed under your existing desk, and works with both gestures and tangible interactions. Plus, you can use it to keep your coffee warm, although this is not recommended:

Jess McIntosh, Paul Worgan, Mike Fraser, Asier Marzo, University of Bristol, Bristol, United Kingdom

Microwaves are a type of electromagnetic radiation that can pass through a variety of commonly found materials but partially reflect off human bodies. Microwaves are non-ionizing and at controlled levels do not pose a danger. A wave that is capable of passing through materials and imaging humans could have useful applications in human-computer interaction. However, only recently has the full potential of microwaves for interactive devices begun to be explored. Here, we present a scalable, low-cost system using an array of off-the-shelf microwave Doppler sensors and explore its potential for tabletop interactions. The arrays are installed beneath a desk, making it an unobtrusive device that enables a wide range of interactions such as 3D hand tracking, gesture recognition, and different forms of tangible interaction. Given the low cost and availability of these sensors, we expect that this work will stimulate future interactive devices that employ microwave sensors.
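
As a rough illustration of how an array like this could localize a hand, here’s a minimal sketch that turns per-sensor motion readings into a coarse position estimate. The grid layout, the `read_doppler_amplitudes()` driver, and the weighted-centroid approach are assumptions for illustration, not DeskWave’s actual processing.

```python
# Hypothetical sketch: estimating a coarse hand position above a desk from a
# grid of microwave Doppler sensors. read_doppler_amplitudes() stands in for
# whatever driver actually reads the off-the-shelf sensor array.
import numpy as np

# Assumed 4 x 3 grid of sensors mounted under the desk (arbitrary units).
SENSOR_GRID = np.array([(x, y) for y in range(3) for x in range(4)], dtype=float)

def read_doppler_amplitudes() -> np.ndarray:
    """Placeholder: one motion-amplitude reading per sensor."""
    return np.random.rand(len(SENSOR_GRID))

def estimate_hand_position(amplitudes: np.ndarray) -> tuple[float, float]:
    """Centroid of sensor positions weighted by reflected motion energy."""
    weights = amplitudes / amplitudes.sum()
    x, y = (SENSOR_GRID * weights[:, None]).sum(axis=0)
    return x, y

if __name__ == "__main__":
    pos = estimate_hand_position(read_doppler_amplitudes())
    print(f"estimated hand position over desk: ({pos[0]:.2f}, {pos[1]:.2f})")
```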

Investigating Haptic Perception of and Physiological Responses to Air Vortex Rings on a User's Cheek

Vortex rings seem like a fun way of providing targeted physical notifications at range, sort of like a gentle, silent, long-distance nudge. Although, according to the video, the vortex rings are somehow powerful enough to knock humans unconscious:

Yuka Sato, Ryoko Ueoka, Kyushu University, Fukuoka, Japan

Haptic perception is one of the primary means of interaction with the world. Recent research on affective haptics suggests that it can affect emotional and behavioral responses. In this study, we evaluate user perceptions of haptic stimuli generated by air vortex rings on the cheek and investigate the effects on their physiological responses. To develop a cheek haptic display, we investigated and found that the cheek had enough resolution to perceive the differences in haptic stimuli in a two-point discrimination threshold test of the face. [In these experiments, we were able] to evaluate quantitatively the effects of four different combinations of haptic stimuli on the physiological responses in terms of stress modification, brainwave activities, task performance, and subjective assessment. The results suggest that different stimuli affect physiological responses and task performance.

Hey, Wake Up: Come Along with the Artificial Learning Companion to the e-Learner’s Outcomes High!

Learning online is tough because there's nobody there to make sure you’re awake and paying attention. This little robot-thing, which is wearing a Santa hat for some reason that I'm not sure I understand, exists only to keep you from dozing off while studying:

Hyun Young Kim, Bomyeong Kim, JeeHang Lee, Jinwoo Kim, Yonsei University, Seoul, Republic of Korea

Compared to offline learners, online learners’ attitude during the learning process is relatively poor. One common problem is loneliness, because they often study alone. This results in poor learning outcomes. Herein we present pioneering work on a co-existing, artificial learning companion capable of improving the learner’s attitude through sleepiness detection. We capture, analyze, and estimate the learner’s level of drowsiness with a machine-learning technique, using data from a pilot study. Then, we propose a prototype called LearniCube [that reacts to the learner's level of alertness and interacts with him or her].
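
For a sense of what "sleepiness detection" might look like in code, here’s a minimal sketch that maps simple webcam-style features to a drowsiness probability and a companion reaction. The features, thresholds, and logistic-regression model are illustrative assumptions, not the LearniCube pilot-study pipeline.

```python
# Hypothetical sketch: estimating an e-learner's drowsiness level from simple
# webcam-derived features (eye-closure ratio, blink rate, head pitch). The
# model and features are illustrative, not the authors' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder pilot data: each row = [eye_closure_ratio, blinks_per_min, head_pitch_deg]
rng = np.random.default_rng(1)
alert = np.column_stack([rng.uniform(0.0, 0.2, 50), rng.uniform(10, 25, 50), rng.uniform(-5, 5, 50)])
drowsy = np.column_stack([rng.uniform(0.3, 0.8, 50), rng.uniform(2, 10, 50), rng.uniform(10, 30, 50)])
X = np.vstack([alert, drowsy])
y = np.array([0] * 50 + [1] * 50)   # 0 = alert, 1 = drowsy

model = LogisticRegression(max_iter=1000).fit(X, y)

def companion_reaction(features: np.ndarray) -> str:
    """Map the estimated drowsiness probability to a companion behaviour."""
    p_drowsy = model.predict_proba(features.reshape(1, -1))[0, 1]
    if p_drowsy > 0.7:
        return "wave arms and say: Hey, wake up!"
    if p_drowsy > 0.4:
        return "play a short encouraging sound"
    return "stay quiet"

print(companion_reaction(np.array([0.5, 4.0, 20.0])))
```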

Cito: An Actuated Smartwatch for Extended Interactions

Due to their small size and limitations in form factor, smartwatches are often frustrating to use. Cito is an exploration of how giving a smartwatch the ability to move around can significantly improve its usability:

Jun Gong, Lan Li, Daniel Vogel, Xing-Dong Yang, Dartmouth College, Hanover, NH

We propose and explore actuating a smartwatch face [so that it will position itself to be seen no matter what the user is doing]. Five face movements are defined: rotation, hinging, translation, rising, and orbiting. These movements are incorporated into interaction techniques to address limitations of a fixed watch face. For example, the watch face will move itself from behind the cuff of a long sleeve shirt or flip up to reveal itself if the wearer is carrying something that prevents him or her from rotating the wrist so the screen is face up. We present Cito, a high-fidelity proof-of-concept hardware prototype that investigates technical challenges.
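
The interesting part is deciding which of the five movements to use in a given situation. Here’s a minimal sketch of that kind of decision logic; the context flags and rules are our own hypothetical simplification, not the logic of the Cito prototype itself.

```python
# Hypothetical sketch: choosing one of Cito's five face movements (rotation,
# hinging, translation, rising, orbiting) from simple context flags. The flags
# and rules are illustrative, not the paper's actual interaction techniques.
from dataclasses import dataclass

@dataclass
class WatchContext:
    sleeve_covering_face: bool   # cuff occludes the screen
    wrist_rotatable: bool        # user can turn the wrist toward the eyes
    hands_full: bool             # user is carrying something

def choose_face_movement(ctx: WatchContext) -> str:
    if ctx.sleeve_covering_face:
        return "translation"   # slide out from under the cuff
    if ctx.hands_full and not ctx.wrist_rotatable:
        return "hinging"       # flip the face up toward the user
    if not ctx.wrist_rotatable:
        return "rotation"      # spin the face so content stays upright
    return "none"              # fixed face is fine

print(choose_face_movement(WatchContext(False, False, True)))  # -> "hinging"
```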

Sparkle: Hover Feedback with Touchable Electric Arcs

There aren't very many ways to generate tactile feedback in midair so you can get the feel of a touchscreen without actual contact. Sparkle sends electric arcs between a surface and your finger, providing little targeted zaps when a finger hovers nearby:

Daniel Spelmezan, Deepak Ranjan Sahoo, Sriram Subramanian, University of Sussex, Brighton, United Kingdom

Many finger sensing input devices now support proximity input, enabling users to perform in-air gestures. While near-surface interactions increase the input vocabulary, they lack tactile feedback, making it hard for users to perform gestures or to know when the interaction takes place. Sparkle is the first step towards providing a new type of hover feedback, and it does not require users to wear tactile stimulators. Sparkle’s electric arcs deliver in-air tactile or thermal feedback that is sharper and more feelable than acoustic mid-air haptic devices. We combine this technology with infrared proximity sensing in two proof-of-concept devices with form factor and functionality similar to a button and a touchpad.
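
Conceptually, the control loop is simple: poll the proximity sensor and fire a brief arc whenever a finger sits in the hover band. Here’s a minimal sketch of that loop; the driver functions, distances, and pulse length are placeholders, not Sparkle’s actual hardware interface or parameters.

```python
# Hypothetical sketch: a hover-feedback loop that fires a short electric-arc
# pulse when an infrared proximity sensor sees a finger inside the hover band.
# read_proximity_mm() and fire_arc_pulse() stand in for real device drivers.
import random
import time

HOVER_BAND_MM = (5.0, 30.0)   # fire feedback when the finger is in this range
PULSE_MS = 2                  # assumed very short, low-energy arc pulse

def read_proximity_mm() -> float:
    """Placeholder for the IR proximity sensor driver."""
    return random.uniform(0.0, 60.0)

def fire_arc_pulse(duration_ms: int) -> None:
    """Placeholder for the high-voltage driver that generates the arc."""
    print(f"zap! ({duration_ms} ms arc pulse)")

def hover_feedback_loop(iterations: int = 10) -> None:
    for _ in range(iterations):
        distance = read_proximity_mm()
        if HOVER_BAND_MM[0] <= distance <= HOVER_BAND_MM[1]:
            fire_arc_pulse(PULSE_MS)
        time.sleep(0.05)      # ~20 Hz polling

hover_feedback_loop()
```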

BreathScreen – Design and Evaluation of an Ephemeral UI

I'm honestly not sure why the world needs a vape display, but here's one anyway. It'll also work if it's cold enough outdoors, and I will admit that there's something to be said for being able to create a relatively large display on the fly that disappears in seconds:

Ismo Alakärppä, Elisa Jaakkola, Ashley Colley, Jonna Häkkilä, University of Lapland, Rovaniemi, Finland

We present BreathScreen, a concept where clouds created by breathing are used as a projection surface for a picoprojector, creating an ephemeral user interface. In cold weather conditions, the clouds are created naturally by warm breath condensing. But in other conditions, an electric vaporizer may be used. We present an initial evaluation of the concept in a user study (n = 8), utilising a vaporizer-based BreathScreen prototype. The concept was positively received by study participants as a natural, hands-free interface and considered magical and aesthetically beautiful. Additionally, we provide guidance on the quantity of content that may be displayed on a BreathScreen, which is limited both by the length of a human breath and the contrast of the system.

IllumiPaper: Illuminated Interactive Paper

IllumiPaper is something in between paper and a digital display. The idea is that you can add simple illuminated elements to specific areas of standard pieces of paper, making them interactive:

Konstantin Klamka, Raimund Dachselt, Technische Universität Dresden, Dresden, Germany

Due to their simplicity and flexibility, digital pen-and-paper solutions have a promising potential to become a part of our daily work. Unfortunately, they lack dynamic visual feedback and thereby restrain advanced digital functionalities. In this paper, we investigate new forms of paper-integrated feedback, which build on emerging paper-based electronics and novel thin-film display technologies. Our approach focuses on illuminated elements, which are seamlessly integrated into standard paper. For that, we introduce an extended design space for paper-integrated illuminations. Furthermore, we contribute a fully-functional research platform including a paper-controller, digital pen, and illuminated, digitally controlled papers that demonstrate the feasibility of our techniques. 
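
To give a feel for how a paper-controller might drive those illuminated elements, here’s a minimal sketch that lights a printed segment when the digital pen touches the matching region. The segment names, the `set_segment()` call, and the event model are hypothetical stand-ins, not the IllumiPaper platform’s API.

```python
# Hypothetical sketch: a paper-controller that lights up printed feedback
# segments when the digital pen marks a region. Segment names and the
# set_segment() driver call are illustrative, not the IllumiPaper API.
class PaperController:
    def __init__(self, segments: list[str]):
        self.state = {name: False for name in segments}

    def set_segment(self, name: str, on: bool) -> None:
        """Placeholder for driving one illuminated segment on the sheet."""
        self.state[name] = on
        print(f"segment '{name}' -> {'on' if on else 'off'}")

    def on_pen_event(self, region: str) -> None:
        """Light the segment associated with the region the pen just touched."""
        if region in self.state:
            self.set_segment(region, True)

sheet = PaperController(["checkbox_done", "margin_alert", "page_footer"])
sheet.on_pen_event("checkbox_done")   # pen ticks a box; the printed check glows
```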

Emotion Actuator: Embodied Emotional Feedback through Electroencephalography and Electrical Muscle Stimulation

If you have trouble communicating your emotions to other people, this system will read them directly from your brain via an EEG and then use a muscle stimulation system to force another person to make a physical gesture that expresses your emotion for you: 

Mariam Hassib, Max Pfeiffer, Stefan Schneegass, Michael Rohs, Florian Alt, University of Munich (LMU), University of Stuttgart, University of Hannover, University of Münster, Germany

The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states (amused, sad, angry, and neutral) in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. 
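
The end-to-end pipeline boils down to classifying the sender’s state and triggering a matching gesture on the recipient. Here’s a minimal sketch of that relay; the classifier, the EMS driver, and the gesture mapping are placeholders we made up for illustration, only the four emotional states come from the paper.

```python
# Hypothetical sketch: relaying a sender's classified emotional state to a
# recipient as an electrical-muscle-stimulation gesture. classify_emotion()
# and trigger_ems_gesture() are stand-ins; the four states match the paper,
# but the gesture mapping here is illustrative.
import random

EMOTION_TO_GESTURE = {
    "amused": "raise both forearms",
    "sad": "slump shoulders forward",
    "angry": "clench fist",
    "neutral": "rest arms at sides",
}

def classify_emotion(eeg_window=None) -> str:
    """Placeholder for the sender-side EEG emotion classifier."""
    return random.choice(list(EMOTION_TO_GESTURE))

def trigger_ems_gesture(gesture: str) -> None:
    """Placeholder for the recipient-side EMS driver."""
    print(f"EMS: actuating '{gesture}'")

def relay_emotion(eeg_window=None) -> None:
    state = classify_emotion(eeg_window)
    trigger_ems_gesture(EMOTION_TO_GESTURE[state])

relay_emotion()
```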
