
July 7, 2019

Affective touch communication in close adult relationships

IEEE World Haptics Conference

Inter-personal touch is a powerful aspect of social interaction that we expect to be particularly important for emotional communication. We studied the capacity of closely acquainted humans to signal the meaning of several word cues (e.g. gratitude, sadness) using touch sensation alone.

By: Sarah McIntyre, Athanasia Moungou, Rebecca Boehme, Peder M. Isager, Frances Lau, Ali Israr, Ellen A. Lumpkin, Freddy Abnousi, Håkan Olausson
Areas: AR/VR

July 7, 2019

Uncovering Human-to-Human Physical Interactions that Underlie Emotional and Affective Touch Communication

IEEE World Haptics Conference

Couples often communicate their emotions, e.g., love or sadness, through physical expressions of touch. Prior efforts have relied on visual observation to distinguish emotional touch communications by gestures tied to hand contact, velocity, and position. The work herein describes an automated approach to eliciting the essential features of these gestures.

By: Steven C. Hauser, Sarah McIntyre, Ali Israr, Håkan Olausson, Gregory J. Gerling
Areas: AR/VR

July 7, 2019

From Human-to-Human Touch to Peripheral Nerve Responses

IEEE World Haptics Conference

Human-to-human touch conveys rich, meaningful social and emotional sentiment. At present, however, we understand neither the physical attributes that underlie such touch, nor how the attributes evoke responses in unique types of peripheral afferents. Indeed, nearly all electrophysiological studies use well-controlled but non-ecological stimuli. Here, we develop motion tracking and algorithms to quantify physical attributes – indentation depth, shear velocity, contact area, and distance to the cutaneous sensory space (receptive field) of the afferent – underlying human-to-human touch.

By: Steven C. Hauser, Saad S. Nagi, Sarah McIntyre, Ali Israr, Håkan Olausson, Gregory J. Gerling
Areas: AR/VR
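The attribute extraction described in the entry above is essentially geometric. The sketch below is a minimal illustration with hypothetical inputs (a tracked contact-point trajectory, a locally planar skin surface, and a known receptive-field center); it is not the authors' pipeline, and it omits contact area, which requires more than a single tracked point.

    import numpy as np

    def touch_attributes(points, times, skin_origin, skin_normal, rf_center):
        """Illustrative attribute extraction from a tracked contact trajectory.
        points: (T, 3) contact positions; times: (T,) timestamps in seconds;
        skin_origin / skin_normal: a point on, and outward unit normal of, the resting skin plane;
        rf_center: 3D location of the afferent's receptive-field center (all hypothetical inputs)."""
        rel = points - skin_origin                        # positions relative to the skin plane
        depth = np.clip(-(rel @ skin_normal), 0.0, None)  # indentation depth: penetration along -normal
        vel = np.gradient(points, times, axis=0)          # finite-difference velocity, shape (T, 3)
        normal_vel = (vel @ skin_normal)[:, None] * skin_normal
        shear_velocity = np.linalg.norm(vel - normal_vel, axis=1)  # tangential (shear) speed
        rf_distance = np.linalg.norm(points - rf_center, axis=1)   # distance to receptive-field center
        return depth, shear_velocity, rf_distance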

July 7, 2019

A Compact Skin-Shear Device using a Lead-Screw Mechanism

IEEE World Haptics Conference

We present a skin-shear actuator based on a lead-screw mechanism. The lead-screw mechanism is simple and reliable, requires few components, and fits into compact form factors. We show the mechanical design of a single assembly unit and implement multiple units in a single handheld device. We evaluate the actuator in one instrumentation-based test and one preliminary user study.

By: Pratheev Sreetharan, Ali Israr, Priyanshu Agarwal
Areas: AR/VR
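Since the entry above rests on lead-screw kinematics, the standard relation (basic mechanics, not taken from the paper) may be useful: a screw with lead \ell advances its follower by \ell per revolution, so the shear displacement and velocity are

    x(t) = \frac{\ell}{2\pi}\,\theta(t), \qquad \dot{x}(t) = \frac{\ell}{2\pi}\,\dot{\theta}(t),

where \theta is the motor shaft angle. A small lead therefore trades motor speed for finer displacement resolution and higher output force, which is presumably what makes the mechanism attractive for a compact skin-shear actuator.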

June 27, 2019

Sensor Modeling and Benchmarking — A Platform for Sensor and Computer Vision Algorithm Co-Optimization

International Image Sensor Workshop

We predict that applications in AR/VR devices [1] and intelligent devices will lead to the emergence of a new class of image sensors — machine perception CMOS image sensors (MPCIS). This new class of sensors will produce images and videos optimized primarily for machine vision applications rather than for human consumption.

By: Andrew Berkovich, Chiao Liu

June 16, 2019

Self-Supervised Adaptation of High-Fidelity Face Models for Monocular Performance Tracking

Conference on Computer Vision and Pattern Recognition (CVPR)

Improvements in data-capture and face modeling techniques have enabled us to create high-fidelity, realistic face models. However, driving these realistic face models requires special input data, e.g., 3D meshes and unwrapped textures. Moreover, these face models expect clean input data captured in controlled lab environments, which is very different from data collected in the wild. All of these constraints make it challenging to use high-fidelity models for tracking with commodity cameras. In this paper, we propose a self-supervised domain adaptation approach to enable the animation of high-fidelity face models from a commodity camera.

By: Jae Shin Yoon, Takaaki Shiratori, Shoou-I Yu, Hyun Soo Park

June 14, 2019

2.5D Visual Sound

Conference on Computer Vision and Pattern Recognition (CVPR)

Binaural audio provides a listener with 3D sound sensation, allowing a rich perceptual experience of the scene. However, binaural recordings are scarcely available and require nontrivial expertise and equipment to obtain. We propose to convert common monaural audio into binaural audio by leveraging video.

By: Ruohan Gao, Kristen Grauman
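One common way to frame this mono-to-binaural conversion (and the reading suggested by the "2.5D" title, though the abstract does not spell it out) is to let a video-conditioned network predict the left-right difference signal and then recombine it with the mono mix. The recombination step is sketched below with placeholder arrays; the model that predicts the difference from video frames is assumed and not shown.

    import numpy as np

    def recombine_binaural(mono, diff):
        """Recover left/right channels from a mono mix and a predicted difference signal.
        Assumes mono = L + R and that an (unshown) video-conditioned model predicts diff ~ L - R."""
        left = 0.5 * (mono + diff)
        right = 0.5 * (mono - diff)
        return np.stack([left, right], axis=0)   # (2, num_samples) binaural waveform

    # Placeholder usage: `predicted_diff` would come from the video-conditioned model.
    mono = np.zeros(48000, dtype=np.float32)     # one second of silence as a stand-in waveform
    predicted_diff = np.zeros_like(mono)
    binaural = recombine_binaural(mono, predicted_diff)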

May 31, 2019

Soundfield Reconstruction in Reverberant Environments Using Higher-Order Microphones and Impulse Response Measurements

IEEE International Conference on Acoustics, Speech and Signal Processing

This paper addresses the problem of soundfield reconstruction over a large area using a distributed array of higher-order microphones. Given an area enclosed by the array, one can distinguish between two components of the soundfield: the interior soundfield generated by sources outside of the enclosed area and the exterior soundfield generated by sources inside the enclosed area.

By: Federico Borra, Israel Dejene Gebru, Dejan Markovic
Areas: AR/VR
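For context, the interior/exterior split described above is usually expressed through a spherical-harmonic expansion of the pressure field; the formula below is the standard textbook decomposition, not necessarily the paper's exact notation:

    p(r, \theta, \phi, k) = \sum_{n=0}^{\infty} \sum_{m=-n}^{n} \left[ \alpha_{nm}(k)\, j_n(kr) + \beta_{nm}(k)\, h_n(kr) \right] Y_{nm}(\theta, \phi),

where j_n are spherical Bessel functions carrying the interior field (sources outside the region), h_n are spherical Hankel functions carrying the exterior field (sources inside), and Y_{nm} are spherical harmonics. Reconstruction over the area then amounts to estimating the coefficients \alpha_{nm} and \beta_{nm} from the higher-order microphone measurements.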

May 17, 2019

Psychophysical Evaluation of Persistence- and Frequency-Limited Displays for Virtual and Augmented Reality

SID Display Week

Little is known about user sensitivity to flicker and eye movement-induced ghosting at refresh rates and persistence levels relevant for head-mounted displays (HMDs). In this report, we describe a pair of psychophysical experiments that we performed to comprehensively quantify these artifacts and make general recommendations for HMD design.

By: T. Scott Murdison, Christopher McIntosh, James Hillis, Kevin J. MacKenzie

May 10, 2019

Touch with Foreign Hands: The Effect of Virtual Hand Appearance on Visual-Haptic Integration

ACM Symposium on Applied Perception (SAP)

Hand tracking and haptics are gaining importance as key technologies of virtual reality (VR) systems. For the design of such systems, it is fundamental to understand how the appearance of virtual hands influences user experience and how the human brain integrates vision and haptics.

By: Valentin Schwind, Lorraine Lin, Massimiliano Di Luca, Sophie Jörg, James Hillis
Areas: AR/VR