From Human-to-Human Touch to Peripheral Nerve Responses

IEEE World Haptics Conference


Human-to-human touch conveys rich, meaningful social and emotional sentiment. At present, however, we understand neither the physical attributes that underlie such touch, nor how those attributes evoke responses in distinct types of peripheral afferents. Indeed, nearly all electrophysiological studies use well-controlled but non-ecological stimuli. Here, we develop motion tracking and algorithms to quantify the physical attributes – indentation depth, shear velocity, contact area, and distance to the cutaneous sensory space (receptive field) of the afferent – underlying human-to-human touch. In particular, 2-D video of the scene is combined with 3-D stereo infrared video of the toucher’s hand to measure contact interactions local to the receptive field of the receiver’s afferent. The combined and algorithmically corrected measurements improve accuracy, especially for occluded and misidentified fingers. Human-subjects experiments track a toucher performing four gestures – single-finger tapping, multi-finger tapping, multi-finger stroking, and whole-hand holding – while action potentials are recorded from a first-order afferent of the receiver. A case study with one rapidly adapting (Pacinian) afferent and one C-tactile afferent examines temporal ties between gestures and elicited action potentials. The results indicate this method holds promise for determining the roles of distinct afferent types in encoding social and emotional touch attributes during their naturalistic delivery.
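As a rough illustration of the kind of per-frame quantities the tracking pipeline produces, the sketch below computes two of the attributes named above – shear velocity and distance to the receptive field – from a tracked fingertip trajectory. All names, units, and the assumption that the skin surface lies in the x-y plane are illustrative, not the authors' actual implementation.

```python
import math

def contact_attributes(positions, times, rf_center):
    """Per-frame shear speed and distance to the receptive field.

    positions : list of (x, y, z) fingertip positions in mm
                (skin surface assumed parallel to the x-y plane)
    times     : list of timestamps in seconds, same length as positions
    rf_center : (x, y, z) center of the afferent's receptive field in mm

    Returns a list of (shear_speed_mm_per_s, distance_to_rf_mm)
    for each frame after the first.
    """
    out = []
    for i in range(1, len(positions)):
        dt = times[i] - times[i - 1]
        # Tangential (in-plane) displacement only: the z component is
        # associated with indentation, not shear.
        dx = positions[i][0] - positions[i - 1][0]
        dy = positions[i][1] - positions[i - 1][1]
        shear_speed = math.hypot(dx, dy) / dt
        # Euclidean distance from the current contact point to the
        # receptive-field center.
        dist_rf = math.dist(positions[i], rf_center)
        out.append((shear_speed, dist_rf))
    return out

# Example: a fingertip moving 1 mm in-plane over 10 ms toward the
# receptive-field center yields a shear speed of 100 mm/s.
attrs = contact_attributes(
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    [0.0, 0.01],
    (1.0, 0.0, 0.0),
)
```

Separating the in-plane and normal components of motion is what lets stroking (high shear, shallow indentation) be distinguished from tapping (low shear, transient indentation) on a per-frame basis.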

