Human Hair Inverse Rendering using Multi-View Photometric Data

The Eurographics Symposium on Rendering (EGSR)


We introduce a hair inverse rendering framework that reconstructs high-fidelity 3D geometry of human hair, as well as its reflectance, which can be readily used for photorealistic rendering. We take multi-view photometric data as input, i.e., a set of images captured from various viewpoints under different lighting conditions. Our method consists of two stages. First, we propose a novel solution for line-based multi-view stereo that yields accurate hair geometry from multi-view photometric data. Specifically, we introduce a per-pixel lightcode to efficiently solve the hair correspondence matching problem. Our solution enables accurate and dense strand reconstruction from fewer cameras than state-of-the-art methods require. In the second stage, we estimate hair reflectance properties from the same multi-view photometric data, using a simplified BSDF model of hair strands for realistic appearance reproduction. Based on the reconstructed 3D strand geometry, we fit the longitudinal roughness and estimate the single-strand color. We show that our method faithfully reproduces the appearance of human hair and adds realism to digital humans, and we demonstrate its accuracy and efficiency on photorealistic synthetic hair rendering data.
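The abstract does not spell out how the per-pixel lightcode resolves correspondence ambiguity, so the following is only a minimal sketch of the general idea: each pixel is assigned a binary signature recording under which lighting conditions it appears lit, and pixels are matched across views by comparing signatures (here, by Hamming distance along the epipolar line). The thresholding scheme, the function names, and the candidate-selection step are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def lightcodes(images, threshold=0.1):
    """Build per-pixel binary lightcodes from one view imaged under
    L different lighting conditions (a hypothetical simplification).

    images: array of shape (L, H, W) with normalized intensities.
    Returns a boolean array of shape (H, W, L): bit l is set when
    the pixel appears lit under lighting condition l.
    """
    return np.transpose(images > threshold, (1, 2, 0))

def match_along_epipolar(code_a, codes_b):
    """Match a query pixel's lightcode against candidate pixels on
    its epipolar line in another view by minimum Hamming distance.

    code_a:  (L,) boolean lightcode of the query pixel.
    codes_b: (N, L) boolean lightcodes of N epipolar candidates.
    Returns the index of the best-matching candidate.
    """
    hamming = np.count_nonzero(codes_b != code_a, axis=1)
    return int(np.argmin(hamming))
```

With many lighting conditions, the code becomes highly discriminative even for visually repetitive thin structures like hair strands, which is what makes dense matching feasible from few cameras.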

Related Publications

All Publications

ISMAR - July 29, 2021

Instant Visual Odometry Initialization for Mobile AR

Alejo Concha, Michael Burri, Jesus Briales, Christian Forster, Luc Oth

ICSA - November 6, 2019

Auralization systems for simulation of augmented reality experiences in virtual environments

Peter Dodds, Sebastià V. Amengual Garí, W. Owen Brimijoin, Philip W. Robinson

Journal of the Audio Engineering Society - July 20, 2021

Six-Degrees-of-Freedom Parametric Spatial Audio Based on One Monaural Room Impulse Response

Johannes M. Arend, Sebastià V. Amengual Garí, Carl Schissler, Florian Klein, Philip W. Robinson

ACM Transactions on Applied Perception Journal (ACM TAP) - September 16, 2021

Evaluating Grasping Visualizations and Control Modes in a VR Game

Alex Adkins, Lorraine Lin, Aline Normoyle, Ryan Canales, Yuting Ye, Sophie Jörg
