Immersive AR/VR demands unprecedented eye-tracking performance. Eye tracking should be precise and accurate, and it should work all the time, for every person, in any environment. Developing robust eye-tracking solutions that meet these criteria requires large volumes of accurate eye-gaze data. Capturing such data requires a highly sophisticated setup, and even then, accuracy is limited by users' fixation ability and cooperation. These issues place practical limits on the amount and quality of training data that can be collected.
We developed this challenge because augmented and virtual reality platforms are constrained by the power budget available for capturing accurate gaze labels. By posing well-defined problems to the machine learning and computer vision communities, we can engage them in developing eye-tracking solutions that work for everyone, everywhere, all the time.
In the absence of accurate gaze labels, Facebook Reality Labs aims to advance the state of the art in eye tracking that works for everyone, all the time, and under all illumination conditions. To that end, we have carefully designed two challenges that combine human annotation of eye features with unlabeled data. The first challenge targets the design of a low-complexity solution for segmenting key eye regions: the sclera, the iris, the pupil, and everything else (background). The second challenge focuses on generating realistic eye images.
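To make the first task concrete: a segmentation model assigns one of the four region labels to every pixel of an eye image. A common way to score such predictions is mean intersection-over-union (mIoU) across classes; note that the metric choice here is an illustrative assumption, not a statement of the official evaluation criteria, which are defined on the challenge pages. A minimal sketch:

```python
import numpy as np

# Hypothetical label encoding for the four regions described above:
# 0 = background, 1 = sclera, 2 = iris, 3 = pupil.
NUM_CLASSES = 4

def mean_iou(pred: np.ndarray, truth: np.ndarray,
             num_classes: int = NUM_CLASSES) -> float:
    """Mean intersection-over-union over classes present in either mask."""
    ious = []
    for c in range(num_classes):
        pred_c = pred == c
        truth_c = truth == c
        union = np.logical_or(pred_c, truth_c).sum()
        if union == 0:
            continue  # class absent from both masks; skip it
        inter = np.logical_and(pred_c, truth_c).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

# Tiny toy masks: a perfect prediction scores exactly 1.0.
truth = np.array([[0, 0, 1],
                  [1, 2, 3]])
assert mean_iou(truth.copy(), truth) == 1.0
```

Any pixels the model mislabels shrink the intersection (and often grow the union) for the affected classes, pulling the score below 1.0.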
Participants are encouraged to join either or both of these challenges. The submissions will be evaluated according to criteria outlined on the Semantic Segmentation Challenge page and the Synthetic Eye Generation Challenge page.
The challenge will run from May 1 through September 15, 2019. The top three winners in each track will receive a cash prize as follows:
- 1st place winner, $5,000
- 2nd place winner, $3,000
- 3rd place winner, $2,000
In addition, the 1st place winner of each challenge will be invited (with all travel expenses covered) to present their work at the OpenEDS Workshop on Eye Tracking for VR and AR at ICCV 2019.
For information regarding rules, how to participate, and more, visit the OpenEDS Challenge page.