Real-time Head-based Deep-learning Model for Gaze Probability Regions in Collaborative VR
Riccardo Bovo, Daniele Giunchi, Ludwig Sidenmark, Enrico Costanza, Hans Gellersen, Thomas Heinis
ETRA ’22: ACM Symposium on Eye Tracking Research and Applications
Eye behavior has gained much interest in the VR research community as an interactive input and a support for collaboration. Researchers have used head behavior and saliency to implement gaze inference models when eye tracking is unavailable. However, these solutions are resource-demanding and thus unfit for untethered devices, and their angular accuracy is around 7°, which can be a problem in high-density informative areas. To address this issue, we propose a lightweight deep learning model that generates the probability density function of the gaze as a percentile contour. This solution allows us to introduce a visual attention representation based on a region rather than a point, managing the trade-off between the ambiguity of a region and the error of a point. We tested our model on untethered devices with real-time performance and evaluated its accuracy, which outperformed our identified baselines (average fixation map and head direction).
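The paper itself details the model; as a rough illustration of the "percentile contour" idea, the sketch below computes the smallest region of a discretized gaze probability density that contains a given probability mass (a highest-density region). The `percentile_region` helper, the grid, and the isotropic Gaussian PDF are all illustrative assumptions, not the paper's implementation; the 7° spread merely echoes the head-only accuracy figure mentioned above.

```python
import numpy as np

def percentile_region(pdf, p=0.68):
    """Return a boolean mask over the grid marking the smallest region
    (highest-density cells) whose total probability mass reaches p.
    `pdf` is a 2D array of cell probabilities summing to 1."""
    flat = np.sort(pdf.ravel())[::-1]          # densities, descending
    csum = np.cumsum(flat)                     # running probability mass
    thresh = flat[np.searchsorted(csum, p)]    # density cutoff for mass p
    return pdf >= thresh

# Illustrative gaze PDF: isotropic 2D Gaussian over gaze angles (degrees).
xs = np.linspace(-30, 30, 241)
X, Y = np.meshgrid(xs, xs)
sigma = 7.0  # assumption: spread comparable to head-only accuracy
pdf = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
pdf /= pdf.sum()

mask = percentile_region(pdf, p=0.68)
mass = pdf[mask].sum()  # slightly above 0.68 due to discretization
```

The mask can then be drawn as a contour around the probable gaze target, giving collaborators a region-based attention cue instead of a single (possibly wrong) gaze point.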
Riccardo Bovo, Daniele Giunchi, Ludwig Sidenmark, Enrico Costanza, Hans Gellersen, and Thomas Heinis. 2022. Real-time head-based deep-learning model for gaze probability regions in collaborative VR. In ETRA ’22: ACM Symposium on Eye Tracking Research and Applications, June 08–11, 2022, Seattle, WA. ACM, New York, NY, USA, 8 pages.
BibTeX
@inproceedings{10.1145/3517031.3529642,
  author    = {Bovo, Riccardo and Giunchi, Daniele and Sidenmark, Ludwig and Costanza, Enrico and Gellersen, Hans and Heinis, Thomas},
  title     = {Real-time head-based deep-learning model for gaze probability regions in collaborative VR},
  year      = {2022},
  isbn      = {9781450392525},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3517031.3529642},
  doi       = {10.1145/3517031.3529642},
  booktitle = {ACM Symposium on Eye Tracking Research and Applications},
  numpages  = {8},
  keywords  = {eye movement, gaze depth estimation, eye tracking, fixation depth, 3D gaze estimation, VOR},
  location  = {Seattle, Washington},
  series    = {ETRA '22}
}