Gaze and Hand Multimodal Interaction

Gaze not only involves coordination of eye, head and body to direct visual attention, but is also closely linked to the movements we perform to manipulate our environment. We use our gaze to sample the information required for action just in time, for example by fixating an object before we reach for it, with the eyes leading the hand. This close coordination of eye and hand is also fundamental to direct manipulation of user interfaces and virtual worlds.

Gaze has been widely explored as an alternative to manual input, as it functions as a natural pointer. In comparison, gaze is faster and requires less effort for pointing at objects, while hand movement is more deliberate and expressive. In our work we are exploring how the two modalities can be combined in ways that build on natural eye-hand coordination.

In prior work we have shown how gaze can extend direct touch interaction, leveraging the relative strengths of the modalities with a principle of “gaze selects, touch manipulates” [1]. Gaze naturally looks ahead to objects, and manual input is seamlessly redirected to where the user is looking, without breaking any of the familiar styles of multi-touch input. Gaze is effective in extending manual reach and natural for modulating between direct and indirect modes of manual input, depending on whether we focus on what our hands do or look ahead to objects we aim to manipulate [2]. These concepts also extend to 3D, where they are particularly effective for bridging between interaction close up and at a distance [3].
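
As a concrete illustration, the following minimal sketch in Python captures the “gaze selects, touch manipulates” principle under simple assumptions: the rectangular Widget class, the 2D gaze estimate and the event handler names are all illustrative, not the implementation from [1, 2].

    from dataclasses import dataclass

    @dataclass
    class Widget:
        """A rectangular on-screen object that can be hit-tested and moved."""
        x: float
        y: float
        w: float
        h: float

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

        def translate(self, dx: float, dy: float) -> None:
            self.x += dx
            self.y += dy

    class GazeTouchController:
        """Gaze selects, touch manipulates: touch-down acquires the object
        at the gaze point, and subsequent finger motion is redirected to
        that object rather than to whatever lies under the finger."""

        def __init__(self, widgets: list[Widget]):
            self.widgets = widgets
            self.gaze = (0.0, 0.0)   # latest estimate from the eye tracker
            self.target = None       # object selected by gaze on touch-down
            self.last_touch = None

        def on_gaze(self, x: float, y: float) -> None:
            self.gaze = (x, y)

        def on_touch_down(self, x: float, y: float) -> None:
            # Selection is driven by gaze: hit-test at the gaze point,
            # not at the point where the finger lands.
            gx, gy = self.gaze
            self.target = next((w for w in reversed(self.widgets)
                                if w.contains(gx, gy)), None)
            self.last_touch = (x, y)

        def on_touch_move(self, x: float, y: float) -> None:
            # Manipulation stays with the hand: relative finger motion is
            # applied to the gaze-selected object, so input is direct when
            # the user looks at their finger and indirect when they look away.
            if self.target is not None and self.last_touch is not None:
                lx, ly = self.last_touch
                self.target.translate(x - lx, y - ly)
            self.last_touch = (x, y)

        def on_touch_up(self) -> None:
            self.target = None
            self.last_touch = None

The touch-down handler is also where a fuller system would modulate between direct and indirect input, for example by comparing the gaze point with the touch point before deciding where to dispatch the manipulation.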

We are also studying gaze in the context of manual pointing. When gaze and hand are both used for pointing, targets can be pre-selected by one modality and confirmed by alignment with the other [4]. This is significant for avoiding the Midas Touch problem in touchless interaction, as neither gaze nor mid-air gesture triggers input unless the two become aligned.
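
A minimal sketch of the alignment test follows, assuming point targets, (origin, normalized direction) ray pairs and a crude cone cast in place of real ray casting; the 3° threshold and all names are illustrative assumptions rather than details from [4].

    import math

    Vec3 = tuple[float, float, float]

    def angular_hit(origin: Vec3, direction: Vec3, targets: list[Vec3],
                    max_angle_deg: float = 3.0) -> Vec3 | None:
        """Return the target closest in angle to the ray, if it falls
        inside a small cone around the (normalized) ray direction."""
        best, best_angle = None, max_angle_deg
        for t in targets:
            to_t = tuple(t[i] - origin[i] for i in range(3))
            dist = math.sqrt(sum(c * c for c in to_t)) or 1e-9
            cos = sum(direction[i] * to_t[i] for i in range(3)) / dist
            angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
            if angle < best_angle:
                best, best_angle = t, angle
        return best

    def aligned_selection(gaze_ray, hand_ray, targets: list[Vec3]) -> Vec3 | None:
        """A target pre-selected by one ray is confirmed only when the
        other ray resolves to the same target; otherwise nothing fires,
        which is what prevents Midas Touch activations."""
        gaze_target = angular_hit(*gaze_ray, targets)
        hand_target = angular_hit(*hand_ray, targets)
        if gaze_target is not None and gaze_target is hand_target:
            return gaze_target
        return None

    # Both rays point at the target at (0, 0, 2), so it is confirmed.
    targets = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]
    gaze_ray = ((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
    hand_ray = ((0.2, -0.3, 0.0), (-0.0984, 0.1476, 0.9842))
    print(aligned_selection(gaze_ray, hand_ray, targets))  # (0.0, 0.0, 2.0)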

  1. Gaze-Touch: Combining Gaze with Multi-Touch for Interaction on the Same Surface
    Ken Pfeuffer, Jason Alexander, Ming Ki Chong and Hans Gellersen
    UIST '14: ACM Symposium on User Interface Software and Technology

  2. Gaze-Shifting: Direct-Indirect Input with Pen and Touch Modulated by Gaze
    Ken Pfeuffer, Jason Alexander, Ming Ki Chong, Yanxia Zhang and Hans Gellersen
    UIST '15: ACM Symposium on User Interface Software and Technology

  3. Gaze + Pinch Interaction in Virtual Reality
    Ken Pfeuffer, Benedikt Mayer, Diako Mardanbegi and Hans Gellersen
    SUI '17: ACM Symposium on Spatial User Interaction

  4. Gaze-Hand Alignment: Combining Eye Gaze and Mid-Air Pointing for Interacting with Menus in Augmented Reality
    Mathias Lystbæk, Peter Rosenberg, Ken Pfeuffer, Jens Emil Grønbæk and Hans Gellersen
    ETRA '22: ACM Symposium on Eye Tracking Research and Applications