Gaze and Eye Movement in Interaction
Where we look not only reflects our information needs but also guides our movements and actions in the world. Because gaze is so central to our interactions, it has been studied in human-computer interaction (HCI) for as long as we have had modern computer interfaces. However, decades of research have treated eye movement as an isolated input, separate from the other movements of the body that we use to interact with computers.
In the GEMINI project, we are rethinking gaze and eye movement for HCI. We recognise that gaze is a coordinated effort of eye, head and body, and that the eyes continuously interact with other parts of the body as we direct our visual attention and as we navigate and manipulate our environment. Our goal is to design multimodal interfaces that better reflect this interplay of gaze and movement, and that let users interact more naturally in extended reality, using their eyes and body in concert.
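To make the eye-head coordination concrete: in extended reality, the gaze direction an application ultimately acts on is a composition of the tracked head pose and the eye-in-head direction reported by an eye tracker. The sketch below illustrates this composition under simple assumptions (it is not GEMINI's implementation, and all names are hypothetical); the head frame is taken to be right-handed with forward along +z.

```python
# Hypothetical sketch: combining head pose and eye-in-head gaze into a single
# world-space gaze ray, reflecting that where we look is a joint product of
# eye and head movement.
import numpy as np

def world_gaze_ray(head_rotation: np.ndarray,   # 3x3 head-to-world rotation
                   head_position: np.ndarray,   # head origin in world coords (m)
                   eye_offset: np.ndarray,      # eye origin in head coords (m)
                   gaze_in_head: np.ndarray):   # unit gaze direction, head coords
    """Return (origin, direction) of the combined gaze ray in world coordinates."""
    origin = head_position + head_rotation @ eye_offset
    direction = head_rotation @ gaze_in_head
    return origin, direction / np.linalg.norm(direction)

# Example: head yawed 30 degrees, eyes yawed a further 10 degrees in the head.
yaw = np.radians(30)
R_head = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(yaw), 0.0, np.cos(yaw)]])   # rotation about the y axis
eye_yaw = np.radians(10)
gaze = np.array([np.sin(eye_yaw), 0.0, np.cos(eye_yaw)])  # forward = +z in head frame

origin, direction = world_gaze_ray(R_head, np.zeros(3),
                                   np.array([0.0, 0.07, 0.08]), gaze)
print(origin, direction)  # combined ray is yawed ~40 degrees from world forward (+z)
```

The point of the composition is that neither signal alone gives the full picture: the same eye-in-head angle maps to very different world-space targets depending on head orientation, which is why interfaces that isolate eye movement as input miss part of the coordinated behaviour.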