reGaze:

Eye Tracking Assisted Superhuman Interactions in Virtual Reality Games

We researched and prototyped how eye tracking could be used as an input for interaction in VR games.

Technological advancements in VR headsets have enabled novel forms of VR interaction, yet some of them, such as eye tracking, seem to be overlooked. The aim of our project was therefore to introduce eye-gaze-based superhuman interactions, specifically for travelling and for selecting and manipulating objects within a VR game. Dashing was utilised as the travel technique and telekinesis as the object selection and manipulation method. Using these interaction methods, we compared the effect on game experience between eye-gaze-based and traditional controller inputs.
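To make the two techniques more concrete, here is a minimal Unity (C#) sketch of how gaze-driven dashing and telekinesis could be wired up. This is an illustrative sketch, not our actual implementation: the class name, the Tick method, and all parameter values are hypothetical, and it assumes a world-space gaze ray is supplied by an eye tracking provider such as the Tobii XR SDK shown further down.

```csharp
using UnityEngine;

// Hypothetical sketch of gaze-driven "superhuman" interactions:
// dash travel and telekinetic grabbing, both aimed with the eyes.
public class GazeInteractionSketch : MonoBehaviour
{
    public Transform playerRig;          // the VR rig to move when dashing
    public float dashSpeed = 20f;        // metres per second while dashing
    public float maxGazeDistance = 50f;  // how far the gaze ray is cast

    private Vector3 dashTarget;
    private bool dashing;
    private Rigidbody heldObject;        // object currently held "telekinetically"

    // Called each frame with the current gaze ray and controller button states.
    public void Tick(Ray gazeRay, bool dashPressed, bool grabPressed)
    {
        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxGazeDistance))
        {
            // Dash: rapid travel towards whatever the player is looking at.
            if (dashPressed && !dashing)
            {
                dashTarget = hit.point;
                dashing = true;
            }

            // Telekinesis: grab the rigidbody under the player's gaze.
            if (grabPressed && heldObject == null && hit.rigidbody != null)
            {
                heldObject = hit.rigidbody;
            }
        }

        if (dashing)
        {
            playerRig.position = Vector3.MoveTowards(
                playerRig.position, dashTarget, dashSpeed * Time.deltaTime);
            dashing = playerRig.position != dashTarget;
        }

        if (heldObject != null)
        {
            if (!grabPressed)
            {
                heldObject = null; // release the object
            }
            else
            {
                // Pull the held object towards a point along the gaze ray.
                Vector3 holdPoint = gazeRay.origin + gazeRay.direction * 2f;
                heldObject.velocity = (holdPoint - heldObject.position) * 5f;
            }
        }
    }
}
```

Driving the held object through its velocity, rather than parenting it to the camera, is one way to keep the physics simulation stable while it is being pulled around by gaze.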

An iterative process of design and implementation was followed to develop a prototype in which game experience could be measured. Each iteration had its own evaluation to gather data that was used to improve the following iteration. The final experiment was a within-group study (n=30) in which each participant tested the prototype twice in randomised order, once with each interaction method: eyes and controllers. The evaluation used a refined version of the Game Experience Questionnaire (GEQ).

Based on the results, there are indications that eye tracking as a primary input method for travel and for selection and manipulation in VR can yield an experience comparable to handheld controllers in an exploration and puzzle-solving game scenario. If you’re curious about our whole research paper, you can read it here.

The technologies we used

HTC Vive Pro Eye

Precision eye tracking combined with professional-grade sound and graphics, designed for studios, home offices, and VR users who require a premium immersive experience. You can read more about it here.

Tobii XR SDK

This SDK provides a comprehensive guide and toolset for developing immersive XR experiences with novel technologies. You can read more about it here.
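If you’re wondering what that looks like in practice, fetching gaze data takes only a few lines once the SDK is set up. The snippet below is a minimal sketch using the SDK’s GetEyeTrackingData call; the surrounding MonoBehaviour and the debug ray are our own illustrative additions, not code from our prototype.

```csharp
using Tobii.XR;
using UnityEngine;

// Minimal sketch: read the latest gaze ray from the Tobii XR SDK each frame.
public class GazeRayReader : MonoBehaviour
{
    void Update()
    {
        // Fetch the current eye tracking data in world space.
        var eyeData = TobiiXR.GetEyeTrackingData(TobiiXR_TrackingSpace.World);

        if (eyeData.GazeRay.IsValid)
        {
            var gazeRay = new Ray(eyeData.GazeRay.Origin, eyeData.GazeRay.Direction);
            // Hand the ray to whatever interaction logic consumes it,
            // e.g. the dash/telekinesis sketch above.
            Debug.DrawRay(gazeRay.origin, gazeRay.direction * 10f, Color.green);
        }
    }
}
```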

Unity 3D

Unity is a powerful tool that lets creators develop interactive and immersive experiences for XR. You can read more about it here.

You can access our GitHub repository here, and feel free to play around! You’ll find a detailed guide on how to install and run it there.

A tiny bit about us

We’re a passionate group of Medialogy students from Aalborg University Copenhagen. We research, design and develop everything that fits into the continuum of Human-Computer Interaction. We love novel interactions, game development, sound design, and innovating boring stuff with technology. The project was supervised by Niels Christian Nilsson, and it was developed in the Multisensory Experience Lab.

Balazs Andras Ivanyi
Christian Vasileios Tsalidis de Zabala
Lilla Julia Toth
Marcus Alexander Dyrholm
Scott James Naylor
Truls Bendik Tjemsland

We also like to drink loads of covfefe.

Hey, thanks for making it this far! Let’s get in touch if you have any questions regarding the project, or if you just wanna talk about interesting ideas or cool tech: