AR and VR technologies both require and enable the design of new interaction paradigms. Natural means of interpersonal communication, such as gaze, are still underrepresented in current AR and VR systems. This thesis explores the potential of gaze and eye-tracking technology as an interaction modality for AR and VR applications. The goal of this thesis is to set up and develop a test environment for gaze-based interaction with either AR (i.e., the Microsoft HoloLens) or VR technology (i.e., the HTC Vive). Based on the Pupil Labs eye-tracking hardware, this test environment should be able to capture and display a user's gaze data within a virtual scene (e.g., as a cursor while playing). Further, it should allow the user to interact with virtual objects via gaze.
Possible specialization within this topic:
- Exploring different approaches to visualizing gaze in AR/VR and related user experience aspects
- Designing interaction concepts for virtual or mixed reality environments based on gaze
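A common starting point for gaze-based selection of virtual objects is dwell-time activation: a target is selected once the gaze cursor has rested on it for a short interval. The following engine-agnostic Python sketch illustrates this idea under stated assumptions — the `GazeTarget` and `DwellSelector` names, the screen-space bounding boxes, and the normalized (0..1) gaze coordinates are hypothetical stand-ins for whatever the eye tracker and the AR/VR scene graph actually provide.

```python
import time
from dataclasses import dataclass


@dataclass
class GazeTarget:
    """Axis-aligned screen-space bounding box of a virtual object (hypothetical)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class DwellSelector:
    """Fires a selection once gaze has rested on one target for `dwell_s` seconds."""

    def __init__(self, targets, dwell_s=0.8, screen=(1920, 1080)):
        self.targets = targets
        self.dwell_s = dwell_s
        self.screen = screen
        self._current = None   # target currently under the gaze cursor
        self._enter_t = None   # time at which gaze entered that target

    def update(self, norm_x: float, norm_y: float, t: float = None):
        """Feed one normalized gaze sample (0..1); returns a target name on selection."""
        t = time.monotonic() if t is None else t
        px, py = norm_x * self.screen[0], norm_y * self.screen[1]
        hit = next((tg for tg in self.targets if tg.contains(px, py)), None)
        if hit is not self._current:
            # Gaze moved to a new target (or to empty space): restart the dwell timer.
            self._current, self._enter_t = hit, t
            return None
        if hit is not None and t - self._enter_t >= self.dwell_s:
            self._enter_t = float("inf")  # fire only once until gaze leaves the target
            return hit.name
        return None
```

In a real setup, `update()` would be called once per gaze sample streamed from the eye tracker, and the `(px, py)` cursor position would also drive the visual gaze cursor in the scene.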