

Eye tracking how to

This topic describes how to quickly get started with eye tracking in MRTK by building on the MRTK eye tracking examples (Assets/MRTK/Examples/Demos/EyeTracking). These samples let you experience one of our new magical input capabilities: eye tracking! The demos include various use cases, ranging from implicit eye-based activations to seamlessly combining information about what you are looking at with voice and hand input. This enables users to quickly and effortlessly select and move holographic content across their view simply by looking at a target and saying 'Select' or performing a hand gesture. The demos also include an example of eye-gaze-directed scroll, pan, and zoom of text and images on a slate. Finally, an example is provided for recording and visualizing the user's visual attention on a 2D slate. In the following section, you will find more details on what each of the different samples in the MRTK eye tracking example package (Assets/MRTK/Examples/Demos/EyeTracking) includes.

Overview of the eye tracking demo samples

The following is a quick overview of what the individual eye tracking demo scenes are about. The MRTK eye tracking demo scenes are loaded additively; how to set this up is explained below.

Eye-Supported Target Selection

This tutorial showcases the ease of accessing eye gaze data to select targets. It includes an example of subtle yet powerful feedback that gives the user confidence that a target is focused without being overwhelming. In addition, there is a simple example of smart notifications that automatically disappear after being read.

Summary: Fast and effortless target selections using a combination of eyes, voice, and hand input.

Eye-Supported Navigation

Imagine that you are reading some information on a distant display or your e-reader and, when you reach the end of the displayed text, the text automatically scrolls up to reveal more content. Or how about magically zooming directly toward where you are looking? These are some of the examples of eye-supported navigation showcased in this tutorial. In addition, there is an example of hands-free rotation of 3D holograms that makes them automatically rotate based on your current focus.

Summary: Scroll, pan, zoom, and 3D rotation using a combination of eyes, voice, and hand input.

Eye-Supported Positioning

This tutorial shows an input scenario called Put-That-There, dating back to research from the MIT Media Lab in the early 1980s, with eye, hand, and voice input.
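The additive loading of demo scenes mentioned above can be sketched with Unity's standard scene management API. This is a minimal, hypothetical example, not part of the MRTK samples: the `AdditiveDemoLoader` class and the serialized scene name are illustrative, and the actual scene names to use are those under Assets/MRTK/Examples/Demos/EyeTracking in your project.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical helper: loads an eye tracking demo scene on top of the
// currently loaded root scene instead of replacing it.
public class AdditiveDemoLoader : MonoBehaviour
{
    // Set in the Inspector to the name of a demo scene that has been
    // added to the project's Build Settings.
    [SerializeField] private string demoSceneName;

    private void Start()
    {
        // LoadSceneMode.Additive keeps the current scene (and with it any
        // shared setup such as the MRTK configuration) loaded alongside
        // the demo content.
        SceneManager.LoadSceneAsync(demoSceneName, LoadSceneMode.Additive);
    }
}
```

The design point of additive loading here is that shared infrastructure lives in one root scene, so each demo scene only needs to contain its own content.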

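To illustrate the "ease of accessing eye gaze data" described in the target selection tutorial, the following is a hedged sketch of reading the user's eye gaze through MRTK's input system and checking whether it hits a particular object. It assumes MRTK 2.x with eye tracking enabled in the input profile; the exact member names should be verified against the MRTK version in use.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Illustrative component: each frame, checks whether the user's eye gaze
// ray hits this GameObject's collider.
public class EyeGazeTargetCheck : MonoBehaviour
{
    private void Update()
    {
        // The eye gaze provider is exposed by MRTK's input system.
        IMixedRealityEyeGazeProvider eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze == null || !eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            return; // No valid eye gaze signal this frame.
        }

        // Cast a ray from the gaze origin along the gaze direction.
        if (Physics.Raycast(eyeGaze.GazeOrigin, eyeGaze.GazeDirection, out RaycastHit hit)
            && hit.collider.gameObject == gameObject)
        {
            // This object is being looked at: a good place to trigger the
            // kind of subtle highlight feedback the tutorial describes.
        }
    }
}
```

In the actual samples, MRTK components handle this focus detection for you; the sketch only shows that the underlying gaze data is a plain origin/direction pair that ordinary Unity physics queries can consume.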