Event-based Gaze Tracking

Zinn Labs develops vision-first neural interfaces to connect people to their devices in more intuitive ways

Vision is our most important sense and is the dominant input to our understanding of, and interaction with, the world.

Zinn systems measure where a user is looking, what they’re looking at, and how that connects to attention and intent.

Event-based gaze tracking

Conventional image-based pipelines rely on relatively slow image capture and complex computer vision algorithms that require expensive hardware acceleration.

Zinn Labs develops gaze tracking systems based on event sensors, which enable dramatically lower latency and higher frame rates. In addition, the sparse, tailored sensor data allows gaze estimation to run in limited-compute embedded environments at low power while maintaining high gaze accuracy.
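To give a concrete sense of why sparse event data keeps compute low, here is a minimal, hypothetical sketch (not Zinn Labs' pipeline): per-pixel brightness-change events from a short time window are accumulated into a count image, and the weighted centroid of the activity gives a rough pupil-position estimate. All names, resolutions, and parameters below are illustrative assumptions.

```python
# Hypothetical sketch of event-based pupil localization (not Zinn Labs' method).
# Event sensors report per-pixel brightness *changes* as (x, y, timestamp, polarity)
# tuples, so a moving pupil edge produces a compact burst of events.
import numpy as np

SENSOR_W, SENSOR_H = 320, 240   # assumed event-sensor resolution

def accumulate_events(events, window_us, now_us):
    """Count events from the last window_us microseconds into a 2D image."""
    frame = np.zeros((SENSOR_H, SENSOR_W), dtype=np.uint16)
    for x, y, t_us, _polarity in events:
        if now_us - t_us <= window_us:
            frame[y, x] += 1
    return frame

def estimate_pupil_center(frame, min_events=20):
    """Weighted centroid of event activity as a crude pupil-position estimate."""
    if frame.sum() < min_events:
        return None   # not enough recent motion to localize the pupil
    ys, xs = np.nonzero(frame)
    weights = frame[ys, xs].astype(np.float64)
    return (float(np.average(xs, weights=weights)),
            float(np.average(ys, weights=weights)))

if __name__ == "__main__":
    # Synthetic burst of events clustered around a pupil edge near (150, 110).
    rng = np.random.default_rng(0)
    events = [(int(150 + rng.normal(0, 4)), int(110 + rng.normal(0, 4)), t, 1)
              for t in range(1000)]
    frame = accumulate_events(events, window_us=2_000, now_us=1_000)
    print(estimate_pupil_center(frame))   # approximately (150, 110)
```

Because only pixels that change generate data, the per-update work in a scheme like this scales with eye motion rather than with sensor resolution, which is what makes low-power embedded operation plausible.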


Advantages of event-based sensing

Low latency and high frame rate

Low power for all-day use

Robust to lighting and facial features

Low bandwidth and easier integration

Privacy preserving at the sensor level

Applications

Zinn Labs’ gaze-tracking systems serve a wide variety of head-mounted applications, from AR/VR headsets to smart eyewear.

In addition, Zinn Labs can develop novel gaze-enabled applications and product designs, and can help you integrate gaze tracking into your next project. See the selected demo videos below.

Virtual/Augmented Reality

Smart frames

Immersive displays

Foveated rendering
Optical distortion correction

Advanced UI

Gaze-based selection
Contextual AI assistants

Gaze statistics

Contextual understanding
Health and wellness data

Generative AI wearables

A vision interface built into eyewear gives seamless context to your AI assistant for questions like “What is this?”
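As a rough illustration of how such a query could be grounded (a hypothetical sketch, not a Zinn Labs API), the current gaze point can be used to crop a foveated patch from the outward-facing world camera, so the assistant receives only the region the user is actually looking at. Function and parameter names here are assumptions.

```python
# Hypothetical sketch: crop a foveated patch around the gaze point so an AI
# assistant sees what the user is looking at when asked "What is this?".
import numpy as np

def foveated_crop(world_frame, gaze_xy, patch_size=224):
    """Return a patch_size x patch_size crop centered on the gaze point."""
    h, w = world_frame.shape[:2]
    half = patch_size // 2
    # Clamp the crop center so the window stays inside the frame.
    cx = int(np.clip(gaze_xy[0], half, w - half))
    cy = int(np.clip(gaze_xy[1], half, h - half))
    return world_frame[cy - half:cy + half, cx - half:cx + half]

if __name__ == "__main__":
    frame = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in world-camera frame
    patch = foveated_crop(frame, gaze_xy=(900.0, 400.0))
    print(patch.shape)   # (224, 224, 3) -> context image passed to the assistant
```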

Autofocals

Automatically refocusing reading glasses for the world’s 2 billion presbyopes.

Smart context

Contextual interactions, smart home control, and gestures informed by user gaze and attention

Zinn DK1 Eval Kit

Available Soon

DK1 Features

  • Head-mounted frames

  • Outward-facing world camera (frame-based image), speaker, microphone

  • Gaze indexing and foveation of the world camera feed

  • Tethered compute with full breakouts

  • Optional: optical breadboard adapter for evaluation and prototype integration

Technical Specs

  • 120 Hz gaze sample rate

  • 10 ms end-to-end latency

  • Pupil + corneal glint 3D gaze model (see the sketch after this list)

  • < 1 degree gaze accuracy
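For context on the “pupil + corneal glint” spec above, here is a simplified, assumed illustration of the classic pupil-center / corneal-reflection idea behind such models: the vector from the corneal glint to the pupil center is mapped to gaze angles through a per-user calibration. The DK1’s actual 3D model is not described here; this 2D affine fit is only a sketch, and all data and names are invented for the example.

```python
# Assumed illustration of a pupil-center / corneal-reflection gaze mapping
# (not the DK1's published model): fit an affine map from the pupil-glint
# vector to gaze angles using a few calibration points.
import numpy as np

def fit_calibration(pupil_glint_vecs, gaze_angles_deg):
    """Fit an affine map [vx, vy, 1] -> [yaw, pitch] from calibration data."""
    V = np.hstack([pupil_glint_vecs, np.ones((len(pupil_glint_vecs), 1))])
    M, *_ = np.linalg.lstsq(V, gaze_angles_deg, rcond=None)   # least squares
    return M

def estimate_gaze(pupil_xy, glint_xy, M):
    """Map a single pupil-glint vector to (yaw, pitch) in degrees."""
    v = np.array([pupil_xy[0] - glint_xy[0], pupil_xy[1] - glint_xy[1], 1.0])
    return v @ M

if __name__ == "__main__":
    # Synthetic calibration: pupil-glint vectors (pixels) and known gaze targets (degrees).
    vecs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0], [5.0, 4.0]])
    targets = np.array([[0.0, 0.0], [15.0, 0.0], [0.0, -12.0], [15.0, -12.0], [7.5, -6.0]])
    M = fit_calibration(vecs, targets)
    print(estimate_gaze(pupil_xy=(105.0, 84.0), glint_xy=(100.0, 80.0), M=M))  # ~[7.5, -6.0]
```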