18144079. Interactive Motion-Based Eye Tracking Calibration simplified abstract (Apple Inc.)

Interactive Motion-Based Eye Tracking Calibration

Organization Name

Apple Inc.

Inventor(s)

Walter Nistico of Berlin (DE)

Andrii Nikiforov of Berlin (DE)

Borys Lysyansky of Berlin (DE)

Interactive Motion-Based Eye Tracking Calibration - A simplified explanation of the abstract

This abstract first appeared for US patent application 18144079, titled 'Interactive Motion-Based Eye Tracking Calibration'.

Simplified Explanation

The invention is a method for calibrating an eye tracking device by displaying a moving stimulus object and capturing images of the user's eye while the stimulus is shown.

  • A moving stimulus object is displayed within a defined display area while images of the user's eye are captured.
  • Gaze data are derived from the captured images and used to determine gaze points with respect to the display area.
  • Calibration parameters of a predefined model are determined by analyzing the gaze points against the known stimulus trajectory (see the sketch after this list).
  • Control parameters are derived from the captured gaze data and used to control how the calibration procedure is executed.
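
A rough illustration of the model-fitting step: the gaze points recorded while the stimulus moves are compared against the known trajectory, and the parameters of a predefined model are solved for. The sketch below is a minimal, hypothetical Python example assuming a simple 2-D affine calibration model fitted with NumPy least squares; the function names and the choice of model are illustrative, not taken from the patent.

```python
import numpy as np

def fit_affine_calibration(gaze_points: np.ndarray,
                           stimulus_points: np.ndarray) -> np.ndarray:
    """Least-squares fit of a 2-D affine model mapping raw gaze points
    (N x 2) onto the stimulus trajectory samples (N x 2) they belong to."""
    design = np.hstack([gaze_points, np.ones((gaze_points.shape[0], 1))])
    params, *_ = np.linalg.lstsq(design, stimulus_points, rcond=None)
    return params  # 3 x 2 affine parameter matrix

def apply_calibration(params: np.ndarray, gaze: np.ndarray) -> np.ndarray:
    """Map raw gaze samples (N x 2) through the fitted affine model."""
    design = np.hstack([gaze, np.ones((gaze.shape[0], 1))])
    return design @ params

# Synthetic check: an elliptic stimulus trajectory, a raw gaze signal that
# is an affine distortion of it plus noise, and the recovered mapping.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
stimulus = np.column_stack([960 + 400 * np.cos(t), 540 + 300 * np.sin(t)])
distort = np.array([[1.1, 0.03], [-0.02, 0.9], [30.0, -20.0]])
gaze = np.hstack([stimulus, np.ones((len(t), 1))]) @ distort
gaze += rng.normal(scale=5.0, size=gaze.shape)

params = fit_affine_calibration(gaze, stimulus)
error = np.linalg.norm(apply_calibration(params, gaze) - stimulus, axis=1)
print(f"mean calibration error: {error.mean():.1f} px")
```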
Potential Applications

  • Eye tracking technology for research purposes
  • Assistive technology for individuals with disabilities
  • Virtual reality and gaming applications

Problems Solved

  • Ensures accurate calibration of eye tracking devices
  • Improves the performance and reliability of eye tracking technology
  • Enhances user experience in various applications

Benefits

  • Precise and reliable eye tracking data
  • Improved accuracy in gaze tracking
  • Enhanced user interaction in virtual environments


Original Abstract Submitted

The invention is concerned with a method for performing a calibration procedure for calibrating an eye tracking device, wherein a stimulus object (S) is displayed within a certain display area such that the stimulus object (S) is at least temporarily moving along a defined trajectory, and images of at least one eye of at least one user are captured during the displaying of the stimulus object (S). Based on the captured images, gaze data are provided, and in dependency of the gaze data, gaze points (P) of the at least one eye of the user with respect to the display area are determined. Further, at least one calibration parameter (a1, ..., a14; R; K; a; b; r) of at least one predefined calibration model (M, M1, ..., M6) is determined in dependency of a first analysis of at least the positions of at least part of the respective gaze points (P) with regard to the defined trajectory of the stimulus object (S). Further, at least one control parameter (N, (A, D), T) is determined in dependency of at least part of the captured gaze data, and the execution of at least part of the calibration procedure is controlled in dependency of the at least one determined control parameter (N, (A, D), T).
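
The abstract's final step, deriving a control parameter (N, (A, D), T) from the captured gaze data and letting it steer the execution of the calibration procedure, can be pictured as a data-quality check: if the gaze data are too noisy or too sparse, a trajectory pass is repeated or extended. The sketch below is one hypothetical reading of that idea, reusing the affine model from the earlier example; the thresholds, names, and quality measures are assumptions, not values from the patent.

```python
import numpy as np

def control_parameters(gaze_points, stimulus_points, params, valid_mask,
                       max_residual_px=30.0, min_valid_fraction=0.7):
    """Derive simple control values from the captured gaze data and decide
    whether (part of) the calibration procedure should be repeated."""
    design = np.hstack([gaze_points, np.ones((gaze_points.shape[0], 1))])
    residual = float(np.mean(
        np.linalg.norm(design @ params - stimulus_points, axis=1)))
    valid_fraction = float(np.mean(valid_mask))
    return {
        "residual_px": residual,
        "valid_fraction": valid_fraction,
        # Repeat the pass when the fit is poor or too few samples are valid.
        "repeat_calibration": (residual > max_residual_px
                               or valid_fraction < min_valid_fraction),
    }
```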