20240013439. AUTOMATED CALIBRATION METHOD OF A SYSTEM COMPRISING AN EXTERNAL EYE-TRACKING DEVICE AND A COMPUTING DEVICE simplified abstract (Eyeware Tech SA)

From WikiPatents

AUTOMATED CALIBRATION METHOD OF A SYSTEM COMPRISING AN EXTERNAL EYE-TRACKING DEVICE AND A COMPUTING DEVICE

Organization Name

Eyeware Tech SA

Inventor(s)

Kenneth Alberto Funes Mora of Lausanne (CH)

AUTOMATED CALIBRATION METHOD OF A SYSTEM COMPRISING AN EXTERNAL EYE-TRACKING DEVICE AND A COMPUTING DEVICE - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240013439 titled 'AUTOMATED CALIBRATION METHOD OF A SYSTEM COMPRISING AN EXTERNAL EYE-TRACKING DEVICE AND A COMPUTING DEVICE'.

Simplified Explanation

The present invention is a method for calibrating a system that includes an external eye-tracking device and a computing device, and for capturing the user's gaze on the screen of the computing device in real time.

  • The calibration of the system involves capturing images of landmarks on the user's face using the cameras of the eye-tracking device to determine their 3D positions in the coordinate system of the eye-tracking device.
  • The same landmarks on the user's face are also captured using the camera of the computing device to determine their 2D positions in the image coordinate system of the computing device camera.
  • The 3D pose of the computing device's camera, defined as the camera coordinate system, is computed based on the 3D and 2D positions of the landmarks in their respective coordinate systems.
  • The 3D pose of the screen of the computing device, defined as the screen coordinate system, is computed as a function of the camera coordinate system and mechanical parameters describing the screen's position relative to the camera.
  • Capturing the user's gaze on the screen in real time involves retrieving the user's gaze ray with the eye-tracking device and intersecting it with the plane of the screen, using the computed coordinate-system parameters, to determine the gaze-on-screen.
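The last two steps above reduce to a transform composition followed by a standard ray–plane intersection once the poses are known. A minimal NumPy sketch, with hypothetical function and variable names (the patent does not specify an API), assuming the screen plane is z = 0 in the screen coordinate system (SCS) and that poses are 4x4 homogeneous transforms:

```python
import numpy as np

def screen_from_camera(T_ecs_to_ccs, T_ccs_to_scs):
    """Compose the device-camera pose with the mechanical screen-vs-camera
    transform to obtain the eye-tracker-to-screen (ECS -> SCS) transform."""
    return T_ccs_to_scs @ T_ecs_to_ccs

def intersect_gaze_with_screen(origin_ecs, direction_ecs, T_ecs_to_scs):
    """Map a gaze ray from the eye-tracker coordinate system (ECS) into the
    screen coordinate system (SCS) and intersect it with the plane z = 0.
    Returns the (x, y) gaze-on-screen point, or None if there is no hit."""
    R, t = T_ecs_to_scs[:3, :3], T_ecs_to_scs[:3, 3]
    o = R @ origin_ecs + t          # ray origin in SCS
    d = R @ direction_ecs           # ray direction in SCS
    if abs(d[2]) < 1e-9:
        return None                 # ray parallel to the screen plane
    s = -o[2] / d[2]                # solve o_z + s * d_z = 0
    if s < 0:
        return None                 # screen lies behind the gaze origin
    hit = o + s * d
    return hit[:2]                  # (x, y) on the screen plane
```

For example, with an identity rotation and translation (0.2, 0.1, -0.5), a gaze ray from the ECS origin along +z lands at (0.2, 0.1) on the screen plane; units follow whatever metric units the eye-tracking device reports.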

Potential applications of this technology:

  • Eye-tracking for user interface control in computing devices.
  • Gaze-based interaction in virtual reality and augmented reality systems.
  • Attention monitoring in educational or training applications.
  • Eye-tracking for assistive technologies, such as controlling devices for individuals with limited mobility.

Problems solved by this technology:

  • Accurate and real-time gaze tracking on the screen of a computing device.
  • Calibration of the eye-tracking system to ensure accurate gaze tracking.
  • Integration of eye-tracking technology with computing devices.

Benefits of this technology:

  • Improved user experience and interaction with computing devices.
  • Enhanced accessibility for individuals with disabilities.
  • Potential for new applications and innovations in various fields.
  • Real-time monitoring of user attention and engagement.


Original Abstract Submitted

The present invention relates to a method for calibrating a system comprising an external eye-tracking device and a computing device, and for capturing the gaze of a user (p) on the screen of the computing device in real time. The calibration of the system comprises: capturing, with one or more cameras of the eye-tracking device, at least one image of landmarks (f1, f2, f3 . . . fn) of the face of the user (p) to identify the 3D position of each landmark in the coordinate system (ECS) of said eye-tracking device; capturing, with a camera of the computing device, the same landmarks (f1, f2, f3) of the face of the user (p) in the image coordinate system (ICS) of the computing device camera to identify the 2D position of each landmark in the ICS; computing the 3D pose of the camera of the computing device, defined as the camera coordinate system (CCS), as a function of the 3D position and the 2D position of each landmark (f1, f2, f3 . . . fn) in the ECS and the ICS respectively; and computing the 3D pose of the screen of the computing device, defined as the screen coordinate system (SCS), as a function of the camera coordinate system and mechanical parameters describing how the screen is positioned with respect to the camera of the computing device. Capturing the gaze of the user (p) on the screen of the computing device in real time comprises: retrieving a gaze ray (d) of the user (p) with the eye-tracking device, and intersecting the gaze ray (d) with the plane of the screen of the computing device, as a function of the ECS and SCS parameters, to capture the gaze-on-screen in real time.
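The camera-pose step in the abstract, recovering the 3D pose of the device camera from the landmarks' 3D positions (ECS) and 2D image positions (ICS), is an instance of the Perspective-n-Point problem. A production system would typically call a dedicated solver such as OpenCV's cv2.solvePnP; the sketch below instead shows a self-contained Direct Linear Transform in pure NumPy, assuming the device camera's intrinsic matrix K is known. All names are illustrative, not from the patent:

```python
import numpy as np

def estimate_camera_pose(K, pts3d_ecs, pts2d_ics):
    """Direct Linear Transform: recover the rotation R and translation t
    that map eye-tracker coordinates (ECS) into device-camera coordinates
    from >= 6 non-coplanar landmark correspondences.
    K: 3x3 intrinsic matrix of the device camera (assumed known)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d_ecs, pts2d_ics):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)              # projection matrix, up to scale
    M = np.linalg.inv(K) @ P              # M ~ s * [R | t]
    s = np.cbrt(np.linalg.det(M[:, :3]))  # recover scale and sign
    M = M / s
    U, _, Vt2 = np.linalg.svd(M[:, :3])   # project onto SO(3) to clean up
    return U @ Vt2, M[:, 3]               # numerical drift in the rotation
```

Note that real facial landmarks are close to coplanar, for which the plain DLT is ill-conditioned; a planar-PnP variant or an iterative refinement would be the practical choice there.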