18436322. DETERMINING GAZE DIRECTION TO GENERATE AUGMENTED REALITY CONTENT simplified abstract (Snap Inc.)

From WikiPatents

DETERMINING GAZE DIRECTION TO GENERATE AUGMENTED REALITY CONTENT

Organization Name

Snap Inc.

Inventor(s)

Kyle Goodrich of Venice, CA (US)

DETERMINING GAZE DIRECTION TO GENERATE AUGMENTED REALITY CONTENT - A simplified explanation of the abstract

This abstract first appeared for US patent application 18436322, titled 'DETERMINING GAZE DIRECTION TO GENERATE AUGMENTED REALITY CONTENT'.

Simplified Explanation

Using an eyewear device, the subject technology determines a gaze direction in the user's field of view, generates an anchor point based on that gaze direction, identifies a surface corresponding to the ground plane, determines the distance from that surface to the anchor point, generates AR content based on the distance, and renders the content for display on the eyewear device.

  • Gaze direction determined in user's field of view
  • Anchor point generated based on gaze direction
  • Ground plane surface identified
  • Distance from surface to anchor point determined
  • AR content generated based on distance
  • AR content rendered for display on eyewear device
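The steps above can be sketched in code. The patent does not disclose a concrete API, so every name, coordinate convention, and parameter below is a hypothetical illustration: positions are (x, y, z) tuples in metres, the y-axis points up, and the ground plane is assumed to be horizontal at a detected height.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch of the claimed pipeline; names and conventions
# are assumptions, not the patented implementation.

@dataclass
class AnchorPoint:
    x: float
    y: float
    z: float

def anchor_from_gaze(eye_pos, gaze_dir, depth):
    """Place an anchor point along the gaze ray at a fixed depth (metres)."""
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    ux, uy, uz = (c / norm for c in gaze_dir)
    return AnchorPoint(eye_pos[0] + ux * depth,
                       eye_pos[1] + uy * depth,
                       eye_pos[2] + uz * depth)

def distance_to_ground(anchor, ground_y=0.0):
    """Distance from the detected ground plane (y = ground_y) to the anchor."""
    return anchor.y - ground_y

def generate_ar_content(anchor, distance):
    """Size a placeholder AR object using its height above the ground."""
    scale = max(0.1, min(distance, 2.0))  # clamp to a plausible range
    return {"position": (anchor.x, anchor.y, anchor.z), "scale": scale}

# Example: the user looks slightly downward from an eye height of 1.6 m.
anchor = anchor_from_gaze(eye_pos=(0.0, 1.6, 0.0),
                          gaze_dir=(0.0, -0.2, -1.0),
                          depth=2.0)
content = generate_ar_content(anchor, distance_to_ground(anchor))
```

In this sketch the anchor lands about 1.2 m above the ground, and the AR object is scaled by that distance before rendering; a real eyewear device would obtain the gaze ray from eye-tracking cameras and the ground plane from its environment-mapping sensors.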

Potential Applications

The technology could be used in various industries such as gaming, navigation, education, and training to provide users with interactive and immersive experiences.

Problems Solved

The technology solves the problem of accurately determining a user's gaze direction and incorporating AR content into their field of view based on that direction.

Benefits

The benefits of this technology include enhanced user experiences, improved accuracy in displaying AR content, and increased engagement in various applications.

Potential Commercial Applications

Potential commercial applications of this technology include AR gaming, virtual tours, navigation assistance, and interactive training programs.

Possible Prior Art

Possible prior art includes existing AR glasses or headsets that track a user's gaze and display corresponding AR content.

Unanswered Questions

How does the technology handle different lighting conditions?

The technology may use sensors to adjust for varying lighting conditions and ensure accurate gaze direction detection.

Can the technology be integrated with other devices or platforms?

The technology may have the potential to be integrated with other devices or platforms through APIs or software development kits for expanded functionality and compatibility.


Original Abstract Submitted

The subject technology determines a gaze direction in a field of view of a user using an eyewear device. The subject technology generates an anchor point in the field of view based at least in part on the determined gaze direction. The subject technology identifies a surface corresponding to a ground plane in the field of view. The subject technology determines a distance from the identified surface to the anchor point. The subject technology generates AR content based at least in part on the determined distance. The subject technology renders the generated AR content in the field of view for display by the eyewear device.