Meta Platforms Technologies, LLC (20240338086). Virtual Selections Using Multiple Input Modalities simplified abstract


Virtual Selections Using Multiple Input Modalities

Organization Name

Meta Platforms Technologies, LLC

Inventor(s)

Aaron Faucher of Torrance, CA (US)

Pol Pla I Conesa of Portland, OR (US)

Daniel Rosas of Auburn, WA (US)

Nathan Aschenbach of Seattle, WA (US)

Virtual Selections Using Multiple Input Modalities - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240338086, titled 'Virtual Selections Using Multiple Input Modalities'.

The present disclosure pertains to triggering virtual keyboard selections using multiple input modalities in an artificial reality environment. An interface manager can display a virtual keyboard to a user and track user eye gaze input and hand input to resolve character selections on the virtual keyboard, as outlined in the list and the sketch that follow.

  • Interface manager displays a virtual keyboard in an artificial reality environment.
  • Tracks user eye gaze input and hand input (e.g., hand or finger motion).
  • Resolves character selections on the virtual keyboard based on user gaze input and hand motion meeting trigger criteria.
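A minimal sketch of how such an interface manager might combine the two modalities is given below. Everything in it (the InterfaceManager and Key classes, the method names, the flat 2D keyboard layout) is an illustrative assumption rather than the implementation described in the patent application.

  from dataclasses import dataclass
  from typing import List, Optional, Tuple


  @dataclass
  class Key:
      """A virtual key: the character it produces and its 2D bounds on the keyboard plane."""
      char: str
      x: float
      y: float
      width: float
      height: float

      def contains(self, px: float, py: float) -> bool:
          return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height


  class InterfaceManager:
      """Illustrative manager that resolves key selections from combined gaze and hand input."""

      def __init__(self, keys: List[Key]):
          self.keys = keys

      def gaze_target(self, gaze_point: Tuple[float, float]) -> Optional[Key]:
          """Return the key the user's gaze currently falls on, if any."""
          gx, gy = gaze_point
          for key in self.keys:
              if key.contains(gx, gy):
                  return key
          return None

      def on_input_sample(self, gaze_point: Tuple[float, float],
                          hand_motion_meets_trigger: bool) -> Optional[str]:
          """Resolve a character only when the hand motion meets the trigger criteria.

          The character is taken from wherever the gaze rests at the moment the
          trigger fires, which is the pairing of modalities the abstract describes.
          """
          if not hand_motion_meets_trigger:
              return None
          key = self.gaze_target(gaze_point)
          return key.char if key else None


  # Example: a one-row keyboard with keys "a" and "b"; gaze over "b" plus a trigger selects "b".
  manager = InterfaceManager([Key("a", 0.0, 0.0, 1.0, 1.0), Key("b", 1.0, 0.0, 1.0, 1.0)])
  print(manager.on_input_sample(gaze_point=(1.5, 0.5), hand_motion_meets_trigger=True))  # -> "b"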

Potential Applications:

  • Virtual reality gaming
  • Augmented reality applications
  • Accessibility tools for individuals with limited mobility

Problems Solved:

  • Enhances user interaction in artificial reality environments
  • Improves efficiency in selecting characters on virtual keyboards

Benefits:

  • Enhanced user experience in artificial reality environments
  • Increased accessibility for users with limited mobility
  • Streamlined character selection process on virtual keyboards

Commercial Applications: Enhanced Virtual Keyboard Selection Technology

This technology can be utilized in virtual reality gaming systems, augmented reality applications, and accessibility tools for individuals with limited mobility. The market implications include improved user experience, increased efficiency, and expanded accessibility in artificial reality environments.

Questions about Virtual Keyboard Selection Technology:

1. How does this technology improve user interaction in artificial reality environments?

This technology enhances user interaction by tracking eye gaze and hand input to streamline character selection on virtual keyboards, making the process more efficient and user-friendly.

2. What are the potential commercial uses of this technology?

This technology can be applied in virtual reality gaming systems, augmented reality applications, and accessibility tools for individuals with limited mobility, offering enhanced user experience and increased accessibility.


Original Abstract Submitted

aspects of the present disclosure are directed to triggering virtual keyboard selections using multiple input modalities. an interface manager can display an interface, such as a virtual keyboard, to a user in an artificial reality environment. implementations of the interface manager can track user eye gaze input and user hand input (e.g., hand or finger motion). the interface manager can resolve a character selection on the virtual keyboard according to the tracked user gaze input based on detection that the user's hand motion meets a trigger criteria. for example, the interface manager can: detect that the tracked user hand motion meets the trigger criteria at a given point in time; and resolve a selection from the virtual keyboard (e.g., selection of a displayed character) according to the tracked user gaze on the virtual keyboard at the given point in time.
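As a rough illustration of the final step the abstract describes (detect that hand motion meets the trigger criteria at a given point in time, then resolve the selection from the tracked gaze at that same time), the sketch below uses a pinch-style gesture as an assumed trigger criterion and looks up the buffered gaze sample nearest the trigger timestamp. The pinch threshold, data structures, and function names are assumptions made for illustration, not details from the filing.

  import bisect
  from dataclasses import dataclass
  from typing import List, Tuple

  PINCH_THRESHOLD_M = 0.015  # assumed thumb-to-index distance (meters) that counts as a pinch


  @dataclass
  class GazeSample:
      timestamp: float            # seconds since start of the session
      point: Tuple[float, float]  # gaze hit point on the virtual keyboard plane


  def pinch_detected(thumb_tip: Tuple[float, float, float],
                     index_tip: Tuple[float, float, float]) -> bool:
      """Assumed trigger criterion: thumb and index fingertips close enough to form a pinch."""
      dx, dy, dz = (thumb_tip[i] - index_tip[i] for i in range(3))
      return (dx * dx + dy * dy + dz * dz) ** 0.5 < PINCH_THRESHOLD_M


  def gaze_at(gaze_history: List[GazeSample], t: float) -> GazeSample:
      """Return the buffered gaze sample recorded closest to time t (history sorted by timestamp)."""
      times = [s.timestamp for s in gaze_history]
      i = bisect.bisect_left(times, t)
      candidates = gaze_history[max(0, i - 1): i + 1]
      return min(candidates, key=lambda s: abs(s.timestamp - t))

The character selection itself would then be resolved by testing which key of the virtual keyboard contains gaze_at(history, t).point, as in the earlier sketch.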