18535940. CONTROLLING LOCOMOTION WITHIN AN ARTIFICIAL-REALITY APPLICATION USING HAND GESTURES, AND METHODS AND SYSTEMS OF USE THEREOF simplified abstract (META PLATFORMS TECHNOLOGIES, LLC)

CONTROLLING LOCOMOTION WITHIN AN ARTIFICIAL-REALITY APPLICATION USING HAND GESTURES, AND METHODS AND SYSTEMS OF USE THEREOF

Organization Name

META PLATFORMS TECHNOLOGIES, LLC

Inventor(s)

Brandon Furtwangler of Issaquah, WA (US)

CONTROLLING LOCOMOTION WITHIN AN ARTIFICIAL-REALITY APPLICATION USING HAND GESTURES, AND METHODS AND SYSTEMS OF USE THEREOF - A simplified explanation of the abstract

This abstract first appeared for US patent application 18535940, titled 'CONTROLLING LOCOMOTION WITHIN AN ARTIFICIAL-REALITY APPLICATION USING HAND GESTURES, AND METHODS AND SYSTEMS OF USE THEREOF'.

Simplified Explanation: The patent application describes a method for adjusting the representation of a user's position within an artificial-reality application using hand gestures, as outlined in the steps below (a minimal code sketch follows the list).

  • The head-wearable device displays a representation of the user's position within the artificial-reality environment.
  • A positional-control activation hand gesture brings up a positional-control user interface (UI) overlaid on part of the environment.
  • A positional-control input hand gesture selects a UI element, triggering a positional-control action that changes the user's position.
  • The updated representation of the user's position is then displayed in the artificial-reality environment.
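
Read together, these steps form a small state machine: an activation gesture opens the positional-control UI, an input gesture selects one of its elements, and that element's positional-control action updates the displayed position. The Python sketch below illustrates that flow under assumed names; GestureEvent, PositionalControlSession, and the example UI elements are hypothetical and are not taken from the patent application.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class GestureEvent:
    """A recognized hand gesture (kinds and fields are illustrative, not from the patent)."""
    kind: str                              # "activation" or "input"
    target_element: Optional[str] = None   # UI element selected by an input gesture


class PositionalControlSession:
    """Minimal state machine for gesture-driven locomotion in an artificial-reality scene."""

    def __init__(self, start_position: Vec3):
        self.position: Vec3 = start_position
        self.ui_visible = False
        # Each UI element maps to a positional-control action: a function that
        # takes the current position and returns the changed position.
        self.ui_elements: Dict[str, Callable[[Vec3], Vec3]] = {
            "step_forward": lambda p: (p[0], p[1], p[2] - 1.0),
            "step_back": lambda p: (p[0], p[1], p[2] + 1.0),
        }

    def handle(self, event: GestureEvent) -> None:
        if event.kind == "activation":
            # Activation gesture: overlay the positional-control UI on the scene.
            self.ui_visible = True
        elif event.kind == "input" and self.ui_visible and event.target_element:
            # Input gesture: run the selected element's positional-control action,
            # then re-display the user's changed position.
            action = self.ui_elements.get(event.target_element)
            if action is not None:
                self.position = action(self.position)
                self.render()

    def render(self) -> None:
        # Stand-in for re-rendering the user's position on the head-wearable device.
        print(f"user position is now {self.position}")


if __name__ == "__main__":
    session = PositionalControlSession(start_position=(0.0, 1.7, 0.0))
    session.handle(GestureEvent(kind="activation"))                             # e.g. a pinch
    session.handle(GestureEvent(kind="input", target_element="step_forward"))   # select UI element
```

Running the script prints the updated position after the "step_forward" element is selected, mirroring the display-update step in the list above.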

Key Features and Innovation:

  • Utilizes hand gestures to control user position in artificial reality.
  • Provides a user-friendly interface for adjusting position within the environment.
  • Enhances user experience by allowing for intuitive control of movement.
  • Integrates seamlessly with head-wearable devices for a more immersive experience.

Potential Applications:

  • Virtual reality gaming
  • Training simulations
  • Architectural design
  • Medical simulations
  • Virtual tours

Problems Solved:

  • Difficulty in controlling user position within artificial-reality environments.
  • Lack of intuitive interfaces for adjusting user position.
  • Limited options for user interaction in virtual environments.

Benefits:

  • Enhanced user experience in artificial-reality applications.
  • Improved control over user position for a more immersive experience.
  • Intuitive interface for adjusting position within the environment.
  • Increased engagement and interaction in virtual-reality settings.

Commercial Applications: The technology can be used in various industries such as gaming, training, design, healthcare, and tourism to enhance user experiences and provide more interactive virtual environments.

Prior Art: Prior art related to this technology may include patents or research on gesture-based control systems in virtual reality applications, user interfaces for adjusting user position in artificial reality, and methods for enhancing user interaction in virtual environments.

Frequently Updated Research: Research on advancements in gesture recognition technology, user interface design for virtual reality applications, and user experience studies in artificial reality environments may be relevant to this technology.

Questions about Hand Gesture Control in Artificial Reality:

  1. How does hand gesture control improve user interaction in artificial-reality applications?
  2. What are the potential limitations of using hand gestures for adjusting user position in virtual environments?


Original Abstract Submitted

Systems and methods are provided for adjusting a representation of a user's position within an artificial-reality application using a hand gesture. One example method includes, while displaying, via a head-wearable device worn by a user, a representation of a user's position within an artificial-reality environment, in response to receiving an indication that a positional-control activation hand gesture has been performed, displaying a positional-control user interface (UI) overlaid on a portion of the artificial-reality environment, the positional-control UI including a positional-control UI element configured to perform a positional-control action. The example method further includes, while displaying the positional-control UI, in response to receiving an indication that the positional-control UI element has been selected, via a positional-control input hand gesture, causing a change in the representation of the user's position within the artificial-reality environment based on the positional-control action, and displaying a changed representation of the user's position within the artificial-reality environment.
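
The abstract leaves the concrete positional-control action unspecified. One common choice for locomotion in artificial-reality applications is a teleport along the direction the hand points; the short Python sketch below shows such an action purely as a hedged example, and the teleport function, parameter names, and numbers are assumptions rather than anything claimed in the application.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]


def teleport(position: Vec3, hand_direction: Vec3, distance: float) -> Vec3:
    """Hypothetical positional-control action: move `distance` meters along the
    horizontal component of the hand-ray direction, keeping eye height fixed."""
    dx, _, dz = hand_direction
    horizontal = math.hypot(dx, dz)
    if horizontal == 0.0:
        return position  # hand pointing straight up or down: no horizontal move
    ux, uz = dx / horizontal, dz / horizontal
    x, y, z = position
    return (x + ux * distance, y, z + uz * distance)


# Example: from the origin, point the hand 45 degrees to the right and teleport 3 m.
print(teleport((0.0, 1.7, 0.0), (1.0, 0.0, -1.0), 3.0))
# -> (2.121..., 1.7, -2.121...)
```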