17931645. AUGMENTED REALITY AT A FRONT-END DEVICE simplified abstract (Capital One Services, LLC)

AUGMENTED REALITY AT A FRONT-END DEVICE

Organization Name

Capital One Services, LLC

Inventor(s)

Joshua Edwards of Philadelphia PA (US)

Michael Mossoba of Great Falls VA (US)

Tyler Maiman of Melville NY (US)

AUGMENTED REALITY AT A FRONT-END DEVICE - A simplified explanation of the abstract

This abstract first appeared for US patent application 17931645, titled 'AUGMENTED REALITY AT A FRONT-END DEVICE'.

Simplified Explanation

The abstract describes a system that interprets hand gestures captured by an augmented reality (AR) device and translates them into input on a front-end device (a minimal code sketch follows this list). Here are the key steps:

  • The system receives, from the AR device, a request to use the front-end device.
  • The system transmits a request to the AR device to access its optical sensor.
  • The system receives output from the optical sensor indicating a hand gesture performed by the user.
  • The system determines the corresponding input key on the front-end device based on the hand gesture.
  • The system performs an action on the front-end device based on that input key.
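
The gesture-to-key step can be pictured as a simple lookup followed by an action on the front-end device. Below is a minimal sketch of that idea; the gesture labels, the GESTURE_TO_KEY mapping, and the press_key helper are assumptions made for illustration and do not come from the patent application.

```python
# Minimal sketch of the "determine input key" and "perform action" steps.
# Gesture labels, the mapping, and press_key() are illustrative assumptions.

GESTURE_TO_KEY = {
    "point_one": "1",
    "point_two": "2",
    "swipe_left": "CANCEL",
    "thumbs_up": "ENTER",
}

def press_key(key: str) -> None:
    """Stand-in for the front-end device registering an input key."""
    print(f"Front-end device received input key: {key}")

def handle_gesture(gesture_label: str) -> str:
    """Map an AR-recognized hand gesture to a front-end input key and act on it."""
    key = GESTURE_TO_KEY.get(gesture_label)
    if key is None:
        raise ValueError(f"Unrecognized gesture: {gesture_label!r}")
    press_key(key)
    return key

if __name__ == "__main__":
    handle_gesture("thumbs_up")  # prints: Front-end device received input key: ENTER
```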

Potential Applications

This technology could be used in areas such as gaming, virtual reality simulations, remote control systems, and interactive presentations.

Problems Solved

This technology eliminates the need for physical controllers or touchscreens, providing a more intuitive and immersive user experience.

Benefits

  • Enhanced user interaction with devices
  • Hands-free control of front-end devices
  • Improved user experience in AR applications

Potential Commercial Applications

"Gesture-Controlled AR Technology for Enhanced User Experience"

Possible Prior Art

One possible prior art is gesture-based control in gaming systems, such as Microsoft's Kinect sensor for the Xbox, which also interprets physical movements as input.

Unanswered Questions

How accurate is the system in interpreting hand gestures?

The abstract does not provide details on the accuracy of the system in recognizing and translating hand gestures into input keys. Further information on the system's precision would be beneficial.

What types of hand gestures can the system recognize?

The abstract does not specify the range or types of hand gestures that the system can interpret. Understanding the variety of gestures supported by the system would be essential for assessing its usability in different applications.


Original Abstract Submitted

In some implementations, a system may receive, at a front-end device and from an augmented reality (AR) device associated with a user, a request to use the front-end device. The system may transmit, to the AR device, a request to access an optical sensor of the AR device. The system may receive, from the AR device, an output, from the optical sensor, that indicates a hand gesture performed by the user of the AR device. Accordingly, the system may determine, based on the hand gesture, a corresponding input key of the front-end device. The system may perform an action based on the corresponding input key.
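
Read as a message flow, the abstract describes an exchange between the front-end device and the AR device: a request to use the device, a request for optical-sensor access, sensor output indicating a gesture, and finally a key lookup and action. The sketch below models that exchange; the ARDevice and FrontEndSystem classes, their method names, and the gesture labels are illustrative assumptions rather than terms from the application.

```python
# Illustrative sketch of the request/response exchange described in the abstract.
# All class names, method names, and gesture labels are assumptions for this example.
from dataclasses import dataclass

@dataclass
class ARDevice:
    """Stands in for the user's AR headset or glasses."""
    user: str

    def grant_sensor_access(self) -> bool:
        # A real device would prompt the user before sharing its optical sensor.
        return True

    def read_optical_sensor(self) -> str:
        # Placeholder for sensor output that indicates a hand gesture.
        return "point_two"

class FrontEndSystem:
    """Stands in for the front-end device the AR user wants to operate."""
    GESTURE_TO_KEY = {"point_one": "1", "point_two": "2", "thumbs_up": "ENTER"}

    def handle_session(self, ar_device: ARDevice) -> None:
        # 1. A request to use the front-end device has arrived from the AR device.
        # 2. Transmit a request to access the AR device's optical sensor.
        if not ar_device.grant_sensor_access():
            return
        # 3. Receive optical-sensor output indicating a hand gesture.
        gesture = ar_device.read_optical_sensor()
        # 4. Determine the corresponding input key from the gesture.
        key = self.GESTURE_TO_KEY.get(gesture, "UNKNOWN")
        # 5. Perform an action based on that input key.
        print(f"Performing action for input key: {key}")

if __name__ == "__main__":
    FrontEndSystem().handle_session(ARDevice(user="example"))
```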