18062938. ENHANCED INPUT USING RECOGNIZED GESTURES simplified abstract (QUALCOMM Incorporated)


ENHANCED INPUT USING RECOGNIZED GESTURES

Organization Name

QUALCOMM Incorporated

Inventor(s)

Evan Hildreth of Dundas (CA)

Francis Bernard Macdougall of Milton (CA)

ENHANCED INPUT USING RECOGNIZED GESTURES - A simplified explanation of the abstract

This abstract first appeared for US patent application 18062938, titled 'ENHANCED INPUT USING RECOGNIZED GESTURES'.

Simplified Explanation

The abstract of the patent application describes a technology that lets a user interact with a graphical user interface (GUI) by moving an on-screen representation of themselves. The GUI has a central region with interaction elements disposed outside of it, and those elements remain hidden until the user's representation aligns with the central region. When the system recognizes a gesture from the user, it alters the display of the GUI accordingly and outputs an application control. The key points are summarized below, followed by a brief illustrative sketch.

  • The technology enables user interaction with a GUI through the movement of a user representation.
  • The GUI consists of a central region and hidden interaction elements placed outside of it.
  • The hidden interaction elements become visible when the user's representation aligns with the central region.
  • The system recognizes the user's gestures to alter the display of the GUI and output an application control.
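The reveal behavior described above can be pictured with a minimal TypeScript sketch. All names and types here (Point, Region, InteractionElement, isAligned, updateVisibility) are illustrative assumptions rather than terms from the patent, and "aligned" is modeled simply as the representation falling inside the central region's bounds.

    // Hypothetical model: interaction elements disposed outside the central
    // region stay hidden until the user's representation aligns with it.
    interface Point { x: number; y: number; }
    interface Region { x: number; y: number; width: number; height: number; }
    interface InteractionElement { id: string; visible: boolean; }

    // "Aligned" is approximated as the representation lying inside the
    // central region's bounds; the patent does not specify the exact test.
    function isAligned(rep: Point, region: Region): boolean {
      return (
        rep.x >= region.x &&
        rep.x <= region.x + region.width &&
        rep.y >= region.y &&
        rep.y <= region.y + region.height
      );
    }

    // Show or hide the surrounding interaction elements based on alignment.
    function updateVisibility(
      rep: Point,
      central: Region,
      elements: InteractionElement[],
    ): void {
      const show = isAligned(rep, central);
      for (const el of elements) {
        el.visible = show;
      }
    }

    // Example (element ids are made up): the representation enters the
    // central region, so the hidden elements become visible.
    const central: Region = { x: 200, y: 150, width: 240, height: 180 };
    const elements: InteractionElement[] = [
      { id: 'element-left', visible: false },
      { id: 'element-right', visible: false },
    ];
    updateVisibility({ x: 320, y: 240 }, central, elements);
    console.log(elements.map((e) => e.visible)); // [ true, true ]

This is only a sketch of the reveal condition; the actual application could track the representation from camera input, animate the transition, or use a different notion of alignment.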

Potential Applications

  • User interfaces for mobile devices, computers, or other electronic devices.
  • Gaming applications that utilize gesture-based controls.
  • Virtual reality or augmented reality environments where users can interact with virtual objects.

Problems Solved

  • Enhances user interaction with a GUI by incorporating gesture recognition and dynamic display adjustments.
  • Provides a more intuitive and immersive user experience.
  • Reduces interface clutter by hiding interaction elements until they are relevant.

Benefits

  • Improves the user experience by allowing users to interact with a GUI through gestures and movements.
  • Simplifies the interface by hiding unnecessary interaction elements until they are needed.
  • Enables more efficient and intuitive control of applications and systems.


Original Abstract Submitted

A representation of a user can move with respect to a graphical user interface based on input of a user. The graphical user interface comprises a central region and interaction elements disposed outside of the central region. The interaction elements are not shown until the representation of the user is aligned with the central region. A gesture of the user is recognized, and, based on the recognized gesture, the display of the graphical user interface is altered and an application control is outputted.
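As a rough illustration of the last sentence of the abstract, the hypothetical sketch below maps a recognized gesture both to a change in the displayed GUI and to an emitted application control. The gesture names, the Display shape, and the ApplicationControl payload are assumptions made for this example, not details from the application.

    // Hypothetical mapping from a recognized gesture to (1) an alteration of
    // the displayed GUI and (2) an application control that is output.
    type Gesture = 'swipe-left' | 'swipe-right' | 'engage';

    interface ApplicationControl { action: string; }
    interface Display { highlightedElement: string | null; }

    function handleGesture(gesture: Gesture, display: Display): ApplicationControl {
      switch (gesture) {
        case 'swipe-left':
          display.highlightedElement = 'previous'; // alter the displayed GUI
          return { action: 'previous-item' };      // output an application control
        case 'swipe-right':
          display.highlightedElement = 'next';
          return { action: 'next-item' };
        case 'engage':
          display.highlightedElement = 'selected';
          return { action: 'select' };
      }
    }

    // Example: a recognized "engage" gesture highlights the current
    // selection and emits a "select" control to the application.
    const display: Display = { highlightedElement: null };
    console.log(handleGesture('engage', display)); // { action: 'select' }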