Unknown Organization (20240295922). GESTURE BASED USER INTERFACES, APPARATUSES AND SYSTEMS USING EYE TRACKING, HEAD TRACKING, HAND TRACKING, FACIAL EXPRESSIONS AND OTHER USER ACTIONS simplified abstract

From WikiPatents

GESTURE BASED USER INTERFACES, APPARATUSES AND SYSTEMS USING EYE TRACKING, HEAD TRACKING, HAND TRACKING, FACIAL EXPRESSIONS AND OTHER USER ACTIONS

Organization Name

Unknown Organization

Inventor(s)

Uday Parshionikar of Mason OH (US)

GESTURE BASED USER INTERFACES, APPARATUSES AND SYSTEMS USING EYE TRACKING, HEAD TRACKING, HAND TRACKING, FACIAL EXPRESSIONS AND OTHER USER ACTIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240295922, titled 'GESTURE BASED USER INTERFACES, APPARATUSES AND SYSTEMS USING EYE TRACKING, HEAD TRACKING, HAND TRACKING, FACIAL EXPRESSIONS AND OTHER USER ACTIONS'.

The patent application discloses user interaction concepts, principles, and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, tightening muscles, mental activity, and other user actions. It also includes concepts, principles, and algorithms for enabling hands-free and voice-free interaction with electronic devices. Apparatuses, systems, computer implementable methods, and non-transient computer storage media storing instructions implementing the disclosed concepts are also included. The patent further discloses gestures for systems using eye gaze and head tracking that can be used with augmented, mixed, or virtual reality, mobile, or desktop computing. Additionally, the use of periods of limited activity and consecutive user actions in orthogonal axes, generation of command signals based on start and end triggers, and methods for coarse as well as fine modification of objects are disclosed.
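The filing itself contains no source code. As an illustrative sketch only (the dwell threshold, motion signal, and use of a facial expression as the trigger are assumptions, not details from the application), the disclosed idea of generating command signals from start and end triggers gated by "periods of limited activity" might look like:

```python
DWELL_SECONDS = 0.5    # hypothetical dwell threshold
MOTION_EPSILON = 2.0   # max head speed counted as "limited activity" (assumed units: deg/s)

def detect_command(samples):
    """Scan (timestamp, head_speed, expression_active) samples and emit
    command start/end events: a facial-expression trigger only fires
    after a dwell, i.e. a period of limited head activity."""
    command_open = False
    dwell_start = None
    events = []
    for t, head_speed, expression in samples:
        # Track periods of limited activity (head nearly still).
        if head_speed <= MOTION_EPSILON:
            dwell_start = t if dwell_start is None else dwell_start
        else:
            dwell_start = None
        dwelled = dwell_start is not None and (t - dwell_start) >= DWELL_SECONDS
        # The expression acts as the start/end trigger, but only after a dwell,
        # which filters out accidental expressions made while moving.
        if expression and dwelled:
            events.append(("start" if not command_open else "end", t))
            command_open = not command_open
            dwell_start = None
    return events
```

Requiring a dwell before each trigger is one plausible reading of how the filing combines limited-activity periods with start/end triggers to reject unintentional gestures.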

  • User interaction concepts, principles, and algorithms for various gestures
  • Hands-free and voice-free interaction with electronic devices
  • Apparatuses, systems, computer implementable methods, and storage media storing instructions
  • Gestures for systems using eye gaze and head tracking
  • Use of periods of limited activity and consecutive user actions
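The "consecutive user actions in orthogonal axes" listed above is not elaborated with code in the filing; a minimal sketch, assuming head-motion deltas and a hypothetical noise threshold, could classify such a two-part gesture like this:

```python
def is_orthogonal_sequence(motions, axis_threshold=3.0):
    """Detect a gesture made of two consecutive motions along orthogonal
    axes, e.g. a horizontal head turn followed by a vertical nod.
    `motions` is a list of (dx, dy) deltas; `axis_threshold` is an
    assumed noise floor, not a value from the application."""
    axes = []
    for dx, dy in motions:
        if abs(dx) >= axis_threshold and abs(dx) > abs(dy):
            axis = "x"
        elif abs(dy) >= axis_threshold:
            axis = "y"
        else:
            continue  # below threshold: treat as noise
        if not axes or axes[-1] != axis:
            axes.append(axis)  # record only axis changes
    return axes[:2] in (["x", "y"], ["y", "x"])
```

Demanding motion on one axis followed by motion on the perpendicular axis is one way such a design can distinguish deliberate gestures from incidental single-axis head movement.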
Potential Applications

  • Virtual reality systems
  • Augmented reality applications
  • Hands-free interfaces for electronic devices
  • Gaming technology
  • Accessibility tools for individuals with disabilities

Problems Solved

  • Enhancing user interaction with electronic devices
  • Providing alternative input methods
  • Improving user experience in virtual and augmented reality environments

Benefits

  • Increased accessibility for users
  • Enhanced user experience
  • Improved efficiency in interacting with electronic devices
  • Potential for new innovative applications in various industries

Commercial Applications

"Enhancing User Interaction in Virtual Reality Systems: A New Approach to Gestures and Commands"

Questions about User Interaction Concepts

1. How do these gestures improve user experience in virtual reality environments?
2. What are the potential implications of hands-free and voice-free interaction in gaming technology?

Frequently Updated Research

Stay updated on the latest research in gesture recognition, eye tracking, and head tracking to ensure optimal implementation of these interaction concepts in various applications.


Original Abstract Submitted

user interaction concepts, principles and algorithms for gestures involving facial expressions, motion or orientation of body parts, eye gaze, tightening muscles, mental activity, and other user actions are disclosed. user interaction concepts, principles and algorithms for enabling hands-free and voice-free interaction with electronic devices are disclosed. apparatuses, systems, computer implementable methods, and non-transient computer storage media storing instructions, implementing the disclosed concepts, principles and algorithms are disclosed. gestures for systems using eye gaze and head tracking that can be used with augmented, mixed or virtual reality, mobile or desktop computing are disclosed. use of periods of limited activity and consecutive user actions in orthogonal axes is disclosed. generation of command signals based on start and end triggers is disclosed. methods for coarse as well as fine modification of objects are disclosed.