Snap Inc. (20240248546). CONTROLLING AUGMENTED REALITY EFFECTS THROUGH MULTI-MODAL HUMAN INTERACTION simplified abstract

From WikiPatents

CONTROLLING AUGMENTED REALITY EFFECTS THROUGH MULTI-MODAL HUMAN INTERACTION

Organization Name

Snap Inc.

Inventor(s)

Jonathan Solichin of Arcadia, CA (US)

Xinyao Wang of San Francisco, CA (US)

CONTROLLING AUGMENTED REALITY EFFECTS THROUGH MULTI-MODAL HUMAN INTERACTION - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240248546, titled 'CONTROLLING AUGMENTED REALITY EFFECTS THROUGH MULTI-MODAL HUMAN INTERACTION'.

The patent application describes a multi-modal interaction system that combines hand gestures and voice commands to control augmented reality experiences on a computer device.

  • The system displays AR objects on a graphical user interface and shows textual cues that guide the user's interaction.
  • Users interact with the AR objects using hand gestures combined with voice commands.
  • The system modifies a subset of the AR objects based on the received gesture and command.
  • The modified AR objects are then redisplayed on the GUI for further interaction.

Potential Applications: This technology can be used in gaming, education, training simulations, and virtual tours.

Problems Solved: Enhances user interaction with AR experiences, improves user engagement, and provides a more immersive experience.

Benefits: Increased user engagement, enhanced user experience, improved interactivity, and more intuitive controls.

Commercial Applications: This technology can be utilized in the gaming industry, educational software development, virtual reality training programs, and tourism applications.

Prior Art: Researchers can explore existing patents related to multi-modal interaction systems, AR technology, and user interface design.

Frequently Updated Research: Stay informed about advancements in AR technology, user interface design, and interactive systems to enhance the functionality of the multi-modal interaction system.

Questions about Multi-Modal Interaction Systems:

  • How does this technology improve user engagement with augmented reality experiences?
  • What are the potential commercial applications of this multi-modal interaction system?


Original Abstract Submitted

systems and methods herein describe a multi-modal interaction system. the multi-modal interaction system, receives a selection of an augmented reality (ar) experience within an application on a computer device, displays a set of ar objects associated with the ar experience on a graphical user interface (gui) of the computer device, display textual cues associated with the set of augmented reality objects on the gui, receives a hand gesture and a voice command, modifies a subset of augmented reality objects of the set of augmented reality objects based on the hand gesture and the voice command, and displays the modified subset of augmented reality objects on the gui.
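The abstract's sequence of steps (select an experience, display its AR objects and textual cues, receive a hand gesture and voice command, modify a subset of objects, redisplay them) can be illustrated with a minimal sketch. The class and method names below, and the interpretation of the gesture as selecting an object and the voice command as supplying a color, are hypothetical illustrations, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ARObject:
    """A hypothetical AR object with a name and mutable display properties."""
    name: str
    properties: dict = field(default_factory=dict)

class MultiModalInteractionSystem:
    """Sketch of the claimed flow: select an AR experience, expose its
    objects and textual cues, then modify a subset of those objects
    based on a hand gesture combined with a voice command."""

    def __init__(self, experiences: dict):
        self.experiences = experiences            # experience name -> AR objects
        self.active_objects: list = []

    def select_experience(self, name: str) -> None:
        # Receive a selection of an AR experience and "display" its objects.
        self.active_objects = self.experiences[name]

    def textual_cues(self) -> list:
        # Textual cues guiding the user for each displayed object.
        return [f"Say a color and point at the {obj.name}"
                for obj in self.active_objects]

    def handle_interaction(self, hand_gesture: str, voice_command: str) -> list:
        # The gesture selects the subset; the voice command supplies
        # the modification (here, a color change).
        subset = [o for o in self.active_objects if o.name == hand_gesture]
        for obj in subset:
            obj.properties["color"] = voice_command
        # Return the modified subset for redisplay on the GUI.
        return subset

# Usage: point at the "balloon" while saying "red".
system = MultiModalInteractionSystem(
    {"party": [ARObject("balloon"), ARObject("confetti")]}
)
system.select_experience("party")
modified = system.handle_interaction(hand_gesture="balloon", voice_command="red")
print(modified[0].properties)  # → {'color': 'red'}
```

In a real system the gesture and command would come from camera-based hand tracking and speech recognition; fusing the two modalities into one modification step is the core idea the abstract describes.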