18611141. GESTURE-BASED SHARED AR SESSION CREATION simplified abstract (Snap Inc.)
Contents
- 1 GESTURE-BASED SHARED AR SESSION CREATION
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 GESTURE-BASED SHARED AR SESSION CREATION - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Key Features and Innovation
- 1.6 Potential Applications
- 1.7 Problems Solved
- 1.8 Benefits
- 1.9 Commercial Applications
- 1.10 Prior Art
- 1.11 Frequently Updated Research
- 1.12 Questions about Gesture-Based Shared AR Session Creation Technology
- 1.13 Original Abstract Submitted
GESTURE-BASED SHARED AR SESSION CREATION
Organization Name
Snap Inc.
Inventor(s)
Piers George Cowburn of London (GB)
Isac Andreas Müller Sandvik of London (GB)
GESTURE-BASED SHARED AR SESSION CREATION - A simplified explanation of the abstract
This abstract first appeared for US patent application 18611141 titled 'GESTURE-BASED SHARED AR SESSION CREATION'.
Simplified Explanation
The patent application describes a method for creating a shared Augmented Reality (AR) session based on a gesture. This involves analyzing motion data from one user's device, generated by observing another user's gesture, and comparing it with motion data captured by the second user's device to determine a match.
- The server receives observed motion data from the first user's device.
- The first device generates the observed motion data by analyzing images of the second user performing a gesture.
- The server receives captured motion data from the second user's device.
- The captured motion data is recorded by a sensor in the second device.
- The server compares the observed motion data with the captured motion data to determine whether they match.
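The patent does not specify how the server decides that the two motion traces match. As a purely illustrative sketch, the comparison step could be implemented with dynamic time warping (DTW), which tolerates small timing differences between the vision-derived trace and the sensor-derived trace. The function names, the 1-D trace representation, and the threshold value below are all assumptions, not details from the application:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping over two 1-D motion traces."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = minimal alignment cost of a[:i] against b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def is_match(observed, captured, threshold=0.2):
    """Hypothetical server-side check: normalize DTW cost by trace
    length so the score is independent of how long the gesture is."""
    score = dtw_distance(observed, captured) / max(len(observed), len(captured))
    return score <= threshold


# Example: observed trace from the first device's camera analysis vs.
# captured trace from the second device's motion sensor (toy values).
observed = [0.0, 0.5, 1.0, 0.5, 0.0]  # e.g. estimated wrist height over time
captured = [0.0, 0.4, 1.1, 0.6, 0.1]  # e.g. sensor-derived trajectory
print(is_match(observed, captured))
```

In practice the traces would be multi-dimensional (3-axis accelerometer vs. 2-D image keypoints) and would need resampling to a common rate before comparison; this sketch keeps them 1-D for clarity.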
Key Features and Innovation
- Creation of a shared AR session based on a gesture.
- Analysis of motion data from one user's device.
- Comparison of observed motion data with captured motion data.
- Utilization of sensors in devices to record motion data.
Potential Applications
This technology can be used in:
- Collaborative AR experiences.
- Virtual training sessions.
- Interactive gaming applications.
- Remote communication platforms.
- Gesture-based control systems.
Problems Solved
- Facilitates seamless sharing of AR experiences.
- Enhances user interaction in virtual environments.
- Improves gesture recognition accuracy.
- Enables real-time collaboration between users.
- Enhances user engagement in AR applications.
Benefits
- Enhanced user experience in AR environments.
- Improved accuracy in gesture recognition.
- Facilitates real-time collaboration between users.
- Enables interactive and engaging virtual experiences.
- Enhances the usability of AR applications.
Commercial Applications
This technology can be applied in various commercial sectors such as:
- Entertainment and gaming industry.
- Education and training sector.
- Communication and collaboration platforms.
- Healthcare for remote consultations.
- Industrial applications for virtual simulations.
Prior Art
Readers can explore prior art related to shared AR sessions, gesture recognition technologies, and collaborative AR applications to gain a deeper understanding of the technological landscape in this field.
Frequently Updated Research
Readers can follow ongoing research on shared AR session establishment, gesture recognition systems, and collaborative virtual environments to track how this area of technology develops.
Questions about Gesture-Based Shared AR Session Creation Technology
1. How does this technology improve user interaction in virtual environments?
- This technology enhances user interaction by enabling real-time collaboration and shared experiences based on gestures.
2. What are the potential commercial applications of this gesture-based shared AR session creation technology?
- The technology can be utilized in entertainment, education, communication, healthcare, and industrial sectors for various applications.
Original Abstract Submitted
Method of creating shared AR session based on a gesture starts with server receiving observed motion data from first device associated with first user. First device generating observed motion data based on an analysis of data stream comprising images of second user performing a gesture. Second user being associated with second device. Server receiving from second device captured motion data that corresponds to the gesture. Captured motion data being recorded by a sensor included in second device. Server determines whether there is a match between observed motion data from first device and captured motion data from second device.