INTERNATIONAL BUSINESS MACHINES CORPORATION (20240303931). GENERATING 3D HAND KEYPOINTS FOR A MIXED REALITY AVATAR simplified abstract
GENERATING 3D HAND KEYPOINTS FOR A MIXED REALITY AVATAR
Organization Name
INTERNATIONAL BUSINESS MACHINES CORPORATION
Inventor(s)
Wei Jun Zheng of Shanghai (CN)
GENERATING 3D HAND KEYPOINTS FOR A MIXED REALITY AVATAR - A simplified explanation of the abstract
This abstract first appeared for US patent application 20240303931, titled 'GENERATING 3D HAND KEYPOINTS FOR A MIXED REALITY AVATAR'.
Simplified Explanation
The patent application describes a method, computer system, and computer program product for mixed reality. The system receives 3D keypoints for a user's visible hand joints, generates and iteratively refines keypoints for the uncapturable hand joints using a trained generative model, and renders the result with a 3D virtual hand modeler.
- Receiving 3D keypoints of the user's visible hand joints
- Using random noise sampled from a unit normal distribution as initial keypoints for the uncapturable hand joints
- Inputting the received and initial keypoints, in a preset order, into a trained 3D hand joint generative model
- Iteratively refining the uncapturable hand joints with the generative model
- Checking whether the generated keypoints of the uncapturable hand are synchronized with those of the capturable hand
- Rendering the generated 3D keypoints for the uncapturable hand joints with a 3D virtual hand modeler
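The steps above can be sketched as a short pipeline. This is a minimal illustration, not the patented implementation: the joint count (21, a common hand-skeleton convention), the centroid-pull stand-in for the trained generative model, and all function names are assumptions introduced here for clarity.

```python
import numpy as np

NUM_JOINTS = 21  # assumed; a common hand-skeleton convention, not stated in the abstract


def initial_keypoints(num_missing, rng):
    """Sample initial keypoints for the uncapturable joints from a unit
    normal distribution, as the abstract describes."""
    return rng.standard_normal((num_missing, 3))


def refine(visible, missing, steps=10):
    """Placeholder iterative refinement: the real system uses a trained 3D
    hand-joint generative model; here we merely pull the missing joints
    toward the centroid of the visible ones to show the loop structure."""
    target = visible.mean(axis=0)
    for _ in range(steps):
        missing = missing + 0.5 * (target - missing)
    return missing


def generate_hand_keypoints(visible, num_missing, seed=0):
    """Assemble visible and generated keypoints in a preset joint order."""
    rng = np.random.default_rng(seed)
    missing = initial_keypoints(num_missing, rng)
    missing = refine(visible, missing)
    return np.concatenate([visible, missing], axis=0)
```

A caller would pass the keypoints captured for the visible joints and the number of joints still to be generated, e.g. `generate_hand_keypoints(visible, 21 - len(visible))`.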
Potential Applications
This technology can be used in virtual reality applications, gaming, medical simulations, and interactive design tools.
Problems Solved
The technology addresses the challenge of accurately capturing and rendering hand movements in mixed reality environments.
Benefits
- Improved realism in virtual environments
- Enhanced user interaction and immersion
- Precise hand tracking for various applications
Commercial Applications
This technology can be utilized in virtual reality gaming, medical training simulations, interactive design software, and virtual collaboration platforms. It has the potential to enhance user experiences and increase the realism of virtual environments, leading to broader adoption in the entertainment, healthcare, and design industries.
Prior Art
Further research can be conducted in the field of hand tracking technology in mixed reality environments to explore existing solutions and advancements.
Frequently Updated Research
Researchers are constantly working on improving hand tracking technology in mixed reality to enhance user experiences and interaction in virtual environments.
Questions about Mixed Reality Hand Tracking
How does this technology improve user interaction in virtual reality environments?
This technology enhances user interaction by accurately capturing and rendering hand movements, allowing for more realistic and immersive experiences.
What are the potential applications of this technology beyond gaming?
This technology can be applied in medical simulations, interactive design tools, virtual collaboration platforms, and various other industries to improve user experiences and enhance realism in virtual environments.
Original Abstract Submitted
According to one embodiment, a method, computer system, and computer program product for mixed reality is provided. The present invention may include receiving 3D hand keypoints (keypoints) of a user's visible hand joints from the user's capturable hand, and visible hand joints, if any, from the user's uncapturable hand; using random noise sampled with a unit normal distribution as initial keypoints for the uncapturable hand joints from the user's uncapturable hand; inputting the received and the initial keypoints, in a preset order, into a trained 3D hand joint generative model; performing an iterative refinement of the uncapturable hand joints from the user's uncapturable hand using the trained 3D hand joint generative model; identifying whether generated keypoints of the user's uncapturable hand are synchronized with the keypoints of the user's capturable hand; and rendering the generated 3D keypoints for the user's uncapturable hand joints using a 3D virtual hand modeler.
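The abstract leaves the synchronization test between the generated and capturable hands unspecified. As a hedged illustration only, the sketch below compares inter-joint distances of the two hands as one plausible proxy; the function name, the tolerance, and the distance-based criterion are all assumptions, not the patent's method.

```python
import numpy as np


def hands_synchronized(capturable, generated, tol=0.15):
    """Treat the hands as synchronized when their consecutive inter-joint
    distances agree within a tolerance (illustrative criterion only)."""
    d_cap = np.linalg.norm(np.diff(capturable, axis=0), axis=1)
    d_gen = np.linalg.norm(np.diff(generated, axis=0), axis=1)
    return bool(np.max(np.abs(d_cap - d_gen)) <= tol)
```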