20240020934. BLENDING BODY MESH INTO EXTERNAL MESH simplified abstract (Snap Inc.)

BLENDING BODY MESH INTO EXTERNAL MESH

Organization Name

Snap Inc.

Inventor(s)

Matan Zohar of Rishon LeZion (IL)

Yanli Zhao of London (GB)

Brian Fulkerson of London (GB)

BLENDING BODY MESH INTO EXTERNAL MESH - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240020934 titled 'BLENDING BODY MESH INTO EXTERNAL MESH'.

Simplified Explanation

The patent application describes methods and systems for blending a three-dimensional (3D) body mesh of a real-world object into an external mesh associated with an augmented reality (AR) element in a video. The 3D body mesh tracks the real-world object's movement across the frames of the video, and a correspondence is automatically established between a portion of the body mesh and the external mesh. That portion is then blended into the external mesh according to a blending parameter, and as the blending proceeds, the video is modified to depict the real-world object changing into the AR element.

  • The patent application focuses on blending a 3D body mesh of a real-world object into an external mesh associated with an AR element in a video.
  • A 3D body mesh is generated that tracks the real-world object's movement across frames of the video.
  • A correspondence is established between a portion of the 3D body mesh and the external mesh.
  • A blending parameter controls how the portion of the 3D body mesh is blended into the external mesh (see the sketch after this list).
  • The video is modified to show the real-world object changing into the AR element as the blending occurs.
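
The core operation, blending corresponding vertices of the body mesh toward the external mesh under a single blending parameter, can be pictured as a simple linear interpolation. The sketch below is illustrative only: the NumPy vertex arrays, the hand-built correspondence map, and the function name blend_mesh_portion are assumptions for the example, not the implementation claimed in the application.

import numpy as np

def blend_mesh_portion(body_vertices, external_vertices, correspondence, alpha):
    """Interpolate corresponding body-mesh vertices toward the external mesh.

    body_vertices:     (N, 3) array of 3D body-mesh vertex positions
    external_vertices: (M, 3) array of external (AR) mesh vertex positions
    correspondence:    dict mapping body vertex index -> external vertex index
    alpha:             blending parameter; 0.0 keeps the body mesh,
                       1.0 fully matches the external mesh
    """
    blended = body_vertices.copy()
    for body_idx, ext_idx in correspondence.items():
        blended[body_idx] = ((1.0 - alpha) * body_vertices[body_idx]
                             + alpha * external_vertices[ext_idx])
    return blended

# Example: blend a two-vertex "portion" halfway toward the external mesh;
# vertex 2 has no correspondence and is left unchanged.
body = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
external = np.array([[0.0, 0.0, 1.0], [2.0, 0.0, 0.0]])
mapping = {0: 0, 1: 1}
print(blend_mesh_portion(body, external, mapping, alpha=0.5))

Sweeping alpha from 0 to 1 over successive frames is one plausible way the depicted object could appear to gradually change into the AR element.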

Potential applications of this technology:

  • Augmented reality experiences in videos and movies.
  • Virtual try-on experiences for fashion and cosmetics.
  • Interactive advertising campaigns.
  • Gaming and virtual reality applications.
  • Training simulations and educational content.

Problems solved by this technology:

  • Seamlessly blending real-world objects into augmented reality elements in videos.
  • Enhancing the visual quality and realism of augmented reality experiences.
  • Simplifying the process of integrating augmented reality elements into videos.
  • Allowing for dynamic and interactive transformations of real-world objects into augmented reality elements.

Benefits of this technology:

  • Improved user engagement and immersion in augmented reality experiences.
  • Enhanced visual effects and realism in videos and movies.
  • Increased flexibility and creativity in blending real-world objects with augmented reality elements.
  • Simplified production process for creating augmented reality content in videos.


Original Abstract Submitted

Methods and systems are disclosed for performing operations comprising: receiving a video that includes a depiction of a real-world object; generating a three-dimensional (3D) body mesh associated with the real-world object that tracks movement of the real-world object across frames of the video; obtaining an external mesh associated with an augmented reality (AR) element; automatically establishing a correspondence between a portion of the 3D body mesh associated with the real-world object and the external mesh; blending the portion of the 3D body mesh into the external mesh according to a blending parameter; and as the portion of the 3D body mesh is being blended into the external mesh, modifying the video to depict a portion of the real-world object being changed into the AR element.
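
Read as a pipeline, the claimed operations map naturally onto a per-frame loop: track the body mesh, establish a correspondence to the external mesh, blend according to the blending parameter, and re-render the frame. The skeleton below is a hypothetical illustration of that structure; track_body_mesh, establish_correspondence, and render_ar_frame are stand-in stubs, and the nearest-vertex correspondence and linearly ramped blending parameter are assumptions for the example, not details from the application.

import numpy as np

def track_body_mesh(frame):
    # Stand-in: a real system would fit a 3D body model to the person in the frame.
    return np.zeros((4, 3))

def establish_correspondence(body_mesh, external_mesh):
    # Stand-in: map each body vertex to its nearest external-mesh vertex.
    return {i: int(np.argmin(np.linalg.norm(external_mesh - v, axis=1)))
            for i, v in enumerate(body_mesh)}

def blend(body_mesh, external_mesh, mapping, alpha):
    # Interpolate corresponding vertices according to the blending parameter.
    out = body_mesh.copy()
    for i, j in mapping.items():
        out[i] = (1.0 - alpha) * body_mesh[i] + alpha * external_mesh[j]
    return out

def render_ar_frame(frame, mesh):
    # Stand-in: a real renderer would composite the blended mesh into the frame.
    return frame

frames = [np.zeros((8, 8, 3)) for _ in range(30)]   # placeholder video frames
external_mesh = np.random.rand(6, 3)                # placeholder AR-element mesh

output = []
for k, frame in enumerate(frames):
    alpha = k / (len(frames) - 1)                   # ramp the blend across the clip
    body_mesh = track_body_mesh(frame)
    mapping = establish_correspondence(body_mesh, external_mesh)
    blended = blend(body_mesh, external_mesh, mapping, alpha)
    output.append(render_ar_frame(frame, blended))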