Snap Inc. (20240338913). CONTROLLING INTERACTIVE FASHION BASED ON BODY GESTURES simplified abstract

From WikiPatents

CONTROLLING INTERACTIVE FASHION BASED ON BODY GESTURES

Organization Name

Snap Inc.

Inventor(s)

Itamar Berger of Hod Hasharon (IL)

Gal Dudovitch of Tel Aviv (IL)

Gal Sasson of Kibbutz Ayyelet Hashahar (IL)

Ma'ayan Mishin Shuvi of Tel Aviv (IL)

Matan Zohar of Rishon LeZion (IL)

CONTROLLING INTERACTIVE FASHION BASED ON BODY GESTURES - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240338913, titled 'CONTROLLING INTERACTIVE FASHION BASED ON BODY GESTURES'.

The patent application describes methods and systems for enhancing videos that feature fashion items with augmented reality elements, controlled by gestures performed by the person in the video. The key operations are:

  • Receiving a video with a person wearing a fashion item
  • Generating a segmentation of the fashion item worn by the person
  • Applying augmented reality elements to the fashion item based on the segmentation
  • Detecting gestures performed by the person in the video
  • Modifying the augmented reality elements based on the gestures
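
The five operations above can be sketched as a minimal, hypothetical pipeline. Everything here is illustrative: the patent does not specify an implementation, and all class, function, and gesture names (`FashionItemSegmentation`, `AREffect`, `raise_hand`, `swipe_left`) are assumptions made for the sketch.

```python
# Hypothetical sketch of the gesture-driven AR pipeline summarized above.
# All names and gesture mappings are illustrative assumptions, not the
# patent's actual method.
from dataclasses import dataclass, field


@dataclass
class AREffect:
    """An augmented reality element applied to a segmented fashion item."""
    name: str
    intensity: float = 1.0
    color: str = "default"


@dataclass
class FashionItemSegmentation:
    """Stand-in for a pixel-level segmentation of the fashion item.

    A real system would hold a mask; here a label suffices for the sketch.
    """
    label: str
    effects: list = field(default_factory=list)


def apply_effect(seg: FashionItemSegmentation, effect: AREffect) -> None:
    # Step 3: attach an AR element to the segmented fashion item.
    seg.effects.append(effect)


def modify_effects(seg: FashionItemSegmentation, gesture: str) -> None:
    # Step 5: a detected gesture modifies the already-applied AR elements.
    for effect in seg.effects:
        if gesture == "raise_hand":
            effect.intensity = min(effect.intensity + 0.5, 2.0)
        elif gesture == "swipe_left":
            effect.color = "blue"


# Simulate one frame's worth of processing:
shirt = FashionItemSegmentation(label="shirt")  # steps 1-2: video in, item segmented
apply_effect(shirt, AREffect(name="glow"))      # step 3: AR element applied
modify_effects(shirt, "raise_hand")             # steps 4-5: gesture detected, effect modified
```

In this toy version the gesture simply brightens the applied effect; the actual system would map gestures to arbitrary visual changes on the segmented region.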

Potential Applications

  • Fashion industry: virtual try-ons
  • Entertainment industry: interactive videos
  • Marketing and advertising: engaging content

Problems Solved

  • Enhancing user engagement with fashion content
  • Providing interactive and personalized experiences
  • Improving virtual shopping experiences

Benefits

  • Increased user interaction and engagement
  • Enhanced visual representation of fashion items
  • Personalized and interactive content

Commercial Applications

"Augmented Reality Fashion Enhancement Technology: Revolutionizing Virtual Try-Ons and Interactive Videos"

Questions About the Technology

 1. How does the technology differentiate between different gestures performed by individuals in the video?
 2. What are the limitations of applying augmented reality elements to fashion items in videos?


Original Abstract Submitted

methods and systems are disclosed for performing operations comprising: receiving a video that includes a depiction of a person wearing a fashion item; generating a segmentation of the fashion item worn by the person depicted in the video; applying one or more augmented reality elements to the fashion item worn by the person based on the segmentation of the fashion item worn by the person; detecting a gesture performed by the person in the video; and modifying the one or more augmented reality elements that have been applied to the fashion item worn by the person based on the gesture performed by the person.