18353579. REAL-TIME PHOTOREALISTIC VIEW RENDERING ON AUGMENTED REALITY (AR) DEVICE simplified abstract (SAMSUNG ELECTRONICS CO., LTD.)

REAL-TIME PHOTOREALISTIC VIEW RENDERING ON AUGMENTED REALITY (AR) DEVICE

Organization Name

SAMSUNG ELECTRONICS CO., LTD.

Inventor(s)

Yingen Xiong of Mountain View, CA (US)

Christopher A. Peri of Mountain View, CA (US)

REAL-TIME PHOTOREALISTIC VIEW RENDERING ON AUGMENTED REALITY (AR) DEVICE - A simplified explanation of the abstract

This abstract first appeared for US patent application 18353579, titled 'REAL-TIME PHOTOREALISTIC VIEW RENDERING ON AUGMENTED REALITY (AR) DEVICE'.

Simplified Explanation

The patent application describes a method for rendering scenes on an augmented reality (AR) device using sparse feature vectors. Here is a simplified explanation of the abstract:

  • The method involves capturing images of a scene and obtaining position data of the device that captures the images.
  • Position data and direction data associated with camera rays passing through keyframes of the images are determined.
  • A position-dependent multilayer perceptron (MLP) and a direction-dependent MLP are used to create sparse feature vectors.
  • The sparse feature vectors are stored in at least one data structure.
  • When a request is received to render the scene on an AR device with a specific viewing direction, the scene is rendered using the sparse feature vectors in the data structure (a minimal end-to-end sketch of this flow follows the list).
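
The abstract does not say how these steps are implemented. One plausible reading treats it like NeRF-style baking: the position-dependent MLP produces feature vectors at samples along keyframe camera rays, weak features are dropped (the sparsity), and the survivors are stored in a voxel-keyed structure so that a later render request is served mostly by lookups. Note one deliberate simplification: the abstract has both MLPs creating the stored vectors, whereas this sketch applies the direction-dependent MLP at render time instead, in the style of deferred NeRF rendering. Every name, network size, and threshold below is an illustrative assumption, not a detail from the filing.

  # Hypothetical sketch of the bake/render split suggested by the abstract.
  # Network sizes, the threshold, and the voxel-hash store are all assumptions.
  import numpy as np

  rng = np.random.default_rng(0)

  def init_mlp(d_in, d_hidden, d_out):
      """Random two-layer MLP parameters (sizes are illustrative)."""
      return [(rng.normal(scale=0.1, size=(d_in, d_hidden)), np.zeros(d_hidden)),
              (rng.normal(scale=0.1, size=(d_hidden, d_out)), np.zeros(d_out))]

  def mlp(params, x):
      """Tiny perceptron: ReLU(x @ W1 + b1) @ W2 + b2."""
      (W1, b1), (W2, b2) = params
      return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

  pos_mlp = init_mlp(3, 32, 8)  # position-dependent MLP: 3-D point -> 8-D feature
  dir_mlp = init_mlp(3, 32, 8)  # direction-dependent MLP: ray direction -> 8-D feature

  def bake_features(ray_origins, ray_dirs, n_samples=16, grid_res=64, tau=0.05):
      """Run the position-dependent MLP at samples along keyframe camera rays;
      keep only features whose magnitude clears tau, keyed by voxel index."""
      store = {}  # voxel index tuple -> feature vector
      t = np.linspace(0.0, 1.0, n_samples)
      for o, d in zip(ray_origins, ray_dirs):
          pts = o + t[:, None] * d               # sample points along one ray
          for p, f in zip(pts, mlp(pos_mlp, pts)):
              if np.linalg.norm(f) > tau:        # sparsity: drop weak features
                  store[tuple((p * grid_res).astype(int))] = f
      return store

  def render(store, view_dir, query_points, grid_res=64):
      """Serve a render request for a viewing direction: cheap lookups into the
      baked store combined with one small direction-dependent decode."""
      view_feat = mlp(dir_mlp, view_dir[None])[0]
      return np.array([store.get(tuple((p * grid_res).astype(int)), np.zeros(8))
                       + view_feat for p in query_points])

  # Synthetic keyframe rays and a render request:
  origins = rng.uniform(size=(4, 3))
  dirs = rng.normal(size=(4, 3))
  dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
  store = bake_features(origins, dirs)
  pixels = render(store, dirs[0], rng.uniform(size=(5, 3)))
  print(f"{len(store)} sparse features stored; rendered array of shape {pixels.shape}")

The point the sketch captures is the split the abstract implies: the heavier position-dependent work happens once, ahead of time, while a render request touches only the stored vectors plus a small direction-dependent term.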

Potential Applications:

  • Augmented reality devices: Photorealistic views that follow the user's viewing direction can be rendered directly on the device, improving the sense of presence.
  • Gaming: AR games can render immersive environments from the player's current viewpoint.

Problems Solved:

  • Efficient rendering: Scenes are reconstructed from precomputed sparse feature vectors rather than by full network evaluation at render time, reducing the computational load on the device.
  • Real-time rendering: Because the position and direction information is baked into stored features ahead of time, scenes can be rendered in real time on AR devices (see the rough cost comparison after this list).
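
To make the real-time claim concrete, here is a back-of-the-envelope comparison of per-frame arithmetic with and without baking. The resolution, sample count, and layer widths are assumptions chosen only to show the scale, not numbers from the application.

  # Illustrative per-frame cost: evaluating a large MLP per sample vs. looking
  # up baked features and running only a small direction-dependent decode.
  width, height, samples_per_ray = 1280, 720, 32   # assumed AR frame budget
  queries = width * height * samples_per_ray

  # Roughly NeRF-scale network, assumed: 3 -> 256, seven 256 -> 256, 256 -> 8.
  mlp_flops = 2 * (3 * 256 + 7 * 256 * 256 + 256 * 8)
  # Lookup path: only a small 3 -> 32 -> 8 direction decode per query.
  lookup_flops = 2 * (3 * 32 + 32 * 8)

  print(f"full MLP per frame : {queries * mlp_flops / 1e12:.1f} TFLOPs")
  print(f"baked lookups/frame: {queries * lookup_flops / 1e9:.1f} GFLOPs")

Under these assumptions, about three orders of magnitude separate the two paths, which is the kind of gap that moves this rendering from offline to real time on mobile-class hardware.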

Benefits:

  • Improved user experience: More accurate, photorealistic rendering that tracks the viewing direction deepens the user's immersion in the augmented reality environment.
  • Reduced computational load: Storing only sparse feature vectors trims both memory and per-frame work, making real-time rendering practical on resource-constrained AR hardware (a rough storage estimate follows this list).
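
How much the sparsity saves depends on scene occupancy, which the filing does not quantify; the estimate below assumes an illustrative grid resolution, feature width, and occupancy fraction purely to show the order of magnitude.

  # Illustrative storage estimate: dense feature grid vs. sparse storage.
  grid_res = 256            # assumed voxels per axis
  feat_dim = 8              # assumed feature-vector width
  bytes_per_float = 4

  dense_bytes = grid_res ** 3 * feat_dim * bytes_per_float
  occupancy = 0.02          # assume ~2% of voxels hold surface features
  sparse_bytes = int(dense_bytes * occupancy)

  print(f"dense grid  : {dense_bytes / 2**30:.1f} GiB")
  print(f"sparse (2%) : {sparse_bytes / 2**20:.0f} MiB")

At these assumed numbers the stored structure shrinks from half a gibibyte to roughly ten mebibytes, a size far more plausible for an AR device's memory budget.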


Original Abstract Submitted

A method includes obtaining images of a scene and corresponding position data of a device that captures the images. The method also includes determining position data and direction data associated with camera rays passing through keyframes of the images. The method further includes using a position-dependent multilayer perceptron (MLP) and a direction-dependent MLP to create sparse feature vectors. The method also includes storing the sparse feature vectors in at least one data structure. The method further includes receiving a request to render the scene on an augmented reality (AR) device associated with a viewing direction. In addition, the method includes rendering the scene associated with the viewing direction using the sparse feature vectors in the at least one data structure.