US Patent Application 17736648. RGB-IR DATA PROCESSING FOR AUTONOMOUS SYSTEMS AND APPLICATIONS simplified abstract

From WikiPatents

RGB-IR DATA PROCESSING FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

Organization Name

NVIDIA Corporation

Inventor(s)

Samuel Hung of Santa Clara CA (US)

Sean Midthun Pieper of Waldport OR (US)

Eric Dujardin of San Jose CA (US)

Sung Hyun Hwang of San Jose CA (US)

RGB-IR DATA PROCESSING FOR AUTONOMOUS SYSTEMS AND APPLICATIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 17736648, titled 'RGB-IR DATA PROCESSING FOR AUTONOMOUS SYSTEMS AND APPLICATIONS'.

Simplified Explanation

The patent application describes a system that processes image data from an RGB-IR sensor in an automobile.

  • The system blends infrared (IR) data and visible light data to generate optimal images based on current light levels.
  • A scene detection value is computed by comparing the IR values and visible light values in the image data.
  • The system determines corrections to apply to the image data, such as infrared correction, color correction, and color saturation.
  • The image data is transformed based on the determined corrections.
  • The transformed image data provides more information for low light scenes, resulting in higher quality images.
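The steps above can be illustrated with a minimal sketch. Note that the patent does not disclose specific formulas; the `scene_detection_value` and `blend_rgb_ir` functions below, and the particular blend and desaturation math they use, are hypothetical stand-ins for the blending and correction steps described in the abstract.

```python
import numpy as np

def scene_detection_value(rgb: np.ndarray, ir: np.ndarray) -> float:
    """Hypothetical scene metric: ratio of mean IR intensity to mean
    visible luminance. Higher values suggest an IR-dominant (low-light) scene."""
    luminance = rgb.mean(axis=-1)
    return float(ir.mean() / (luminance.mean() + 1e-6))

def blend_rgb_ir(rgb: np.ndarray, ir: np.ndarray, scene_value: float,
                 max_ir_weight: float = 0.8) -> np.ndarray:
    """Blend IR data into the visible image and desaturate proportionally,
    mimicking the infrared-correction and saturation steps described above."""
    # Map the scene value to an IR blend weight in [0, max_ir_weight].
    ir_weight = max_ir_weight * min(scene_value, 1.0)
    # Infrared correction: mix IR intensity into each color channel.
    blended = (1.0 - ir_weight) * rgb + ir_weight * ir[..., None]
    # Color saturation: pull colors toward gray as IR dominates the scene.
    gray = blended.mean(axis=-1, keepdims=True)
    saturation = 1.0 - ir_weight
    return gray + saturation * (blended - gray)

# Example: a dim visible image (0.1) with a strong IR signal (0.6)
rgb = np.full((4, 4, 3), 0.1)
ir = np.full((4, 4), 0.6)
s = scene_detection_value(rgb, ir)
out = blend_rgb_ir(rgb, ir, s)
```

In this toy scene the IR signal dominates the visible luminance, so the scene detection value is high and the output leans heavily on the IR data, which is the low-light behavior the application describes.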


Original Abstract Submitted

A system, such as for use in an automobile, is configured to process image data that includes infrared values and visible light values (e.g., data generated by a red, green, blue, infrared (RGB-IR) sensor). The system determines how to blend IR data and visible light data together to generate optimal images according to current light levels. In embodiments, the system computes a scene detection value for the image data based on a comparison between the infrared values and the visible light values. The system can then determine an amount of infrared correction, a color correction factor, a color saturation factor, etc. to apply to the image data. The system then transforms the image data based on the amount of infrared correction, the color correction factor, the color saturation factor, etc. The transformed image data includes more information for low light scenes than is traditionally available, and thus produces higher quality images in embodiments.
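The abstract describes deriving an amount of infrared correction, a color correction factor, and a color saturation factor from the scene detection value. One plausible (and entirely hypothetical) realization is to interpolate between "day" and "night" presets; the thresholds and factor formulas below are illustrative assumptions, not the patented method:

```python
def correction_factors(scene_value: float,
                       day_threshold: float = 0.1,
                       night_threshold: float = 0.5) -> dict:
    """Map a scene detection value to correction factors by linearly
    interpolating between hypothetical 'day' and 'night' presets."""
    # t = 0.0 -> full daylight, t = 1.0 -> fully IR-dominant night scene.
    t = (scene_value - day_threshold) / (night_threshold - day_threshold)
    t = max(0.0, min(1.0, t))
    return {
        "ir_correction": t,                  # how strongly to blend in IR data
        "color_correction": 1.0 - 0.5 * t,   # weaken color correction as IR dominates
        "color_saturation": 1.0 - t,         # desaturate toward gray at night
    }

day = correction_factors(0.05)    # below day_threshold: no IR correction
night = correction_factors(0.9)   # above night_threshold: full IR correction
```

A daylight scene yields no IR correction and full saturation, while a night scene yields full IR correction and complete desaturation, matching the abstract's goal of adapting the transform to current light levels.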