18461292. CHANNEL-WISE AUTOREGRESSIVE ENTROPY MODELS FOR IMAGE COMPRESSION simplified abstract (GOOGLE LLC)

From WikiPatents

CHANNEL-WISE AUTOREGRESSIVE ENTROPY MODELS FOR IMAGE COMPRESSION

Organization Name

GOOGLE LLC

Inventor(s)

David Charles Minnen of Mountain View CA (US)

Saurabh Singh of Mountain View CA (US)

CHANNEL-WISE AUTOREGRESSIVE ENTROPY MODELS FOR IMAGE COMPRESSION - A simplified explanation of the abstract

This abstract first appeared for US patent application 18461292, titled 'CHANNEL-WISE AUTOREGRESSIVE ENTROPY MODELS FOR IMAGE COMPRESSION'.

Simplified Explanation

The patent application describes methods, systems, and apparatus for channel-wise autoregressive entropy models, which process data and generate compressed representations of it. Here are the key points:

  • A first encoder neural network generates a latent representation of the data.
  • The latent representation is processed by a quantizer, producing a quantized latent representation, and by a second encoder neural network, producing a latent representation of an entropy model.
  • The quantized latent representation is split into multiple slices, arranged in an ordinal sequence.
  • A hyperprior processing network generates hyperprior parameters and a compressed representation of those parameters.
  • For each slice, a corresponding slice processing network generates a corresponding compressed representation.
  • The combination of these compressed representations forms the compressed representation of the data.
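The slicing-and-prediction flow above can be illustrated with a minimal Python sketch. This is not the patented method: the neural networks and entropy coder are replaced with trivial stand-ins (rounding for the quantizer, a running mean for the slice processing networks), and all function names are hypothetical. It shows only the core idea that each slice is coded conditioned on the previously decoded slices, and that concatenating the per-slice codes yields the compressed representation.

```python
def quantize(values):
    """Round-to-nearest stands in for the quantizer."""
    return [round(v) for v in values]

def slice_channels(latent, num_slices):
    """Split a flat latent into an ordinal sequence of equal slices."""
    step = len(latent) // num_slices
    return [latent[i * step:(i + 1) * step] for i in range(num_slices)]

def predict_mean(decoded_slices):
    """Stand-in for a slice processing network: predict each symbol
    from the mean of all previously decoded slices."""
    if not decoded_slices:
        return 0.0
    values = [v for s in decoded_slices for v in s]
    return sum(values) / len(values)

def compress(latent, num_slices=4):
    """Encode each slice as quantized residuals against the
    channel-wise autoregressive prediction; the list of per-slice
    codes plays the role of the compressed representation."""
    slices = slice_channels(latent, num_slices)
    decoded, codes = [], []
    for s in slices:
        mu = predict_mean(decoded)
        residual = quantize([v - mu for v in s])
        codes.append(residual)
        # Reconstruct exactly as the decoder will, so encoder and
        # decoder predictions stay in sync.
        decoded.append([r + mu for r in residual])
    return codes

def decompress(codes):
    """Replay the same predictions to invert compress()."""
    decoded = []
    for residual in codes:
        mu = predict_mean(decoded)
        decoded.append([r + mu for r in residual])
    return [v for s in decoded for v in s]
```

Because the decoder recomputes the same per-slice predictions from already-decoded slices, no side information beyond the residual codes is needed; in the patented scheme, learned networks and the hyperprior parameters play that predictive role instead of the running mean.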

Potential applications of this technology:

  • Data compression: The methods described can be used to compress data, reducing its size while preserving important information.
  • Image and video compression: This technology can be applied to compress images and videos, enabling efficient storage and transmission.
  • Data transmission: Compressed representations can be transmitted over networks more quickly and with less bandwidth usage.

Problems solved by this technology:

  • Efficient compression: The methods described provide a way to compress data effectively, reducing storage and transmission requirements.
  • Preserving important information: The techniques used ensure that important information is retained in the compressed representation.

Benefits of this technology:

  • Reduced storage requirements: Compressed representations take up less space, allowing for more efficient storage of large amounts of data.
  • Faster data transmission: Compressed representations can be transmitted more quickly over networks, reducing latency and improving overall performance.
  • Preserved data quality: Despite compression, the important information in the data is retained, ensuring minimal loss of quality.


Original Abstract Submitted

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for channel-wise autoregressive entropy models. In one aspect, a method includes processing data using a first encoder neural network to generate a latent representation of the data. The latent representation of data is processed by a quantizer and a second encoder neural network to generate a quantized latent representation of data and a latent representation of an entropy model. The latent representation of data is further processed into a plurality of slices of quantized latent representations of data wherein the slices are arranged in an ordinal sequence. A hyperprior processing network generates a hyperprior parameters and a compressed representation of the hyperprior parameters. For each slice, a corresponding compressed representation is generated using a corresponding slice processing network wherein a combination of the compressed representations form a compressed representation of the data.