18063804. Automatic Memory Management for Compute Graphs simplified abstract (Google LLC)

From WikiPatents

Automatic Memory Management for Compute Graphs

Organization Name

Google LLC

Inventor(s)

Ashish Saxena of Ilford (GB)

Vinsensius B. Vega S. Naryanto of Zurich (CH)

Matej Rizman of London (GB)

Pavel Shmakov of London (GB)

Juan Antonio Navarro Perez of London (GB)

Cyril Chimisov of London (GB)

Automatic Memory Management for Compute Graphs - A simplified explanation of the abstract

This abstract first appeared for US patent application 18063804, titled 'Automatic Memory Management for Compute Graphs'.

The method described in the abstract involves obtaining a compute graph for computing a first tensor, identifying a reduction operation in at least one dimension of that tensor, locating a cut point at the reduction operation that divides the graph into first and second portions, and determining multiple slices of the tensor. The cut point is then backpropagated through the graph to define first graph pieces, each computing one slice of the first tensor from a corresponding portion of a second tensor; one or more second graph pieces combine the outputs of the first pieces, and executing all of the pieces executes the first portion of the compute graph.

  • The method obtains a compute graph for computing a first tensor.
  • It identifies a reduction operation in at least one dimension of the tensor.
  • A cut point is located at the reduction operation, dividing the graph into first and second portions.
  • Multiple slices of the first tensor are determined for processing.
  • The cut point is backpropagated through the graph to define first graph pieces for the first portion, one per slice.
  • One or more second graph pieces are defined to combine the outputs of the first graph pieces.
  • Executing the first and second graph pieces executes the first portion of the compute graph.
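The slicing-and-combining idea in the steps above can be illustrated with a minimal sketch (the function name `sliced_sum`, the slice count, and the use of a sum reduction are illustrative assumptions, not details from the patent): per-slice partial reductions play the role of the first graph pieces, and a final combining step plays the role of the second graph piece.

```python
import numpy as np

def sliced_sum(x, axis=0, num_slices=4):
    """Illustrative sketch: split a sum reduction along `axis` into
    per-slice partial sums ("first graph pieces") and combine them
    with a final addition ("second graph piece")."""
    slices = np.array_split(x, num_slices, axis=axis)
    # First graph pieces: each computes the reduction for one slice only,
    # so only that slice needs to be materialized at a time.
    partials = [s.sum(axis=axis) for s in slices]
    # Second graph piece: combine the outputs of the first pieces.
    return np.sum(partials, axis=0)

x = np.arange(24.0).reshape(6, 4)
assert np.allclose(sliced_sum(x, axis=0), x.sum(axis=0))
```

Because each first piece touches only one slice of the input, peak memory is bounded by the slice size rather than the full tensor, which is the motivation suggested by the patent's title.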

Potential Applications:
  • Efficient tensor computation in machine learning workloads.
  • Improved performance of neural networks through optimized graph processing.

Problems Solved:
  • Streamlining the computation of tensors in complex compute graphs.
  • Improving the efficiency of evaluating large reductions by splitting them into slices.

Benefits:
  • Faster computation of tensors in complex graphs.
  • Enhanced performance in neural network operations.

Commercial Applications:
This technology can be utilized in industries such as:
  • Artificial intelligence
  • Data analytics
  • Image and speech recognition systems

Questions about the technology:
  1. How does this method improve the efficiency of tensor computations in neural networks?
  2. What are the advantages of backpropagating the cut point through the compute graph?


Original Abstract Submitted

A method includes obtaining a compute graph for computing a first tensor, identifying in the graph a reduction operation in at least one dimension of the first tensor, locating, at the operation, a cut point that cuts the graph into first and second portions, and determining a plurality of slices of the first tensor. The method also includes backpropagating the cut point through the graph to define a plurality of first graph pieces for the first portion, each particular first graph piece representing a computation of a particular slice of the plurality of slices based on a particular portion of a plurality of portions of a second tensor. The method further includes defining one or more second graph pieces to combine outputs of the first graph pieces, and executing the first graph pieces and the second graph pieces to execute the first portion of the compute graph.
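A hedged sketch of how the abstract's pieces might fit together on a concrete graph (the graph `relu(w @ x).sum(axis=1)`, the names `run_in_pieces`, `w`, and `x`, and the slice count are all hypothetical choices for illustration): the sum is the reduction where the cut point is placed, each first graph piece computes one slice of the first tensor from one portion of the second tensor `x`, and a second graph piece adds the partial results.

```python
import numpy as np

# Hypothetical compute graph for a first tensor: y = relu(w @ x).sum(axis=1).
# The sum over axis 1 is the reduction; a cut point there splits the graph.
# "Backpropagating" the cut point yields first graph pieces that each compute
# relu(w @ x_slice).sum(axis=1) from one portion of the second tensor x,
# and a second graph piece that adds the partial results together.

def run_in_pieces(w, x, num_slices=3):
    x_slices = np.array_split(x, num_slices, axis=1)    # portions of x
    firsts = [np.maximum(w @ xs, 0).sum(axis=1)         # first graph pieces
              for xs in x_slices]
    return np.add.reduce(firsts)                        # second graph piece

w = np.random.default_rng(0).normal(size=(2, 5))
x = np.random.default_rng(1).normal(size=(5, 9))
# Slicing does not change the result, only how much of x is live at once.
assert np.allclose(run_in_pieces(w, x), np.maximum(w @ x, 0).sum(axis=1))
```

This decomposition is valid here because both the matrix product and the elementwise relu act column-by-column on `x`, so the sum over columns can be accumulated one portion at a time.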