17391718. FOLDING COLUMN ADDER ARCHITECTURE FOR DIGITAL COMPUTE IN MEMORY simplified abstract (QUALCOMM Incorporated)

Organization Name

QUALCOMM Incorporated

Inventor(s)

Mustafa Badaroglu of Leuven (BE)

Zhongze Wang of San Diego CA (US)

FOLDING COLUMN ADDER ARCHITECTURE FOR DIGITAL COMPUTE IN MEMORY - A simplified explanation of the abstract

This abstract first appeared for US patent application 17391718, titled 'FOLDING COLUMN ADDER ARCHITECTURE FOR DIGITAL COMPUTE IN MEMORY'.

Simplified Explanation

The abstract describes an apparatus for performing machine learning tasks using computation-in-memory architectures. The circuit includes multiple memory cells on each column of a memory, located on different word-lines, which store bits representing the weights of a neural network. An addition circuit is coupled to each column; a first adder circuit combines the outputs of at least two of these addition circuits, and an accumulator processes the output of the first adder circuit.

  • The apparatus performs machine learning tasks using computation-in-memory architectures.
  • Memory cells on each column, spanning different word-lines, store the weight bits of a neural network.
  • An addition circuit coupled to each column sums the values read from that column's cells.
  • A first adder circuit combines the outputs of at least two addition circuits, and an accumulator totals the adder's output.
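The dataflow described above can be sketched in software. This is a hypothetical model, not the patent's implementation: function and variable names are illustrative, and each column's addition circuit is assumed to sum the bitwise products of binary input activations and the weight bits stored on that column's word-lines.

```python
def column_addition(inputs, column_bits):
    """Addition circuit for one column: sum of bitwise products
    between word-line inputs and the stored weight bits."""
    return sum(x & w for x, w in zip(inputs, column_bits))

def in_memory_mac(inputs, columns):
    """Model of the folding structure: per-column addition circuits,
    a first adder combining pairs of column outputs, and an
    accumulator totaling the adder results."""
    column_sums = [column_addition(inputs, col) for col in columns]
    accumulator = 0
    # First adder circuit: combine outputs of two addition circuits
    # at a time, then accumulate.
    for i in range(0, len(column_sums), 2):
        pair_sum = sum(column_sums[i:i + 2])
        accumulator += pair_sum
    return accumulator

# Example: 4 word-lines (rows) and 2 columns of 1-bit weights.
inputs = [1, 0, 1, 1]            # binary activations on the word-lines
columns = [[1, 1, 0, 1],         # weight bits stored in column 0
           [0, 1, 1, 0]]         # weight bits stored in column 1
print(in_memory_mac(inputs, columns))  # → 3
```

Sharing one adder across pairs of columns in this way is a simple reading of "folding": fewer adder circuits are needed than columns, at the cost of sequential accumulation.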

Potential Applications

  • Machine learning tasks
  • Neural network training and inference
  • Pattern recognition
  • Data analysis and processing

Problems Solved

  • Efficient computation-in-memory for machine learning tasks
  • Reducing data movement and energy consumption
  • Accelerating neural network training and inference
  • Enabling real-time and low-latency processing

Benefits

  • Improved performance and efficiency in machine learning tasks
  • Reduced data movement and energy consumption
  • Faster neural network training and inference
  • Real-time and low-latency processing capabilities


Original Abstract Submitted

Certain aspects provide an apparatus for performing machine learning tasks, and in particular, to computation-in-memory architectures. One aspect provides a circuit for in-memory computation. The circuit generally includes: a plurality of memory cells on each of multiple columns of a memory, the plurality of memory cells being configured to store multiple bits representing weights of a neural network, wherein the plurality of memory cells on each of the multiple columns are on different word-lines of the memory; multiple addition circuits, each coupled to a respective one of the multiple columns; a first adder circuit coupled to outputs of at least two of the multiple addition circuits; and an accumulator coupled to an output of the first adder circuit.
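Since the abstract says the cells store "multiple bits representing weights," one plausible reading is that adjacent columns hold different significance bits of multi-bit weights, with the accumulator applying a binary shift per bit position. The sketch below illustrates that assumption; it is an interpretation, not the claimed implementation.

```python
def popcount_column(inputs, bits):
    """Per-column addition circuit: sum of input-weight-bit products."""
    return sum(x & b for x, b in zip(inputs, bits))

def weighted_accumulate(inputs, bit_columns):
    """Accumulator model for multi-bit weights: bit_columns[k] holds
    bit k (LSB first) of every weight, so each column sum is shifted
    by its significance before accumulation."""
    acc = 0
    for k, col in enumerate(bit_columns):
        acc += popcount_column(inputs, col) << k
    return acc

# Dot product of binary inputs with 2-bit weights [3, 1, 2, 0]:
inputs = [1, 1, 0, 1]
lsb = [1, 1, 0, 0]   # bit 0 of each weight
msb = [1, 0, 1, 0]   # bit 1 of each weight
print(weighted_accumulate(inputs, [lsb, msb]))  # 1*3 + 1*1 + 0*2 + 1*0 = 4
```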