17900018. PROCESSING-IN-MEMORY SYSTEM WITH DEEP LEARNING ACCELERATOR FOR ARTIFICIAL INTELLIGENCE simplified abstract (Micron Technology, Inc.)

From WikiPatents

PROCESSING-IN-MEMORY SYSTEM WITH DEEP LEARNING ACCELERATOR FOR ARTIFICIAL INTELLIGENCE

Organization Name

Micron Technology, Inc.

Inventor(s)

Xinyu Wu of Boise ID (US)

Timothy Paul Finkbeiner of Boise ID (US)

Peter Lawrence Brown of Eagle ID (US)

Troy Dale Larsen of Meridian ID (US)

Glen Earl Hush of Boise ID (US)

Troy Allen Manning of Meridian ID (US)

PROCESSING-IN-MEMORY SYSTEM WITH DEEP LEARNING ACCELERATOR FOR ARTIFICIAL INTELLIGENCE - A simplified explanation of the abstract

This abstract first appeared for US patent application 17900018 titled 'PROCESSING-IN-MEMORY SYSTEM WITH DEEP LEARNING ACCELERATOR FOR ARTIFICIAL INTELLIGENCE'.

Simplified Explanation

The abstract describes an artificial intelligence system that uses a memory device to process image data from a camera with a neural network; the memory device contains DRAM, SRAM, and a processor that runs the neural network.

  • Memory device used for providing inference results in an artificial intelligence system
  • Image data from a camera is stored in the memory device
  • Memory device includes DRAM, SRAM, and a processor for running a neural network
  • Neural network processes the image data to provide an inference result
  • Memory device has the same form factor as a conventional DRAM device
  • Memory device includes a MAC engine for neural network computations
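The data flow above centers on the multiply-accumulate (MAC) operation that the patent's MAC engine performs for the neural network. The following is a minimal illustrative sketch of that operation in Python; all function names and values here are hypothetical and are not taken from the patent, which does not specify an API.

```python
# Illustrative sketch of the multiply-accumulate (MAC) step at the heart of
# neural-network inference. A hardware MAC engine performs many of these
# operations in parallel; this sequential version only shows the arithmetic.

def mac(inputs, weights, acc=0.0):
    """Multiply-accumulate: acc += sum of input * weight products."""
    for x, w in zip(inputs, weights):
        acc += x * w
    return acc

def dense_layer(image_vector, weight_rows, biases):
    """One fully connected layer: each output neuron is a MAC over the input,
    seeded with that neuron's bias."""
    return [mac(image_vector, row, b) for row, b in zip(weight_rows, biases)]

# Toy stand-ins for camera image data and trained model parameters.
pixels = [0.1, 0.5, 0.9]
weights = [[1.0, 0.0, -1.0], [0.5, 0.5, 0.5]]
biases = [0.0, 0.1]
print(dense_layer(pixels, weights, biases))  # two neuron outputs
```

In a processing-in-memory device, the point is that these MAC operations run inside the memory package, next to the stored image data, rather than after moving the data across a bus to a separate processor.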
Potential Applications

  • Artificial intelligence systems
  • Image recognition and processing
  • Autonomous vehicles
  • Robotics

Problems Solved

  • Efficient processing of image data
  • Integration of memory and processing capabilities in a single device
  • Improved performance of neural networks

Benefits

  • Faster inference results
  • Reduced latency in processing image data
  • Compact form factor for memory device
  • Enhanced efficiency in neural network computations


Original Abstract Submitted

Systems, methods, and apparatus related to memory devices. In one approach, an artificial intelligence system uses a memory device to provide inference results. Image data from a camera is provided to the memory device. The memory device stores the image data received from the camera. The memory device includes dynamic random access memory (DRAM), and static random access memory (SRAM). The memory device also includes a processor to run a neural network. The neural network uses the image data as input. An output from the neural network provides an inference result. In one example, the memory device has a same form factor as a conventional DRAM device. The memory device includes a multiply-accumulate (MAC) engine that supports computations for the neural network.