Samsung Electronics Co., Ltd. (20240220786). NEURAL NETWORK ACCELERATOR simplified abstract


NEURAL NETWORK ACCELERATOR

Organization Name

Samsung Electronics Co., Ltd.

Inventor(s)

Sungju Ryu of Busan (KR)

Hyungjun Kim of Pohang-si (KR)

Jae-Joon Kim of Pohang-si (KR)

NEURAL NETWORK ACCELERATOR - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240220786, titled 'NEURAL NETWORK ACCELERATOR'.

The abstract describes a neural network accelerator that combines bit operators, an adder, a shifter, and an accumulator to process input feature data and weight data and produce output feature data, as sketched in the example after the list below.

  • The first bit operator multiplies first feature bits of the input feature data by first weight bits of the weight data.
  • The second bit operator multiplies second feature bits of the input feature data by second weight bits of the weight data.
  • The adder sums the two partial multiplication results.
  • The shifter shifts the addition result by a number of digits determined by a shift value.
  • The accumulator accumulates the shifted addition results to produce the output feature data.
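
As a rough illustration only, and not the hardware design claimed in the application, the data path above can be modeled in a few lines of Python. The 8-bit operand width, 4-bit slice width, and every function and variable name below are assumptions made for this sketch.

```python
# A minimal software sketch of the bit-sliced multiply / add / shift /
# accumulate data path described above. The 8-bit operand width, 4-bit
# slice width, and all names here are illustrative assumptions, not
# details taken from the patent application.

SLICE_BITS = 4  # assumed width of each bit slice


def split_bits(value: int) -> tuple[int, int]:
    """Split an 8-bit unsigned operand into high and low 4-bit slices."""
    return value >> SLICE_BITS, value & ((1 << SLICE_BITS) - 1)


def bit_sliced_multiply(x: int, w: int) -> int:
    """Multiply two 8-bit operands using 4-bit slice multipliers,
    an adder, a shifter, and an accumulator."""
    x_hi, x_lo = split_bits(x)
    w_hi, w_lo = split_bits(w)

    acc = 0  # accumulator register
    # Each pass pairs up to two slice products (the "bit operators")
    # with the shift value that restores their significance.
    passes = [
        (x_hi * w_hi, 0,           2 * SLICE_BITS),
        (x_hi * w_lo, x_lo * w_hi, SLICE_BITS),
        (x_lo * w_lo, 0,           0),
    ]
    for first, second, shift in passes:
        added = first + second     # adder combines the partial products
        shifted = added << shift   # shifter aligns the sum
        acc += shifted             # accumulator builds the output value
    return acc


if __name__ == "__main__":
    x, w = 0xB7, 0x3C              # 183 and 60
    result = bit_sliced_multiply(x, w)
    assert result == x * w         # the full product (10980) is recovered
    print(result)
```

In this sketch the slice products stand in for the bit operators, the sum of two products for the adder, the left shift for the shifter, and the running sum for the accumulator; a real accelerator would also accumulate across many feature/weight pairs rather than a single multiplication.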

Potential Applications

  • Accelerating neural network operations in applications such as image recognition, natural language processing, and autonomous driving systems.

Problems Solved

  • Enhancing the efficiency and speed of neural network computations.
  • Improving the performance of machine learning models in real-time applications.

Benefits

  • Faster processing of neural network operations.
  • Increased accuracy and reliability of machine learning algorithms.
  • Reduced power consumption in neural network computations.

Commercial Applications

  • This technology can be used in hardware accelerators for data centers, edge computing devices, and IoT devices to improve the performance of AI applications.

Questions about Neural Network Accelerator

  1. How does the neural network accelerator improve the efficiency of machine learning models?
  2. What are the potential implications of using this technology in autonomous driving systems?

Frequently Updated Research

  • Stay updated on the latest advancements in neural network accelerators and hardware acceleration technologies to enhance the performance of AI applications.


Original Abstract Submitted

Disclosed is a neural network accelerator including a first bit operator generating a first multiplication result by performing multiplication on first feature bits of input feature data and first weight bits of weight data, a second bit operator generating a second multiplication result by performing multiplication on second feature bits of the input feature data and second weight bits of the weight data, an adder generating an addition result by performing addition based on the first multiplication result and the second multiplication result, a shifter shifting a number of digits of the addition result depending on a shift value to generate a shifted addition result, and an accumulator generating output feature data based on the shifted addition result.
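
As a worked example of the arithmetic the abstract describes, assuming purely for illustration that 8-bit operands are split into 4-bit high and low slices, the product of a feature value x and a weight value w decomposes into slice products that are added, shifted, and accumulated:

```latex
x \cdot w = (x_H 2^{4} + x_L)(w_H 2^{4} + w_L)
          = x_H w_H \, 2^{8} + (x_H w_L + x_L w_H)\, 2^{4} + x_L w_L
```

Here x_H·w_L and x_L·w_H play the roles of the first and second multiplication results, their sum is the addition result, the powers of two correspond to shift values of 8, 4, and 0 digits, and summing the aligned terms is the accumulation that yields the output feature value. The 4-bit slice width is an assumption for this example, not a figure given in the abstract.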