17931538. LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR simplified abstract (INTERNATIONAL BUSINESS MACHINES CORPORATION)
Contents
- 1 LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Potential Applications
- 1.6 Problems Solved
- 1.7 Benefits
- 1.8 Potential Commercial Applications
- 1.9 Original Abstract Submitted
LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR
Organization Name
INTERNATIONAL BUSINESS MACHINES CORPORATION
Inventor(s)
Geoffrey Burr of Cupertino CA (US)
Malte Johannes Rasch of Chappaqua NY (US)
LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR - A simplified explanation of the abstract
This abstract first appeared for US patent application 17931538 titled 'LEARNED COLUMN-WEIGHTS FOR RAPID-ESTIMATION OF PROPERTIES OF AN ENTIRE EXCITATION VECTOR'.
Simplified Explanation
The method described in the patent application uses a predicted representation of certain scalar values in an artificial neural network, avoiding the computation needed to derive an exact representation of those values from the output data vector.
- An incoming excitation vector is received at a neural network weight layer.
- The artificial neural network requires one or more scalar values (such as a mean or a standard deviation) to be computed across an output data vector.
- During forward inference, a predicted representation of those scalar values is used to apply operations to the output data vector.
- This avoids the computation needed to derive an exact representation of the scalar values from the output data vector.
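As an illustrative sketch (not the patent's actual implementation): for a linear weight layer, the mean of the output vector happens to be a linear function of the incoming excitation, so a single column-weight vector can reproduce it without any reduction over the output. The names `w_mean`, `d_in`, and `d_out` below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 8, 6
W = rng.normal(size=(d_in, d_out))   # neural network weight layer
x = rng.normal(size=d_in)            # incoming excitation vector

# Column-weight vector for the mean: averaging the columns of W gives a
# vector whose dot product with x equals mean(x @ W). In a deployed
# system such weights could also be learned offline alongside the layer.
w_mean = W.mean(axis=1)

y = x @ W                        # output data vector (for comparison only)
predicted_mean = x @ w_mean      # O(d_in) estimate, no reduction over y
exact_mean = y.mean()            # the two agree to floating-point precision
```

Here the prediction is exact because the mean is linear in the excitation; statistics like the standard deviation are not, which is where a learned approximation earns its keep.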
Potential Applications
This technology could be applied in fields such as image recognition, natural language processing, and financial forecasting, where artificial neural networks are used for data analysis and decision-making.
Problems Solved
This innovation reduces computational complexity and improves the efficiency of artificial neural networks by substituting predicted representations of scalar values for exactly computed ones, saving time and resources.
Benefits
- Faster inference in artificial neural networks.
- Reduction in computational resources required.
- Improved performance and accuracy of neural network operations.
Potential Commercial Applications
Optimizing Neural Network Operations for Efficiency and Speed
Original Abstract Submitted
A method includes receiving, at a neural network weight layer of an artificial neural network, an incoming excitation vector. The artificial neural network includes one or more operations requiring one or more scalar values, such as a mean or a standard deviation, to be computed across an output data vector of the artificial neural network. The method further includes using a predicted representation of the one or more scalar values during forward inference of the artificial neural network by the incoming excitation vector to apply the one or more operations to the output data vector, thus avoiding any computation needed to compute an exact representation of the one or more scalar values from the output data vector.
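A nonlinear statistic such as the standard deviation cannot be recovered by a single dot product, so its predictor must be learned. The following is a minimal sketch, assuming a least-squares fit of column weights against the magnitudes of sample excitations; the patent does not specify this training procedure, and `w_std` and the training setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d_in, d_out = 16, 32
W = rng.normal(size=(d_in, d_out))   # neural network weight layer

# Offline "training" (hypothetical): fit column weights w_std so that
# |x| @ w_std approximates std(x @ W), via least squares over a batch of
# sample excitation vectors.
X_train = rng.normal(size=(512, d_in))
targets = (X_train @ W).std(axis=1)
w_std, *_ = np.linalg.lstsq(np.abs(X_train), targets, rcond=None)

# Forward inference: estimate the standard deviation directly from the
# incoming excitation, without reducing over the output data vector.
x = rng.normal(size=d_in)
predicted_std = np.abs(x) @ w_std
exact_std = (x @ W).std()
```

The predicted value can then stand in for the exact one when applying an operation such as a normalization `(y - mean) / std`, trading a small approximation error for the cost of the reduction over the output vector.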