17806188. INTERPRETABLE NEURAL NETWORK ARCHITECTURE USING CONTINUED FRACTIONS simplified abstract (INTERNATIONAL BUSINESS MACHINES CORPORATION)



Organization Name

INTERNATIONAL BUSINESS MACHINES CORPORATION

Inventor(s)

Isha Puri of Chappaqua NY (US)

Amit Dhurandhar of Yorktown Heights NY (US)

Tejaswini Pedapati of White Plains NY (US)

Karthikeyan Shanmugam of Elmsford NY (US)

Dennis Wei of Sunnyvale CA (US)

Kush Raj Varshney of Ossining NY (US)

INTERPRETABLE NEURAL NETWORK ARCHITECTURE USING CONTINUED FRACTIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 17806188 titled 'INTERPRETABLE NEURAL NETWORK ARCHITECTURE USING CONTINUED FRACTIONS'.

Simplified Explanation

The abstract describes a method, a neural network, and a computer program product for training neural networks built on continued fractions architectures. Input data is fed into the network and passed through a stack of continued fractions layers to generate output data. Each layer computes one or more linear functions of its input and hands its output to the next layer.

  • Neural networks are trained using a continued fractions architecture.
  • The raw input data is supplied to every layer, together with the output of the previous layer.
  • Each layer computes one or more linear functions of its input and passes its output on as the next layer's input (see the formula sketched after this list).
  • The final layer's output becomes the network's output.
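
Taken together, these steps evaluate a continued fraction whose terms are linear in the input. A plausible rendering in standard notation (the notation is mine, not the patent's; $\mathbf{w}_i$ and $b_i$ denote the $i$-th layer's learned weights and bias) is:

$$
f(\mathbf{x}) = \mathbf{w}_0^{\top}\mathbf{x} + b_0 + \cfrac{1}{\mathbf{w}_1^{\top}\mathbf{x} + b_1 + \cfrac{1}{\mathbf{w}_2^{\top}\mathbf{x} + b_2 + \cdots}}
$$

Because every parameter appears inside a linear term, the weights can be read off level by level, which is presumably what makes the architecture "interpretable" in the sense of the title.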

Potential Applications

  • This technology can be applied in fields where neural networks are already used, such as image recognition, natural language processing, and financial forecasting.
  • It can improve the performance and accuracy of neural networks in these applications while keeping the resulting models interpretable.

Problems Solved

  • Continued fractions architectures provide a new approach to building and training neural networks.
  • The method allows neural networks to be trained more efficiently and effectively.
  • It addresses the challenge of improving neural network performance and accuracy while keeping the model interpretable.

Benefits

  • Continued fractions architectures can improve the performance and accuracy of neural networks.
  • Training with this method can yield faster convergence and better generalization.
  • It offers a novel, interpretable approach to neural network design, expanding the range of possible applications.


Original Abstract Submitted

A method, a neural network, and a computer program product are provided that provide training of neural networks with continued fractions architectures. The method includes receiving, as input to a neural network, input data and training the input data through a plurality of continued fractions layers of the neural network to generate output data. The input data is provided to each of the continued fractions layers as well as output data from a previous layer. The method further includes outputting, from the neural network, the output data. Each continued fractions layer of the continued fractions layers is configured to calculate one or more linear functions of its respective input and to generate an output that is used as the input for a subsequent continued fractions layer.
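
To make the layer mechanics concrete, below is a minimal PyTorch sketch of one way such a network could be assembled. It is an illustration under stated assumptions, not the patent's implementation: the class names (ContinuedFractionLadder, ContinuedFractionNet), the scalar-valued linear terms, the eps guard against near-zero denominators, and the linear head combining several ladders are all choices made here for the example.

```python
import torch
import torch.nn as nn

class ContinuedFractionLadder(nn.Module):
    """One continued fraction whose terms are linear functions of x.

    Evaluates f(x) = a_0(x) + 1 / (a_1(x) + 1 / (a_2(x) + ...)),
    where each a_i(x) = w_i . x + b_i is a learned linear term.
    """

    def __init__(self, in_features: int, depth: int, eps: float = 0.1):
        super().__init__()
        # One scalar-valued linear term per level of the fraction.
        self.terms = nn.ModuleList(nn.Linear(in_features, 1) for _ in range(depth))
        self.eps = eps  # guard against near-zero denominators (assumed, not from the patent)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Evaluate bottom-up, starting from the deepest linear term.
        value = self.terms[-1](x)
        for linear in reversed(self.terms[:-1]):
            # The raw input x feeds every layer; the previous layer's output
            # enters through the reciprocal, matching the abstract's "input
            # data is provided to each of the continued fractions layers as
            # well as output data from a previous layer".
            denom = torch.where(value.abs() < self.eps,
                                torch.full_like(value, self.eps), value)
            value = linear(x) + 1.0 / denom
        return value

class ContinuedFractionNet(nn.Module):
    """Several ladders combined by a linear head (an assumed design choice)."""

    def __init__(self, in_features: int, n_ladders: int, depth: int, n_outputs: int):
        super().__init__()
        self.ladders = nn.ModuleList(
            ContinuedFractionLadder(in_features, depth) for _ in range(n_ladders)
        )
        self.head = nn.Linear(n_ladders, n_outputs)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each ladder yields one scalar per example; concatenate and project.
        outs = torch.cat([ladder(x) for ladder in self.ladders], dim=-1)
        return self.head(outs)

# Usage: a batch of 8 ten-dimensional inputs mapped to 2 logits.
model = ContinuedFractionNet(in_features=10, n_ladders=4, depth=3, n_outputs=2)
logits = model(torch.randn(8, 10))
```

One design note on this sketch: evaluating the fraction bottom-up (deepest term first) keeps the forward pass a simple loop rather than a recursion, and because every learned weight multiplies the raw input directly, the trained parameters remain directly inspectable.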