18530552. FEDERATED TRAINING FOR A NEURAL NETWORK WITH REDUCED COMMUNICATION REQUIREMENT simplified abstract (Robert Bosch GmbH)

From WikiPatents

FEDERATED TRAINING FOR A NEURAL NETWORK WITH REDUCED COMMUNICATION REQUIREMENT

Organization Name

Robert Bosch GmbH

Inventor(s)

Andres Mauricio Munoz Delgado of SCHOENAICH (DE)

FEDERATED TRAINING FOR A NEURAL NETWORK WITH REDUCED COMMUNICATION REQUIREMENT - A simplified explanation of the abstract

This abstract first appeared for US patent application 18530552 titled 'FEDERATED TRAINING FOR A NEURAL NETWORK WITH REDUCED COMMUNICATION REQUIREMENT'.

The abstract describes a method for generating a training contribution for a neural network on a client node for federated training. The method involves receiving a set of parameters characterizing the neural network, supplying the network with training examples labeled with target outputs, evaluating deviations from the target outputs using a cost function, optimizing the parameters to improve the evaluation, selecting relevant parameters based on a criterion, ascertaining proposed changes for the selected parameters as the training contribution, and transmitting these changes to a server node.

  • The method involves optimizing parameters of a neural network for improved performance.
  • Training examples with target outputs are used to evaluate the network's behavior.
  • Relevant parameters are selected based on predefined criteria.
  • Proposed changes for selected parameters are ascertained as the training contribution.
  • The training contribution is then transmitted to a server node.
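The client-side steps above can be sketched in code. This is a minimal illustration, not the patented implementation: the toy linear model, the learning rate, the `top_fraction` parameter, and the magnitude-of-change selection criterion are all assumptions, since the abstract only specifies a "predefined criterion" for selecting the relevant parameters.

```python
import numpy as np

def client_training_contribution(params, examples, targets, lr=0.1,
                                 steps=100, top_fraction=0.01):
    """Return (indices, proposed_changes) for the most relevant parameters.

    params   : 1-D array, the complete parameter set received from the server
    examples : (N, D) training inputs
    targets  : (N,) labeled target outputs
    """
    w = params.copy()
    for _ in range(steps):
        # Forward pass of a toy linear model standing in for the network.
        outputs = examples @ w
        # Deviations from the targets, evaluated with a squared-error cost;
        # this is the gradient of that cost with respect to the parameters.
        grad = examples.T @ (outputs - targets) / len(targets)
        # Optimize the parameters to improve the cost evaluation.
        w -= lr * grad

    # Proposed changes relative to the parameter set that was received.
    delta = w - params
    # Select the "particularly relevant" parameters: here, those with the
    # largest proposed change in magnitude (an assumed criterion).
    k = max(1, int(top_fraction * len(delta)))
    idx = np.argsort(np.abs(delta))[-k:]
    # Only (index, change) pairs would be transmitted to the server node,
    # not the complete parameter set.
    return idx, delta[idx]
```

The server would then apply the received sparse changes to its copy of the parameters; that aggregation step is outside the scope of this abstract.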

Potential Applications: This method can be applied in various fields such as machine learning, artificial intelligence, and data analytics where neural networks are used for training and optimization tasks.

Problems Solved: This method addresses the challenge of efficiently optimizing neural network parameters for improved performance in a federated training setting.

Benefits: The method allows for the generation of training contributions that can enhance the overall performance of neural networks in federated training scenarios, leading to more accurate and efficient models.

Commercial Applications: Optimizing neural network parameters for federated training in machine learning applications, particularly where client-to-server bandwidth is limited.

Frequently Updated Research: Stay updated on the latest advancements in federated learning techniques and optimization algorithms for neural networks to further enhance training contributions and model performance.

Questions about Optimizing Neural Network Parameters for Federated Training:

  1. How does this method contribute to the efficiency of neural network training in federated settings?
  2. What are the key criteria used to select relevant parameters for optimization in this method?


Original Abstract Submitted

A method for generating a training contribution for a neural network on a client node for a federated training of the neural network. In the method, a complete set of parameters characterizing the behavior of the neural network is received; the parameterized neural network is supplied with training examples from a predefined set so that the neural network in each case delivers outputs, wherein the training examples are labeled with target outputs; deviations of the outputs from the respective target outputs are evaluated with a predefined cost function; the parameters of the neural network are optimized with the aim of improving the evaluation by the cost function; a set of particularly relevant parameters is selected based on a predefined criterion; for the selected parameters, proposed changes are ascertained as the sought training contribution based on the result of the optimization; the proposed changes are transmitted to a server node.