Huawei Technologies Co., Ltd. (20240320473). COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT simplified abstract
Contents
- 1 COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Key Features and Innovation
- 1.6 Potential Applications
- 1.7 Problems Solved
- 1.8 Benefits
- 1.9 Commercial Applications
- 1.10 Prior Art
- 1.11 Frequently Updated Research
- 1.12 Questions about Neural Network Training Security
- 1.13 Original Abstract Submitted
COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT
Organization Name
Huawei Technologies Co., Ltd.
Inventor(s)
COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT - A simplified explanation of the abstract
This abstract first appeared for US patent application 20240320473, titled 'COMMUNICATION METHOD AND APPARATUS, STORAGE MEDIUM, AND PROGRAM PRODUCT'.
Simplified Explanation
This patent application describes a method and apparatus for improving security in a neural network training process. A server sends training information (the previous round's global model and the identifiers of the terminals participating in the current round) to multiple terminals; each terminal returns a local model derived from that global model and a shared key; and the server aggregates the terminals' local models, using the shared keys between them, to obtain the updated global model for the current round.
Key Features and Innovation
- Server sends training information including a global model and identifiers of participating terminals.
- Terminals send local models based on the global model and a shared key.
- Server aggregates local models from multiple terminals to update the global model.
- Enhances security in neural network training by improving the data-aggregation process.
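The abstract does not spell out how the shared keys make aggregation secure, but the standard construction this wording suggests is pairwise additive masking: each pair of terminals expands its shared key into the same mask, one adds it and the other subtracts it, so the masks cancel when the server sums the local models. The following is a minimal Python sketch of one such round under that assumption; the function names, the hash-seeded PRG, and the key-distribution step are illustrative, not the patent's actual construction (a real deployment would use a cryptographic PRG and a key-agreement protocol).

```python
import hashlib
import random


def prg(shared_key: bytes, round_no: int, dim: int) -> list:
    """Deterministically expand a shared key into a mask vector.
    Both holders of the same key derive the identical mask."""
    seed = hashlib.sha256(shared_key + round_no.to_bytes(4, "big")).digest()
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]


def mask_local_model(terminal_id: int, local_model: list,
                     shared_keys: dict, round_no: int) -> list:
    """Terminal side: for each peer, add the pairwise mask if the peer's
    id is larger, subtract it if smaller. Over all terminals, every mask
    appears once with + and once with -, so the sum cancels them."""
    masked = list(local_model)
    for peer_id, key in shared_keys.items():
        sign = 1.0 if peer_id > terminal_id else -1.0
        for k, m in enumerate(prg(key, round_no, len(local_model))):
            masked[k] += sign * m
    return masked


def aggregate(masked_models: list) -> list:
    """Server side: average the masked models. The pairwise masks cancel,
    so the server learns only the aggregate, not any individual model."""
    n, dim = len(masked_models), len(masked_models[0])
    return [sum(m[k] for m in masked_models) / n for k in range(dim)]
```

In a full round, the server would first send the previous global model and the participant identifiers, each terminal would train locally and call `mask_local_model`, and the server would call `aggregate` on what it receives.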
Potential Applications
This technology can be applied in various fields such as:
- Machine learning
- Artificial intelligence
- Data security
Problems Solved
- Enhances security in neural network training processes.
- Improves data aggregation and model updating methods.
- Facilitates collaboration between multiple terminals in a training process.
Benefits
- Increased security in neural network training.
- Improved accuracy of global models.
- Enhanced collaboration between terminals.
Commercial Applications
This technology can be utilized in industries such as:
- Cybersecurity
- Data analytics
- Cloud computing
Prior Art
Readers can explore prior research on neural network security and data aggregation methods in the field of machine learning.
Frequently Updated Research
Stay updated on the latest advancements in neural network security and data aggregation techniques to enhance the effectiveness of this technology.
Questions about Neural Network Training Security
How does this technology improve security in neural network training processes?
This technology enhances security by having each terminal protect its local model using keys shared with the other participating terminals; the server can then aggregate the local models and update the global model without any individual terminal's model being exposed in the clear.
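The cancellation property behind that answer can be shown with just two terminals. Below is a small illustrative demo (the key value and vectors are made up): both terminals derive the same mask from their shared key, one adds it and the other subtracts it, so their sum reveals only the combined model.

```python
import hashlib
import random


def mask_from_key(key: bytes, dim: int) -> list:
    """Expand a shared key into a deterministic mask vector."""
    rng = random.Random(hashlib.sha256(key).digest())
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]


k_ab = b"key shared by terminals A and B"   # illustrative shared key
a = [0.5, -1.2]                             # terminal A's local model
b = [2.0, 0.3]                              # terminal B's local model

m = mask_from_key(k_ab, 2)
a_masked = [x + y for x, y in zip(a, m)]    # A adds the mask
b_masked = [x - y for x, y in zip(b, m)]    # B subtracts the same mask

# The server sums the masked models; the mask cancels,
# leaving a + b (up to float rounding) without revealing a or b.
total = [x + y for x, y in zip(a_masked, b_masked)]
```

With only two terminals the server could subtract one known contribution to recover the other, which is why practical schemes need more participants and dropout handling; this demo only illustrates the cancellation itself.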
What are the potential applications of this innovation beyond neural network training?
This technology can be applied in various fields such as machine learning, artificial intelligence, and data security to enhance collaboration and data security measures.
Original Abstract Submitted
this application discloses a communication method and apparatus, a storage medium, and a program product. the method includes: a server sends training information. the training information includes a global model in a previous round and identifiers of at least two second terminals that participate in a current round of training. a first terminal sends a local model of the first terminal. the local model is obtained based on the global model in the previous round and a shared key. the server receives local models of the at least two second terminals, and aggregates the local models of the at least two second terminals based on a shared key between the at least two second terminals, to obtain an updated global model in the current round. according to the solutions of this application, security in a neural network training process is improved.