18524852. PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS simplified abstract (NEC Corporation)
Contents
- 1 PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Potential Applications
- 1.6 Problems Solved
- 1.7 Benefits
- 1.8 Potential Commercial Applications
- 1.9 Possible Prior Art
- 1.10 Unanswered Questions
- 1.11 Original Abstract Submitted
PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS
Organization Name
NEC Corporation
Inventor(s)
Roberto Gonzales Sanchez of Heidelberg (DE)
Vittorio Prodomo of Leganes (ES)
Marco Gramaglia of Leganes (ES)
PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS - A simplified explanation of the abstract
This abstract first appeared for US patent application 18524852 titled 'PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS'.
Simplified Explanation
The abstract describes a method for training a shared machine learning (ML) model: a first entity generates a data transformation function, shares it with other entities, applies it to its own dataset to create a private dataset, receives private datasets created the same way by the other entities, and trains the ML model on the pooled private datasets.
- The method involves generating a data transformation function by a first entity.
- The first entity shares the data transformation function with one or more second entities.
- The first entity creates a private dataset by applying the data transformation function to its own dataset.
- The first entity receives private datasets from the second entities, which were created using the data transformation function.
- The ML model is trained using the private datasets to produce a trained ML model.
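The steps above can be sketched in code. The patent does not specify the data transformation function or the model, so this is a minimal sketch with assumed stand-ins: the shared transformation is a fixed random rotation of the feature space (which hides the raw feature values while preserving geometry), and the "trained ML model" is a toy nearest-centroid classifier. All function names here are illustrative, not from the application.

```python
import math
import random

def make_transform(seed=42):
    """First entity generates a data transformation function.
    Here: a 2-D rotation by a seeded random angle (an assumed stand-in)."""
    theta = random.Random(seed).uniform(0, 2 * math.pi)
    c, s = math.cos(theta), math.sin(theta)
    def transform(row):
        x, y = row
        return [c * x - s * y, s * x + c * y]
    return transform

def privatize(dataset, transform):
    """Each entity applies the shared transformation to its own dataset,
    producing a private dataset it can share."""
    return [(transform(x), y) for x, y in dataset]

def train_centroids(private_datasets):
    """Train a toy nearest-centroid model on the pooled private datasets."""
    sums, counts = {}, {}
    for ds in private_datasets:
        for x, y in ds:
            acc = sums.setdefault(y, [0.0] * len(x))
            for i, v in enumerate(x):
                acc[i] += v
            counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict(model, raw_row, transform):
    """Classify a new point by transforming it and finding the nearest centroid."""
    z = transform(raw_row)
    return min(model, key=lambda y: sum((a - b) ** 2 for a, b in zip(model[y], z)))

# 1. First entity generates the transformation and shares it.
transform = make_transform(seed=42)

# 2-3. Each entity privatizes its local data with the shared function.
first_private = privatize([([0.0, 0.0], "a"), ([0.2, 0.1], "a")], transform)
second_private = privatize([([5.0, 5.0], "b"), ([4.8, 5.2], "b")], transform)

# 4-5. First entity pools the private datasets and trains the model.
model = train_centroids([first_private, second_private])
print(predict(model, [0.1, 0.0], transform))  # prints "a"
```

Because only transformed datasets leave each entity, the raw feature values are never shared, while the model is still trained on data from all parties; any distance-preserving transformation would work the same way in this sketch.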
Potential Applications
This technology can be applied in collaborative machine learning projects where multiple entities need to train a shared model while keeping their data private.
Problems Solved
This method solves the problem of sharing data for training machine learning models without compromising the privacy of individual datasets.
Benefits
The benefits of this technology include improved data privacy, collaborative model training, and the ability to leverage diverse datasets for better ML model performance.
Potential Commercial Applications
One potential commercial application of this technology is in healthcare, where multiple hospitals can collaborate to train a shared ML model for disease diagnosis while keeping patient data private.
Possible Prior Art
Prior art in the field of federated learning and collaborative machine learning may exist, where similar techniques are used to train models on decentralized data sources.
Unanswered Questions
- The article does not detail the security measures that protect the shared data transformation function from unauthorized access or tampering.
- The article does not discuss the limitations or challenges that may arise when training a shared ML model on private datasets from different entities.
Original Abstract Submitted
Systems and method for training a shared machine learning (ML) model. A method includes generating, by a first entity, a data transformation function; sharing, by the first entity, the data transformation function with one or more second entities; creating a first private dataset, by the first entity, by applying the data transformation function to a first dataset of the first entity; receiving one or more second private datasets, by the first entity, from the one or more second entities, each second private dataset having been created by applying the data transformation function to a second dataset of the second entity; and training a machine learning (ML) model using the first private dataset and the one or more second private datasets to produce a trained ML model.