NEC Corporation (20240095602). PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS simplified abstract

From WikiPatents

PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS

Organization Name

NEC Corporation

Inventor(s)

Roberto Gonzales Sanchez of Heidelberg (DE)

Vittorio Prodomo of Leganes (ES)

Marco Gramaglia of Leganes (ES)

PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240095602, titled 'PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS'.

Simplified Explanation

The abstract describes a method for jointly training a shared machine learning model: a first entity generates a data transformation function, shares it with other entities, each entity creates a private dataset by applying the function to its own data, and the model is then trained on the pooled private datasets.

Explanation of the patent:

  • A first entity generates a data transformation function.
  • The function is shared with one or more second entities.
  • The first entity creates a private dataset by applying the function to its own dataset.
  • The first entity receives private datasets from the second entities, each created by applying the same shared function.
  • A machine learning model is trained on these private datasets to produce a trained model.
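The steps above can be sketched in code. The abstract does not say what the data transformation function is, so this sketch assumes a fixed, lossy random projection as a stand-in; `make_transform` and all dataset shapes are hypothetical, chosen only to make the workflow concrete.

```python
# Illustrative sketch of the claimed workflow. The transformation here is
# an ASSUMED lossy random projection; the patent does not specify one.
import numpy as np

def make_transform(in_dim, out_dim, seed=42):
    """First entity generates the shared data transformation function."""
    proj = np.random.default_rng(seed).normal(size=(in_dim, out_dim))
    return lambda X: X @ proj  # dimension-reducing linear projection

# Steps 1-2: the first entity creates the transform and shares it.
transform = make_transform(in_dim=10, out_dim=4)

# Steps 3-4: each entity applies the shared transform to its own raw data,
# producing "private" datasets that do not expose the raw records.
rng = np.random.default_rng(0)
first_private = transform(rng.normal(size=(50, 10)))   # first entity
second_private = transform(rng.normal(size=(60, 10)))  # a second entity

# Step 5: the first entity pools the private datasets and trains a model
# (ordinary least squares here, purely for illustration).
X = np.vstack([first_private, second_private])
y = rng.normal(size=X.shape[0])
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print(weights.shape)  # (4,)
```

Because the projection discards dimensions, the original records cannot be exactly recovered from the pooled data, which is the intuition behind training on transformed rather than raw datasets.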

Potential Applications

This technology can be applied in collaborative machine learning projects where multiple entities need to train a shared model while keeping their data private.

Problems Solved

This method allows entities to collaborate on training a machine learning model without sharing their raw data, addressing privacy concerns and enabling joint model training.

Benefits

  • Enhanced privacy protection for sensitive data
  • Efficient collaboration on machine learning projects
  • Improved model performance through diverse training data sources

Potential Commercial Applications

Collaborative machine learning model training for data privacy protection.

Possible Prior Art

Prior art in the field of federated learning, where models are trained across multiple decentralized devices while keeping data local, may be relevant to this patent application.
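For contrast, a minimal federated-averaging loop in pure Python: in federated learning it is the model updates, not transformed datasets, that cross entity boundaries. The toy 1-D least-squares task, client data, and all names here are illustrative, not taken from any particular system.

```python
# Minimal federated-averaging sketch (illustrative only): each client takes
# one gradient step on its local data, and the server averages the results.
def local_step(weights, data, lr=0.1):
    # one gradient step of 1-D least squares (y ~ w * x) on local data
    grad = sum(2 * (weights * x - y) * x for x, y in data) / len(data)
    return weights - lr * grad

def fed_avg(clients, rounds=50):
    w = 0.0
    for _ in range(rounds):
        w = sum(local_step(w, d) for d in clients) / len(clients)
    return w

# two clients whose local data both follow y = 3 * x
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (0.5, 1.5)]]
print(round(fed_avg(clients), 2))  # 3.0
```

The raw points never leave their client, only the locally updated weight does; the patent's approach instead centralizes data that has first been passed through a shared transformation.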

Unanswered Questions

How does this method handle data synchronization issues between entities during model training?

The abstract does not provide details on how data synchronization challenges are addressed when training the shared machine learning model.

What measures are in place to ensure the security of the shared data transformation function?

The abstract does not mention any specific security measures implemented to protect the shared data transformation function from unauthorized access or tampering.


Original Abstract Submitted

systems and method for training a shared machine learning (ml) model. a method includes generating, by a first entity, a data transformation function; sharing, by the first entity, the data transformation function with one or more second entities; creating a first private dataset, by the first entity, by applying the data transformation function to a first dataset of the first entity; receiving one or more second private datasets, by the first entity, from the one or more second entities, each second private dataset having been created by applying the data transformation function to a second dataset of the second entity; and training a machine learning (ml) model using the first private dataset and the one or more second private datasets to produce a trained ml model.