18522325. PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS simplified abstract (NEC Corporation)


PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS

Organization Name

NEC Corporation

Inventor(s)

Roberto Gonzalez Sanchez of Heidelberg (DE)

Vittorio Prodomo of Leganes (ES)

Marco Gramaglia of Leganes (ES)

PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18522325, titled 'PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS'.

Simplified Explanation

The abstract describes a method for jointly training a shared machine learning (ML) model: a first entity generates a data transformation function, shares it with other entities, applies it to its own data to create a private dataset, receives the private datasets that the other entities create in the same way, and trains the ML model on the combined private datasets.

Explanation:

  • A data transformation function is created by a first entity.
  • The first entity shares the function with one or more second entities.
  • Each entity creates a private dataset by applying the function to its original dataset.
  • The first entity receives the private datasets created by the second entities.
  • A machine learning model is trained on the first and second private datasets to produce a trained ML model (a minimal sketch follows this list).
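
To make the steps concrete, here is a minimal Python sketch. The abstract does not specify the actual transformation function, model, or data format, so a random projection matrix is assumed as a stand-in and all names and parameters below are illustrative only.

  import numpy as np

  def generate_transformation(n_features, n_components, seed=0):
      # First entity generates the data transformation function.
      # Here it is a fixed random projection matrix (an assumption, not the patent's choice).
      rng = np.random.default_rng(seed)
      return rng.normal(size=(n_features, n_components))

  def apply_transformation(dataset, projection):
      # Each entity applies the shared function to its original dataset,
      # producing the "private" dataset that is exchanged instead of the raw data.
      return dataset @ projection

  # The first entity creates the function and shares it with the second entities.
  projection = generate_transformation(n_features=10, n_components=5)

  # Each entity transforms its own data locally; only the transformed data is shared.
  first_private = apply_transformation(np.random.rand(100, 10), projection)
  second_private = apply_transformation(np.random.rand(80, 10), projection)

How much the private datasets still reveal about the original data depends entirely on the choice of transformation; the plain random projection here is used only to keep the sketch short.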

Potential Applications

This technology can be applied in collaborative machine learning projects where multiple entities need to train a shared model while keeping their original data private.

Problems Solved

This technology solves the problem of sharing data for training machine learning models without compromising the privacy of the original datasets.

Benefits

The benefits of this technology include improved collaboration in machine learning projects, increased data privacy, and the ability to train more accurate models using diverse datasets.

Potential Commercial Applications

One potential commercial application of this technology is in healthcare, where multiple hospitals can collaborate to train a shared model for disease diagnosis while keeping patient data private.

Possible Prior Art

One possible prior art reference for this technology is federated learning, in which models are trained across multiple devices without exchanging raw data.

Unanswered Questions

How does this technology handle data security and privacy concerns?

The abstract mentions sharing data transformation functions and private datasets, but it does not provide details on how data security and privacy are ensured during this process.

What types of machine learning models can be trained using this method?

The abstract mentions training a machine learning model, but it does not specify the types of models that can be trained using this method.


Original Abstract Submitted

Systems and method for training a shared machine learning (ML) model. A method includes generating, by a first entity, a data transformation function; sharing, by the first entity, the data transformation function with one or more second entities; creating a first private dataset, by the first entity, by applying the data transformation function to a first dataset of the first entity; receiving one or more second private datasets, by the first entity, from the one or more second entities, each second private dataset having been created by applying the data transformation function to a second dataset of the second entity; and training a machine learning (ML) model using the first private dataset and the one or more second private datasets to produce a trained ML model.
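
Taken end to end, the claimed flow could be prototyped roughly as follows. This is a sketch under assumptions: a random projection stands in for the unspecified data transformation function, scikit-learn's LogisticRegression stands in for the unspecified ML model, and labels are assumed to be exchanged alongside the transformed features, which the abstract does not address.

  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(1)

  # Hypothetical original datasets held by the first entity and one second entity.
  X1, y1 = rng.normal(size=(100, 10)), rng.integers(0, 2, size=100)
  X2, y2 = rng.normal(size=(80, 10)), rng.integers(0, 2, size=80)

  # Steps 1-2: the first entity generates the data transformation function and shares it.
  projection = rng.normal(size=(10, 5))

  # Steps 3-4: each entity creates its private dataset locally and shares only that.
  P1 = X1 @ projection   # first private dataset
  P2 = X2 @ projection   # second private dataset, received by the first entity

  # Step 5: the first entity trains the shared ML model on the pooled private datasets.
  X_train = np.vstack([P1, P2])
  y_train = np.concatenate([y1, y2])
  model = LogisticRegression(max_iter=1000).fit(X_train, y_train)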