NEC Corporation (20240095601). PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS simplified abstract

From WikiPatents

PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS

Organization Name

NEC Corporation

Inventor(s)

Roberto Gonzalez Sanchez of Heidelberg (DE)

Vittorio Prodomo of Leganés (ES)

Marco Gramaglia of Leganés (ES)

PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240095601, titled 'PRIVACY PRESERVING JOINT TRAINING OF MACHINE LEARNING MODELS'.

Simplified Explanation

The abstract describes a system and method for jointly training a shared machine learning (ML) model: a first entity generates a data transformation function, shares it with other entities, applies it to its own data to create a private dataset, receives similarly transformed private datasets from the other entities, and trains the ML model on the combined private data.

  • The first entity generates a data transformation function.
  • The first entity shares this function with one or more second entities.
  • The first entity creates a private dataset by applying the function to its own dataset.
  • The first entity receives private datasets from the other entities, each created by applying the same shared function.
  • The ML model is trained on the combined private datasets to produce a trained model.
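The steps above can be sketched in code. The specific transformation (a random linear projection) and the model (a minimal logistic regression) are illustrative assumptions; the patent abstract does not specify either.

```python
# Sketch of the joint-training flow, under assumed choices of
# transformation and model (neither is specified in the abstract).
import numpy as np

rng = np.random.default_rng(0)

def generate_transformation(in_dim, out_dim, seed=42):
    """Step 1: the first entity generates a data transformation function."""
    proj = np.random.default_rng(seed).normal(size=(in_dim, out_dim))
    return lambda X: X @ proj  # step 2: this function is what gets shared

transform = generate_transformation(in_dim=10, out_dim=5)

# Steps 3-4: each entity applies the shared function to its own raw data
# and only the transformed ("private") datasets are exchanged.
raw_a = rng.normal(size=(100, 10))   # first entity's raw dataset
raw_b = rng.normal(size=(80, 10))    # second entity's raw dataset
private_a = transform(raw_a)
private_b = transform(raw_b)

# Step 5: the first entity trains on the pooled private datasets.
X = np.vstack([private_a, private_b])
y = (X.sum(axis=1) > 0).astype(int)  # toy labels, purely for illustration

# Minimal logistic-regression fit via gradient descent.
w = np.zeros(X.shape[1])
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (p - y) / len(y)

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == y).mean()
```

Note that the raw datasets (`raw_a`, `raw_b`) never cross the trust boundary; only the transformation function and the transformed datasets do.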

Potential Applications

This technology can be applied in collaborative machine learning projects where multiple entities need to train a shared model using their private datasets.

Problems Solved

This technology solves the problem of securely training a shared ML model using private datasets from multiple entities without compromising data privacy.

Benefits

The benefits of this technology include improved collaboration in machine learning projects, enhanced data privacy protection, and more efficient model training using diverse datasets.

Potential Commercial Applications

A potential commercial application of this technology could be in the healthcare industry, where multiple hospitals or research institutions can collaborate to train a shared ML model for medical diagnosis while keeping patient data secure.

Possible Prior Art

One possible prior art for this technology could be federated learning, where multiple parties collaborate to train a shared model without sharing raw data.

Unanswered Questions

How does this technology ensure data privacy when sharing datasets among multiple entities?

This technology preserves data privacy by exchanging only transformed ("private") datasets, produced with the shared data transformation function, rather than raw data, allowing entities to train a shared model without exposing sensitive information.
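As an illustrative sketch (not from the patent), a dimensionality-reducing transformation is non-invertible: even if an adversary knows the shared function, the transformed rows do not determine the original records exactly, because infinitely many raw inputs map to the same output.

```python
# Assumed example transformation: a 10-D -> 4-D random projection.
# A least-squares "inversion" recovers only one of infinitely many
# preimages, not the original record.
import numpy as np

rng = np.random.default_rng(1)
proj = rng.normal(size=(10, 4))   # the shared transformation matrix

record = rng.normal(size=10)      # a sensitive raw record (never shared)
private = record @ proj           # what actually leaves the entity

# Solve proj.T @ x = private for x: underdetermined, so lstsq returns
# the minimum-norm solution, which almost surely differs from `record`.
recovered, *_ = np.linalg.lstsq(proj.T, private, rcond=None)
reconstruction_error = np.linalg.norm(recovered - record)
```

Whether such a transformation gives a formal privacy guarantee depends on the transformation chosen; this sketch only shows that exact recovery is underdetermined.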

What are the computational requirements for training a shared ML model using this method?

The computational requirements for training a shared ML model using this method may vary depending on the size and complexity of the datasets involved, as well as the number of entities collaborating in the training process.


Original Abstract Submitted

systems and method for training a shared machine learning (ml) model. a method includes generating, by a first entity, a data transformation function; sharing, by the first entity, the data transformation function with one or more second entities; creating a first private dataset, by the first entity, by applying the data transformation function to a first dataset of the first entity; receiving one or more second private datasets, by the first entity, from the one or more second entities, each second private dataset having been created by applying the data transformation function to a second dataset of the second entity; and training a machine learning (ml) model using the first private dataset and the one or more second private datasets to produce a trained ml model.