NEC Corporation (20240127116). FEDERATED LEARNING APPARATUS, SERVER APPARATUS, FEDERATED LEARNING SYSTEM, FEDERATED LEARNING METHOD, AND RECORDING MEDIUM simplified abstract

From WikiPatents

FEDERATED LEARNING APPARATUS, SERVER APPARATUS, FEDERATED LEARNING SYSTEM, FEDERATED LEARNING METHOD, AND RECORDING MEDIUM

Organization Name

NEC Corporation

Inventor(s)

Isamu Teranishi of Tokyo (JP)

FEDERATED LEARNING APPARATUS, SERVER APPARATUS, FEDERATED LEARNING SYSTEM, FEDERATED LEARNING METHOD, AND RECORDING MEDIUM - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240127116 titled 'FEDERATED LEARNING APPARATUS, SERVER APPARATUS, FEDERATED LEARNING SYSTEM, FEDERATED LEARNING METHOD, AND RECORDING MEDIUM'.

Simplified Explanation

The abstract describes a federated learning apparatus that trains a prediction model so that the information it generates is appropriate for its receiver. The apparatus comprises four sections: one that trains the model, one that transmits parameter information to a server apparatus, one that obtains integrated parameter information from the server apparatus, and one that updates the prediction model.

  • The training section trains a prediction model that predicts the evaluation value a user would assign to an evaluation target.
  • The parameter information transmitting section sends the model's parameter information to a server apparatus.
  • The parameter information obtaining section receives integrated parameter information from the server apparatus.
  • The updating section updates the prediction model by replacing its parameters with the integrated parameter information.
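The four sections above can be sketched as one client-side round of federated learning. Everything here is an illustrative assumption — the patent does not specify a model class or an integration rule, so this sketch uses a linear prediction model and a toy averaging server:

```python
def train_local_model(weights, ratings, attributes, lr=0.01, epochs=50):
    """Training section (illustrative): fit a linear model that predicts a
    user's evaluation value from the evaluation target's attribute values."""
    w = list(weights)
    n = len(ratings)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for x, y in zip(attributes, ratings):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            for j, xj in enumerate(x):
                grad[j] += err * xj / n          # gradient of mean squared error
        w = [wi - lr * gj for wi, gj in zip(w, grad)]
    return w


class ToyServer:
    """Stand-in for the server apparatus: integrates parameter information
    by simple averaging (the patent does not fix the integration rule)."""
    def __init__(self):
        self.received = []

    def receive(self, params):
        self.received.append(params)

    def integrated_parameters(self):
        k = len(self.received)
        return [sum(p[j] for p in self.received) / k
                for j in range(len(self.received[0]))]


def federated_round(local_weights, ratings, attributes, server):
    # 1. Training section: train the prediction model on local data only.
    params = train_local_model(local_weights, ratings, attributes)
    # 2. Parameter information transmitting section: send parameters, never raw data.
    server.receive(params)
    # 3. Parameter information obtaining section: fetch integrated parameters.
    integrated = server.integrated_parameters()
    # 4. Updating section: replace local parameters with the integrated ones.
    return integrated
```

Note that the server only ever sees parameter vectors, which is the privacy property the rest of this page discusses.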

Potential Applications

This technology could be applied in personalized recommendation systems, targeted advertising, and user behavior analysis.

Problems Solved

This technology solves the problem of privacy concerns in centralized machine learning systems, as it allows for model training without sharing raw data.

Benefits

The benefits of this technology include improved data privacy, enhanced model accuracy through collaborative learning, and reduced communication costs.

Potential Commercial Applications

Potential commercial applications include personalized marketing campaigns, customized product recommendations, and targeted content delivery.

Possible Prior Art

One possible prior art for this technology is collaborative filtering in recommendation systems, where a user's preferences are inferred from the preferences of similar users.
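Collaborative filtering, as referenced above, can be illustrated by a minimal user-based variant (this classic technique is shown for context only; the details are not from the patent):

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (math.sqrt(sum(a * a for a in u)) *
           math.sqrt(sum(b * b for b in v)))
    return num / den if den else 0.0

def predict_rating(target_user, other_users, item):
    """User-based collaborative filtering: predict target_user's rating of
    `item` as a similarity-weighted average of other users' known ratings.
    Unrated items are represented as None."""
    num = den = 0.0
    for ratings in other_users:
        if ratings[item] is None:
            continue  # this user cannot contribute a rating for `item`
        sim = cosine([r or 0.0 for r in target_user],
                     [r or 0.0 for r in ratings])
        num += sim * ratings[item]
        den += abs(sim)
    return num / den if den else None
```

The patent's approach differs in that each party trains a prediction model locally and shares only parameters, rather than pooling user ratings on one server.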

What are the potential security risks associated with transmitting parameter information to a server apparatus?

Transmitting parameter information to a server apparatus could pose security risks such as interception of sensitive data during transmission and unauthorized access to the information stored on the server.

How does the federated learning apparatus ensure data privacy while updating the prediction model with integrated parameter information?

The federated learning apparatus ensures data privacy by only transmitting parameter information to the server, which then integrates the information without directly accessing the raw data used for training the models. This process helps protect user privacy and sensitive information.


Original Abstract Submitted

to generate information appropriate for a receiver of the information, a federated learning apparatus includes: a training section which trains a first prediction model that predicts an evaluation value corresponding to a combination of a user and an evaluation target on which the evaluation value is not obtained, using a first training data set including (i) evaluation values of users on evaluation targets and (ii) attribute values of the evaluation targets; a parameter information transmitting section which transmits, to a server apparatus, first parameter information indicating the first prediction model; a parameter information obtaining section which obtains, from the server apparatus, integrated parameter information obtained by integrating the first parameter information and second parameter information indicating a second prediction model trained using a second training data set; and an updating section which updates the first prediction model by replacing the first parameter information with the integrated parameter information.
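The "integrated parameter information" in the abstract could, for example, be produced by a weighted average of the first and second parameter information — one common choice, as in federated averaging; the abstract itself does not fix the integration rule, and the weights here are an assumption:

```python
def integrate_parameters(first_params, second_params, n_first=1, n_second=1):
    """Combine two clients' parameter vectors into integrated parameter
    information via a data-size-weighted average (one plausible rule;
    n_first and n_second are the clients' training-set sizes)."""
    total = n_first + n_second
    return [(n_first * a + n_second * b) / total
            for a, b in zip(first_params, second_params)]
```

The updating section described in the abstract would then replace the first parameter information with the vector this function returns.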