16815960. Active Federated Learning for Assistant Systems simplified abstract (Meta Platforms Technologies, LLC)

Active Federated Learning for Assistant Systems

Organization Name

Meta Platforms Technologies, LLC

Inventor(s)

Kshitiz Malik of Palo Alto, CA (US)

Seungwhan Moon of Seattle, WA (US)

Honglei Liu of San Mateo, CA (US)

Anuj Kumar of Santa Clara, CA (US)

Hongyuan Zhan of Seattle, WA (US)

Ahmed Aly of Kenmore, WA (US)

Active Federated Learning for Assistant Systems - A simplified explanation of the abstract

This abstract first appeared for US patent application 16815960 titled 'Active Federated Learning for Assistant Systems'.

Simplified Explanation

The abstract describes a federated learning method in which a client system trains a neural network model on its own local data and reports a user valuation that influences whether it is selected for future training rounds. The method involves (a minimal client-side sketch follows the list):

  • Receiving a current version of a neural network model, including its model parameters, from one or more remote servers
  • Training the model on labeled examples retrieved from a local data store to generate updated model parameters
  • Calculating a user valuation that represents the utility of training the model on those local examples
  • Sending the trained model and the user valuation back to the remote servers, where the valuation is associated with the likelihood of being selected for subsequent training
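
The sketch below walks through these client-side steps under stated assumptions: the "neural network" is reduced to a tiny logistic-regression model, training is plain gradient descent in NumPy, and the user valuation is taken to be the mean local training loss (a common heuristic, not a detail given in the patent). The function local_training_round and all parameter choices are illustrative, not the actual implementation.

```python
# Hypothetical sketch of one client-side round of active federated learning.
# Assumptions: a logistic-regression "model", NumPy-only training, and a
# loss-based user valuation; none of these details come from the patent.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def local_training_round(server_params, features, labels, lr=0.1, epochs=5):
    """Train on the local data store; return (updated_params, user_valuation)."""
    w = server_params.copy()              # start from the server's current model
    for _ in range(epochs):
        preds = sigmoid(features @ w)
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad                    # gradient descent on local examples

    # Hypothetical user valuation: mean cross-entropy loss on the local data.
    # A higher loss suggests the client's data is still poorly fit and thus
    # potentially more useful for further training.
    eps = 1e-9
    preds = sigmoid(features @ w)
    valuation = float(-np.mean(labels * np.log(preds + eps)
                               + (1 - labels) * np.log(1 - preds + eps)))
    return w, valuation


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))                    # local features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # local labels
    server_w = np.zeros(8)                           # current model from the server
    updated_w, valuation = local_training_round(server_w, X, y)
    print("user valuation:", round(valuation, 4))
    # The client would now send (updated_w, valuation) back to the remote servers.
```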

Potential Applications

This technology could be applied in various fields such as:

  • Federated machine learning across client devices
  • Artificial intelligence assistant systems
  • On-device data analysis

Problems Solved

This technology helps in:

  • Improving the efficiency of training neural network models across many client systems
  • Prioritizing client systems for subsequent training rounds by calculating user valuations

Benefits

The benefits of this technology include:

  • Optimizing neural network model training without centralizing users' local data
  • Providing a quantitative measure of the utility of each client's local training

Potential Commercial Applications

Potential commercial applications for this technology include:

  • Data analytics software
  • AI development tools

Possible Prior Art

Possible prior art for this technology includes:

  • Existing methods for distributed or federated training of neural network models

Unanswered Questions

1. How does the user valuation impact the selection of the client system for subsequent training?

2. What specific features of the neural network model are updated during training?


Original Abstract Submitted

In one embodiment, a method includes receiving, by a first client system, from one or more remote servers, a current version of a neural network model including multiple model parameters, training the neural network model on multiple examples retrieved from a local data store to generate multiple updated model parameters, wherein each of the examples includes one or more features and one or more labels, calculating a user valuation associated with the first client system, wherein the user valuation represents a measure of utility of training the neural network model on the multiple examples, and sending, to one or more of the remote servers, the trained neural network model and the user valuation, wherein the user valuation is associated with a likelihood of the first client system being selected for a subsequent training of the neural network model.
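
As a complement to the client-side sketch above, here is a hedged sketch of how remote servers might use the reported user valuations when choosing clients for the next round. The abstract only says the valuation is associated with a selection likelihood; the softmax-weighted sampling in select_clients below is an assumed interpretation for illustration, not the patented mechanism.

```python
# Hypothetical server-side selection of clients for the next training round,
# biased toward clients that reported higher user valuations.
import numpy as np


def select_clients(valuations, num_clients, temperature=1.0, seed=None):
    """Sample clients with probability increasing in their reported valuation."""
    client_ids = list(valuations.keys())
    scores = np.array([valuations[c] for c in client_ids]) / temperature
    probs = np.exp(scores - scores.max())   # softmax over reported valuations
    probs /= probs.sum()
    rng = np.random.default_rng(seed)
    chosen = rng.choice(client_ids, size=num_clients, replace=False, p=probs)
    return list(chosen)


if __name__ == "__main__":
    # Valuations reported by clients after their local training rounds.
    reported = {"client_a": 0.91, "client_b": 0.22, "client_c": 0.64, "client_d": 0.05}
    print(select_clients(reported, num_clients=2, seed=0))
```

Other weighting schemes, such as sampling proportionally to raw valuations or mixing in some uniform sampling for fairness, would fit the same interface.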