Telefonaktiebolaget LM Ericsson (publ) (20240095539). DISTRIBUTED MACHINE LEARNING WITH NEW LABELS USING HETEROGENEOUS LABEL DISTRIBUTION simplified abstract

From WikiPatents

DISTRIBUTED MACHINE LEARNING WITH NEW LABELS USING HETEROGENEOUS LABEL DISTRIBUTION

Organization Name

Telefonaktiebolaget LM Ericsson (publ)

Inventor(s)

Gudur Gautham Krishna of Chennai (IN)

Satheesh Kumar Perepu of Chennai (IN)

DISTRIBUTED MACHINE LEARNING WITH NEW LABELS USING HETEROGENEOUS LABEL DISTRIBUTION - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240095539, titled 'DISTRIBUTED MACHINE LEARNING WITH NEW LABELS USING HETEROGENEOUS LABEL DISTRIBUTION'.

Simplified Explanation

The abstract describes a method for distributed machine learning in which a first dataset, with a first set of labels, is provided to multiple local computing devices. The local devices train local ML models and return model probability values, which are then used to build a weights matrix; sampling with that matrix generates a new set of model probability values.

  • First dataset with a first set of labels provided to local computing devices
  • Local ML models trained on the dataset
  • Model probability values generated by the local models
  • Weights matrix generated from the model probability values
  • New model probability values generated by sampling with the weights matrix
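The steps above can be sketched numerically. The abstract does not specify how the weights matrix is constructed or how sampling proceeds, so the version below is a minimal sketch under assumed conventions: the probability vectors from two local models are stacked, each class column is normalised to form the weights matrix, and the new probability values are produced by sampling, per class, which device's estimate to keep. All numeric values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Probability vectors reported by two local models over 4 classes
# (hypothetical values; the abstract gives no concrete numbers).
p1 = np.array([0.7, 0.2, 0.1, 0.0])   # first local model
p2 = np.array([0.4, 0.3, 0.2, 0.1])   # second local model

# One assumed construction of the weights matrix: stack the per-device
# probability vectors and normalise each class column to sum to 1.
stacked = np.vstack([p1, p2])             # shape (2 devices, 4 classes)
weights = stacked / stacked.sum(axis=0)   # column-normalised weights matrix

# New probability values by sampling with the weights matrix: for each
# class, draw which device's estimate to keep, weighted by that column.
choices = [rng.choice(2, p=weights[:, c]) for c in range(stacked.shape[1])]
p3 = np.array([stacked[d, c] for c, d in enumerate(choices)])
p3 = p3 / p3.sum()                        # renormalise to a distribution

print(p3)
```

Column normalisation is only one plausible reading of "generating a weights matrix"; the claims of the application itself would determine the actual construction.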

Potential Applications

This technology can be applied in various fields such as:

  • Healthcare for analyzing medical data
  • Finance for fraud detection
  • Marketing for customer behavior analysis

Problems Solved

This technology helps in:

  • Efficient distributed machine learning
  • Collaborative model training
  • Improved accuracy with ensemble learning

Benefits

The benefits of this technology include:

  • Scalability with distributed computing
  • Enhanced model performance
  • Faster training process

Potential Commercial Applications

This technology can be commercially applied in:

  • Cloud computing services
  • Data analytics platforms
  • ML model training services

Possible Prior Art

Possible prior art for this technology includes:

  • Federated learning methods
  • Ensemble learning techniques

Unanswered Questions

How does this method handle data privacy and security concerns?

The abstract does not mention any specific mechanisms for ensuring data privacy and security during the distributed machine learning process.

What is the computational overhead of generating and using the weights matrix?

The abstract does not provide details on the computational resources required for generating and utilizing the weights matrix in the distributed machine learning process.


Original Abstract Submitted

a method for distributed machine learning (ml) which includes providing a first dataset including a first set of labels to a plurality of local computing devices including a first local computing device and a second local computing device. the method further includes receiving, from the first local computing device, a first set of ml model probabilities values from training a first local ml model using the first set of labels. the method further includes receiving, from the second local computing device, a second set of ml model probabilities values from training a second local ml model using the first set of labels and one or more labels different from any label in the first set of labels. the method further includes generating a weights matrix using the received first set of ml model probabilities values and the received second set of ml model probabilities values. the method further includes generating a third set of ml model probabilities values by sampling using the generated weights matrix.
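The distinctive part of the abstract is that the second device trains on labels absent from the first set, so the two devices return probability values over different label sets. One way this could be handled, sketched below under assumed conventions, is to align both probability vectors over the union of labels (padding unseen labels with zero) before forming the weights matrix. The label names, probability values, and the weighted-average combination standing in for the sampling step are all hypothetical.

```python
import numpy as np

# Hypothetical label sets: the second device observed one new label ("d")
# beyond the shared first set, as in the abstract's heterogeneous setting.
first_labels = ["a", "b", "c"]
second_labels = ["a", "b", "c", "d"]
all_labels = first_labels + [l for l in second_labels if l not in first_labels]

# Hypothetical probability values returned after local training.
p_first = {"a": 0.6, "b": 0.3, "c": 0.1}
p_second = {"a": 0.3, "b": 0.2, "c": 0.2, "d": 0.3}

def align(probs, labels):
    """Pad a device's probabilities with zeros for labels it never saw."""
    return np.array([probs.get(l, 0.0) for l in labels])

stacked = np.vstack([align(p_first, all_labels),
                     align(p_second, all_labels)])

# Weights matrix from the two aligned sets of probability values
# (simple column normalisation; the abstract leaves the construction open).
weights = stacked / np.maximum(stacked.sum(axis=0), 1e-12)

# Third set of probability values: a weights-matrix-weighted combination,
# standing in for the sampling step described in the abstract.
p_third = (weights * stacked).sum(axis=0)
p_third /= p_third.sum()
print(dict(zip(all_labels, p_third.round(3))))
```

Note that the new label "d" ends up with nonzero mass in the third set even though the first device never saw it, which is the point of combining heterogeneous label distributions.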