US Patent Application 17828157. METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR MODEL TRAINING simplified abstract


METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR MODEL TRAINING

Organization Name

Dell Products L.P.


Inventor(s)

Jiacheng Ni of Shanghai (CN)


Zijia Wang of WeiFang (CN)


Zhen Jia of Shanghai (CN)


METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR MODEL TRAINING - A simplified explanation of the abstract

  • This abstract appeared in US patent application number 17828157, titled 'METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR MODEL TRAINING'

Simplified Explanation

This abstract describes a method, electronic device, and computer program product for model training. The method involves receiving, at an edge device, a machine learning model and distilled samples from a cloud server. The machine learning model is initially trained on the cloud server using initial samples, and the distilled samples are derived from those initial samples through distillation. The edge device then acquires a newly collected input sample and retrains the machine learning model using the distilled samples together with the new sample. Because the edge device updates the model from a small distilled sample set rather than the full initial dataset, the model can be updated more efficiently, which in turn improves its accuracy.
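
To make the workflow concrete, below is a minimal, hypothetical sketch of the edge-side retraining step, written in Python with PyTorch. The model class, tensor names, and hyperparameters (EdgeModel, distilled_x, distilled_y, new_x, new_y, the optimizer, batch size, and epoch count) are illustrative assumptions and do not come from the patent application; the sketch only shows the general idea of combining a small distilled sample set with newly collected samples for retraining on the edge.

```python
# Hypothetical sketch of the edge-side retraining step described above.
# All names and hyperparameters here are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader


class EdgeModel(nn.Module):
    """Stand-in for the machine learning model received from the cloud server."""

    def __init__(self, in_dim: int = 16, num_classes: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 32),
            nn.ReLU(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        return self.net(x)


def retrain_on_edge(model, distilled_x, distilled_y, new_x, new_y,
                    epochs: int = 5, lr: float = 1e-3):
    """Retrain the cloud-trained model on the edge device using the small
    distilled sample set combined with newly collected input samples."""
    dataset = ConcatDataset([
        TensorDataset(distilled_x, distilled_y),  # distilled samples from the cloud server
        TensorDataset(new_x, new_y),              # newly collected samples on the edge device
    ])
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        for xb, yb in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            optimizer.step()
    return model


# Example usage with synthetic tensors standing in for real data.
model = EdgeModel()
distilled_x, distilled_y = torch.randn(20, 16), torch.randint(0, 4, (20,))
new_x, new_y = torch.randn(5, 16), torch.randint(0, 4, (5,))
model = retrain_on_edge(model, distilled_x, distilled_y, new_x, new_y)
```

The point of the sketch is that the retraining loop touches only the compact distilled set plus the new sample(s), so the edge device never needs the full initial training data held at the cloud server.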


Original Abstract Submitted

Embodiments of the present disclosure provide a method, an electronic device, and a computer program product for model training. The method for model training includes: receiving, at an edge device, a machine learning model and distilled samples from a cloud server, wherein the machine learning model is trained on the basis of initial samples at the cloud server, and the distilled samples are obtained by distillation of the initial samples. The method further includes: acquiring, at the edge device, a newly collected input sample, and retraining, by the edge device, the machine learning model by using the distilled samples and the input sample. In this way, by updating a model using a distilled sample set at an edge device, the efficiency of updating the model can be improved, and then the accuracy of the model can be improved.