18157277. NEURAL NETWORK DISTILLATION METHOD AND APPARATUS simplified abstract (Huawei Technologies Co., Ltd.)

NEURAL NETWORK DISTILLATION METHOD AND APPARATUS

Organization Name

Huawei Technologies Co., Ltd.

Inventor(s)

Pengxiang Cheng of Shenzhen (CN)

Zhenhua Dong of Shenzhen (CN)

Xiuqiang He of Shenzhen (CN)

Xiaolian Zhang of Shenzhen (CN)

Shi Yin of Shenzhen (CN)

Yuelin Hu of Shenzhen (CN)

NEURAL NETWORK DISTILLATION METHOD AND APPARATUS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18157277, titled 'NEURAL NETWORK DISTILLATION METHOD AND APPARATUS'.

Simplified Explanation

The patent application describes a method and apparatus for neural network distillation in the field of artificial intelligence. Here are the key points:

  • The method involves obtaining a sample set consisting of biased and unbiased data sets.
  • The biased data set contains biased samples, while the unbiased data set contains unbiased samples.
  • A first distillation manner is determined based on the data features of the sample set.
  • In the first distillation manner, a teacher model is trained using the unbiased data set, and a student model is trained using the biased data set.
  • A first neural network is then trained on the biased and unbiased data sets in the first distillation manner, yielding an updated first neural network (a minimal sketch of this pipeline follows the list).
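
To make the steps concrete, here is a minimal sketch of the teacher-unbiased / student-biased arrangement, assuming PyTorch-style models and DataLoaders. The patent publishes no code, so the select_distillation_manner() heuristic, the model interfaces, and all hyperparameters below are illustrative assumptions, not the claimed implementation.

```python
# A minimal sketch, assuming PyTorch; every name and hyperparameter here is
# an illustrative assumption rather than the patent's implementation.
import torch
import torch.nn.functional as F

def select_distillation_manner(biased_set, unbiased_set):
    """Hypothetical stand-in for 'determining a first distillation manner
    based on data features of the sample set': here we only check that an
    unbiased set exists for the teacher to train on."""
    return "teacher_unbiased_student_biased" if len(unbiased_set) > 0 else "plain"

def train_teacher(teacher, unbiased_loader, epochs=5, lr=1e-3):
    """In the first distillation manner, the teacher model is trained on the
    unbiased data set (binary labels y are floats in {0.0, 1.0})."""
    opt = torch.optim.Adam(teacher.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in unbiased_loader:
            opt.zero_grad()
            loss = F.binary_cross_entropy_with_logits(teacher(x).squeeze(-1), y)
            loss.backward()
            opt.step()
    return teacher

def train_student(student, teacher, biased_loader, epochs=5, lr=1e-3, alpha=0.5):
    """The student model is trained on the biased data set; the frozen
    teacher supplies soft targets, distilling its unbiased knowledge."""
    teacher.eval()
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in biased_loader:
            opt.zero_grad()
            logits = student(x).squeeze(-1)
            with torch.no_grad():
                soft = torch.sigmoid(teacher(x).squeeze(-1))  # teacher's soft targets
            hard_loss = F.binary_cross_entropy_with_logits(logits, y)
            soft_loss = F.binary_cross_entropy_with_logits(logits, soft)
            (alpha * hard_loss + (1.0 - alpha) * soft_loss).backward()
            opt.step()
    return student  # the "updated first neural network"
```

Under this arrangement the student never sees the unbiased samples directly; the teacher's predictions carry that signal into the student's training on the biased data.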

Potential Applications:

  • This technology can be applied in various fields where neural networks are used, such as image recognition, natural language processing, and recommendation systems.
  • It can improve the performance and accuracy of neural networks by distilling knowledge from biased and unbiased data sets.

Problems Solved:

  • Neural networks often suffer from biases in the training data, which can lead to inaccurate predictions or discriminatory outcomes.
  • This technology addresses the problem of bias by incorporating both biased and unbiased data sets in the training process.
  • It aims to create a more balanced and accurate neural network model.

Benefits:

  • By training a teacher model using unbiased data and a student model using biased data, this method can help identify and mitigate biases in the neural network.
  • The distillation process transfers knowledge from the teacher model to the student model, improving the overall performance of the neural network (a typical distillation loss is sketched after this list).
  • The use of both biased and unbiased data sets helps create a more comprehensive and fair neural network model.
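
To illustrate how that teacher-to-student transfer typically works, below is the standard temperature-scaled distillation loss in the style of Hinton et al. (2015). The patent may define its objective differently, so treat this as a generic sketch rather than the claimed method; temperature and alpha are assumed hyperparameters.

```python
# A generic knowledge-distillation loss (Hinton et al., 2015), shown only to
# illustrate teacher-to-student transfer; not taken from the patent.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: match the student's softened distribution to the teacher's.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # standard correction for the gradient scale
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * hard + (1.0 - alpha) * soft
```

The alpha weight balances fitting the ground-truth labels against matching the teacher's predictions, and higher temperatures expose more of the teacher's inter-class structure to the student.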

Original Abstract Submitted

This application provides a neural network distillation method and apparatus in the field of artificial intelligence. The method includes: obtaining a sample set, where the sample set includes a biased data set and an unbiased data set, the biased data set includes biased samples, and the unbiased data set includes unbiased samples; determining a first distillation manner based on data features of the sample set, where, in the first distillation manner, a teacher model is trained by using the unbiased data set and a student model is trained by using the biased data set; and training a first neural network based on the biased data set and the unbiased data set in the first distillation manner, to obtain an updated first neural network.