Intel Corporation (20240303485). APPARATUS, METHOD, DEVICE AND MEDIUM FOR LOSS BALANCING IN MULTI-TASK LEARNING simplified abstract


APPARATUS, METHOD, DEVICE AND MEDIUM FOR LOSS BALANCING IN MULTI-TASK LEARNING

Organization Name

Intel Corporation

Inventor(s)

Wenjing Kang of Beijing (CN)

Xiaochuan Luo of Beijing (CN)

Xianchao Xu of Beijing (CN)

APPARATUS, METHOD, DEVICE AND MEDIUM FOR LOSS BALANCING IN MULTI-TASK LEARNING - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240303485, titled 'APPARATUS, METHOD, DEVICE AND MEDIUM FOR LOSS BALANCING IN MULTI-TASK LEARNING'.

The disclosure presents a system for balancing losses in multi-task learning (MTL): the shared layers of a deep neural network are initialized from a pre-trained network, and each task's loss weight is adjusted based on its loss change rate and gradient magnitude.

  • System for loss balancing in multi-task learning
  • Utilizes pre-trained neural network to initialize shared layers
  • Adjusts task weights based on loss change rates and gradient magnitudes
  • Custom interval calculation for mini-batch training steps
  • Enhances performance of deep neural networks in MTL
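These features correspond to a standard hard-parameter-sharing MTL setup: a shared trunk feeding several task-specific heads, with one loss weight per task. The following is a minimal PyTorch sketch of such a model, assuming illustrative layer sizes and names (SharedTrunkMTL, combined_loss) that are not taken from the patent; the shared trunk stands in for the layers that the patented scheme initializes from a pre-trained network.

  # Minimal PyTorch sketch of a hard-parameter-sharing MTL model with
  # per-task loss weights. Layer sizes and names are illustrative assumptions,
  # not the patent's implementation; the shared trunk stands in for the layers
  # that the patented scheme initializes from a pre-trained network.
  import torch
  import torch.nn as nn

  class SharedTrunkMTL(nn.Module):
      def __init__(self, in_dim=128, hidden=256, num_tasks=2):
          super().__init__()
          # Shared layers (would be initialized by copying pre-trained weights).
          self.shared = nn.Sequential(
              nn.Linear(in_dim, hidden), nn.ReLU(),
              nn.Linear(hidden, hidden), nn.ReLU(),
          )
          # One lightweight head per task.
          self.heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(num_tasks)])
          # Per-task loss weights, adjusted during training by the loss balancer.
          self.register_buffer("task_weights", torch.ones(num_tasks))

      def forward(self, x):
          z = self.shared(x)
          return [head(z) for head in self.heads]

  def combined_loss(model, per_task_losses):
      # Weighted sum of the task losses; these weights are what loss balancing tunes.
      return sum(w * l for w, l in zip(model.task_weights, per_task_losses))

In the patented scheme, task_weights would be recomputed from statistics gathered over custom intervals of mini-batch training steps rather than kept fixed, as sketched after the original abstract below.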

Potential Applications: This technology can be applied in fields where multi-task learning is widely used, such as natural language processing, computer vision, and speech recognition.

Problems Solved: Addresses imbalanced losses in multi-task learning, which can lead to suboptimal performance of deep neural networks.

Benefits: Improves the efficiency and effectiveness of multi-task learning models by dynamically adjusting task weights based on loss change rates and gradient magnitudes.

Commercial Applications: This technology can be valuable in industries such as healthcare, finance, and e-commerce where multi-task learning is used to improve predictive modeling and decision-making processes.

Questions about Loss Balancing in MTL:

1. How does this technology improve the performance of deep neural networks in multi-task learning?
2. What are the key factors considered when adjusting task weights based on loss change rates and gradient magnitudes?

Frequently Updated Research: Stay informed about the latest advancements in multi-task learning and loss balancing techniques to enhance the performance of deep neural networks.


Original Abstract Submitted

The disclosure provides an apparatus, method, device, and medium for loss balancing in MTL. The apparatus includes interface circuitry and processor circuitry. The processor circuitry is configured to: initialize parameters of shared layers of a deep neural network for MTL using a pre-trained neural network; determine a custom interval consisting of a designated number of mini-batch training steps and a designated window of n custom intervals (n > 2); for each task, calculate a loss change rate between each of the n−1 pairs of neighboring custom intervals within the designated window prior to the present custom interval and a gradient magnitude with respect to selected shared weights within that window; and adjust a weight of the task based on the calculated loss change rate and gradient magnitude with respect to the selected shared weights.
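As a rough illustration of the bookkeeping described above, the NumPy sketch below computes the n−1 loss change rates between neighboring custom intervals in the window and folds them, together with a gradient magnitude on selected shared weights, into a task-weight update. The aggregation of the rates and the exact update rule are hypothetical assumptions; the abstract only states that the weight is adjusted based on these two quantities.

  # Rough NumPy sketch of the per-task bookkeeping described in the abstract.
  # The aggregation of the n-1 change rates and the final update rule are
  # hypothetical; the abstract only states that the task weight is adjusted
  # based on the loss change rate and the gradient magnitude.
  import numpy as np

  def loss_change_rates(interval_losses):
      """interval_losses: mean loss of one task over each of the n custom
      intervals in the window prior to the present interval (n > 2).
      Returns the n-1 relative change rates between neighboring intervals."""
      losses = np.asarray(interval_losses, dtype=float)
      return (losses[1:] - losses[:-1]) / np.maximum(np.abs(losses[:-1]), 1e-8)

  def adjust_task_weight(weight, interval_losses, grad_magnitude,
                         mean_grad_magnitude, step=0.1):
      """Hypothetical update: raise the weight of a task whose loss has stopped
      improving and whose gradient on the selected shared weights is weak."""
      mean_rate = loss_change_rates(interval_losses).mean()   # < 0 while improving
      grad_ratio = grad_magnitude / (mean_grad_magnitude + 1e-8)
      adjustment = step * (mean_rate + (1.0 - grad_ratio))
      return max(weight * (1.0 + adjustment), 1e-3)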