17920623. Supervised Contrastive Learning with Multiple Positive Examples simplified abstract (Google LLC)
Supervised Contrastive Learning with Multiple Positive Examples
Organization Name
Google LLC
Inventor(s)
Dilip Krishnan of Arlington MA (US)
Prannay Khosla of Cambridge MA (US)
Piotr Teterwak of Boston MA (US)
Aaron Yehuda Sarna of Cambridge MA (US)
Aaron Joseph Maschinot of Somerville MA (US)
Ce Liu of Cambridge MA (US)
Philip John Isola of Cambridge MA (US)
Yonglong Tian of Cambridge MA (US)
Chen Wang of Jersey City NJ (US)
Supervised Contrastive Learning with Multiple Positive Examples - A simplified explanation of the abstract
This abstract first appeared for US patent application 17920623, titled 'Supervised Contrastive Learning with Multiple Positive Examples'.
Simplified Explanation
The present disclosure introduces an improved training method for supervised contrastive learning that allows training to proceed simultaneously across multiple positive and negative examples. It adapts the batch contrastive loss, which has proven highly effective at learning representations in the self-supervised setting, to the fully supervised setting.
- The disclosed methodology is a supervised version of the batch contrastive loss.
- It enables contrastive learning to be applied in the fully supervised setting.
- Learning can occur simultaneously across multiple positive examples.
- The technique improves the training process for supervised contrastive learning.
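The core idea in the points above can be sketched in code: for each anchor example, all other examples sharing its label are treated as positives, and the loss averages over all of them rather than contrasting against a single positive. The following is an illustrative NumPy sketch of a SupCon-style loss; the function name `supcon_loss` and the temperature value are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Illustrative supervised contrastive loss with multiple positives.

    features: (N, D) L2-normalized embeddings
    labels:   (N,) integer class labels
    Each anchor is pulled toward every same-label example (its positives)
    and contrasted against all other examples in the batch.
    """
    n = features.shape[0]
    sim = features @ features.T / temperature          # pairwise similarities
    sim = sim - sim.max(axis=1, keepdims=True)         # numerical stability
    not_self = ~np.eye(n, dtype=bool)                  # exclude self-contrast
    exp_sim = np.exp(sim) * not_self
    # log-probability of each pair against all other examples in the batch
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)
    # average the log-probability over all positives of each anchor
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1) / np.maximum(pos_counts, 1)
    # anchors with no positives in the batch contribute nothing
    return float(-mean_log_prob_pos[pos_counts > 0].mean())
```

Averaging over every same-label example is what distinguishes this from the standard self-supervised contrastive loss, which admits only one positive (an augmented view) per anchor.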
Potential Applications
This technology has potential applications in various fields, including:
- Natural language processing
- Computer vision
- Speech recognition
- Recommendation systems
- Anomaly detection
Problems Solved
The disclosed methodology addresses the following problems:
- Limited representation quality from traditional supervised training objectives
- Inability of standard contrastive losses, designed for the self-supervised setting, to exploit label information
- Inability to train simultaneously across multiple positive examples
Benefits
The benefits of this technology include:
- Enhanced learning of representations in the supervised setting
- Simultaneous training across multiple positive examples
- Improved performance in various applications, such as natural language processing and computer vision
Original Abstract Submitted
The present disclosure provides an improved training methodology that enables supervised contrastive learning to be simultaneously performed across multiple positive and negative training examples. In particular, example aspects of the present disclosure are directed to an improved, supervised version of the batch contrastive loss, which has been shown to be very effective at learning powerful representations in the self-supervised setting. Thus, the proposed techniques adapt contrastive learning to the fully supervised setting and also enable learning to occur simultaneously across multiple positive examples.
Classification Codes
- G06N3/09
- G06V10/74
- G06V10/776
- G06V10/82