17962330. TRAINING A BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT) MODEL TO GENERATE A SOFTWARE REPRESENTATION FROM AN INTERMEDIATE REPRESENTATION OF A SOFTWARE PACKAGE simplified abstract (International Business Machines Corporation)
TRAINING A BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT) MODEL TO GENERATE A SOFTWARE REPRESENTATION FROM AN INTERMEDIATE REPRESENTATION OF A SOFTWARE PACKAGE
Organization Name
International Business Machines Corporation
Inventor(s)
Soyeon Park of Gyeongsangbuk-Do (KR)
Dhilung Kirat of Hartsdale NY (US)
Sanjeev Das of White Plains NY (US)
Douglas Lee Schales of Ardsley NY (US)
Taesung Lee of Ridgefield CT (US)
Jiyong Jang of Chappaqua NY (US)
TRAINING A BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT) MODEL TO GENERATE A SOFTWARE REPRESENTATION FROM AN INTERMEDIATE REPRESENTATION OF A SOFTWARE PACKAGE - A simplified explanation of the abstract
This abstract first appeared for US patent application 17962330, titled 'TRAINING A BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT) MODEL TO GENERATE A SOFTWARE REPRESENTATION FROM AN INTERMEDIATE REPRESENTATION OF A SOFTWARE PACKAGE'.
Simplified Explanation
The abstract describes a method for training a bidirectional encoder representations from transformers (BERT) model to generate a software representation. The trained BERT model takes an intermediate representation (IR) of a software package as input and outputs a corresponding software representation.
Explanation:
- Training a BERT model to generate software representations
- Inputting an IR of a software package to the trained BERT model
- Receiving a software representation as output from the BERT model
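The steps above can be sketched with the Hugging Face transformers library. The application does not disclose a tokenization scheme, model size, or pooling strategy, so the IR opcode vocabulary, the tiny BERT configuration, and the use of the [CLS] embedding as the software representation are all illustrative assumptions:

```python
import torch
from transformers import BertConfig, BertModel

# Hypothetical vocabulary of IR opcodes (illustrative, not from the patent)
vocab = {"[PAD]": 0, "[CLS]": 1, "[SEP]": 2,
         "alloca": 3, "load": 4, "store": 5, "add": 6, "ret": 7}

# A deliberately tiny BERT so the sketch runs without pretrained weights
config = BertConfig(vocab_size=len(vocab), hidden_size=32,
                    num_hidden_layers=2, num_attention_heads=2,
                    intermediate_size=64)
model = BertModel(config)
model.eval()

# An IR instruction sequence standing in for a software package
ir_tokens = ["[CLS]", "alloca", "store", "load", "add", "ret", "[SEP]"]
input_ids = torch.tensor([[vocab[t] for t in ir_tokens]])

with torch.no_grad():
    out = model(input_ids)

# Take the [CLS] embedding as a fixed-size software representation
software_repr = out.last_hidden_state[:, 0, :]
print(software_repr.shape)  # torch.Size([1, 32])
```

In practice the model would first be trained (see the original abstract below for the claimed pipeline), and the resulting vector could feed downstream tasks such as code similarity or malware analysis.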
Potential Applications
This technology could be applied in:
- Software development
- Natural language processing tasks
- Code analysis and understanding
Problems Solved
This technology helps in:
- Improving software representation generation
- Enhancing code understanding and analysis
- Facilitating natural language processing tasks in software development
Benefits
The benefits of this technology include:
- More accurate software representations
- Better code understanding and analysis
- Improved performance in natural language processing tasks related to software
Potential Commercial Applications
Potential commercial applications of this technology could be in:
- Software development tools
- Code analysis software
- Natural language processing applications in the software industry
Possible Prior Art
Prior work has applied machine learning models to software representation generation, but the specific approach of training a BERT model on an intermediate representation of a software package may be novel.
Unanswered Questions
How does this technology compare to other methods of software representation generation using machine learning models?
This article does not provide a comparison with other methods, such as LSTM or CNN models, for software representation generation.
What are the limitations or challenges of implementing this technology in real-world software development environments?
The article does not address the potential challenges or limitations of integrating this technology into existing software development workflows.
Original Abstract Submitted
A computer-implemented method according to one embodiment includes training a bidirectional encoder representations from transformers (BERT) model to generate a software representation. An intermediate representation (IR) of a software package is input to the trained BERT model, and a software representation corresponding to the software package is received as output from the trained BERT model. A computer program product according to another embodiment includes a computer readable storage medium having program instructions embodied therewith. The program instructions are readable and/or executable by a computer to cause the computer to perform the foregoing method. A system according to another embodiment includes a processor, and logic integrated with the processor, executable by the processor, or integrated with and executable by the processor. The logic is configured to perform the foregoing method.
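The abstract claims "training a BERT model" without specifying the training objective. A plausible reading is standard masked-language-model pretraining over IR token sequences; the sketch below shows one training step under that assumption. The vocabulary, the masked position, and the hyperparameters are illustrative, not details from the application:

```python
import torch
from transformers import BertConfig, BertForMaskedLM

# Hypothetical IR opcode vocabulary (illustrative, not from the patent)
vocab = {"[PAD]": 0, "[CLS]": 1, "[SEP]": 2, "[MASK]": 3,
         "alloca": 4, "load": 5, "store": 6, "add": 7, "ret": 8}

config = BertConfig(vocab_size=len(vocab), hidden_size=32,
                    num_hidden_layers=2, num_attention_heads=2,
                    intermediate_size=64)
model = BertForMaskedLM(config)

# One IR instruction sequence; mask the "load" token and ask the
# model to reconstruct it (the standard BERT masked-LM objective).
tokens = ["[CLS]", "alloca", "store", "load", "add", "ret", "[SEP]"]
input_ids = torch.tensor([[vocab[t] for t in tokens]])
labels = torch.full_like(input_ids, -100)  # -100 = ignored by the loss
labels[0, 3] = input_ids[0, 3]             # only the masked slot is scored
input_ids[0, 3] = vocab["[MASK]"]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss = model(input_ids=input_ids, labels=labels).loss
loss.backward()
optimizer.step()
print(f"masked-LM loss after one step: {loss.item():.3f}")
```

After pretraining converges, the encoder would be used as in the inference step described earlier: feed an IR sequence in, read a fixed-size software representation out.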