17791897. KNOWLEDGE GRAPH PRE-TRAINING METHOD BASED ON STRUCTURAL CONTEXT INFORMATION simplified abstract (ZHEJIANG UNIVERSITY)
KNOWLEDGE GRAPH PRE-TRAINING METHOD BASED ON STRUCTURAL CONTEXT INFORMATION
Organization Name
ZHEJIANG UNIVERSITY
Inventor(s)
HUAJUN Chen of ZHEJIANG PROVINCE, P.R. (CN)
GANQIANG Ye of HANGZHOU, ZHEJIANG PROVINCE (CN)
KNOWLEDGE GRAPH PRE-TRAINING METHOD BASED ON STRUCTURAL CONTEXT INFORMATION - A simplified explanation of the abstract
This abstract first appeared for US patent application 17791897 titled 'KNOWLEDGE GRAPH PRE-TRAINING METHOD BASED ON STRUCTURAL CONTEXT INFORMATION'.
Simplified Explanation
The present invention describes a knowledge graph pre-training method that uses structural context information to optimize the representation of a target triple. The method constructs an instance containing the target triple's context triples, encodes each context triple into an integration vector, combines these vectors into a context vector sequence, and encodes that sequence to obtain a structural representation vector for the target triple. A general task module then computes a label prediction value from this structural representation vector, and the vector is updated using the cross-entropy loss between the prediction and the label truth value until training completes, yielding an optimized representation of the target triple.
- Method for knowledge graph pre-training based on structural context information
- Construction of an instance with context triples
- Encoding of context triples to integration vectors
- Combination of integration vectors into a context vector sequence
- Encoding of the context vector sequence to obtain a structural representation vector for the target triple
- Utilization of a general task module to obtain label predictions and update the structural representation vector through training (a minimal code sketch follows this list)
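The abstract does not fix any concrete network architectures, so the following is a minimal, hypothetical sketch (in PyTorch) of how the three modules could fit together. The class names, embedding sizes, use of a Transformer encoder for the context sequence, and the binary triple label are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch only: module names, dimensions, and the Transformer-based
# sequence encoder are assumptions for illustration.
import torch
import torch.nn as nn

class TripleIntegrationModule(nn.Module):
    """Encodes one (head, relation, tail) triple into an integration vector."""
    def __init__(self, num_entities, num_relations, dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.proj = nn.Linear(3 * dim, dim)

    def forward(self, h, r, t):
        # Concatenate head, relation, and tail embeddings, then project to one vector.
        return self.proj(torch.cat([self.ent(h), self.rel(r), self.ent(t)], dim=-1))

class StructuralInformationModule(nn.Module):
    """Encodes the context vector sequence into a structural representation vector."""
    def __init__(self, dim, num_heads=4, num_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, context_seq):
        # context_seq: (batch, sequence_length, dim); position 0 is assumed to hold
        # the target triple's integration vector and is read out after encoding.
        return self.encoder(context_seq)[:, 0, :]

class GeneralTaskModule(nn.Module):
    """Maps the structural representation vector to a label prediction value."""
    def __init__(self, dim, num_labels=2):
        super().__init__()
        self.classifier = nn.Linear(dim, num_labels)

    def forward(self, structural_vec):
        return self.classifier(structural_vec)

# One forward pass through the pipeline with random placeholder data.
num_entities, num_relations, dim = 1000, 50, 64
integrator = TripleIntegrationModule(num_entities, num_relations, dim)
structural = StructuralInformationModule(dim)
task = GeneralTaskModule(dim)

batch, num_context = 8, 5
# Position 0 of each sequence is the target triple; the rest are its context triples.
h = torch.randint(0, num_entities, (batch, 1 + num_context))
r = torch.randint(0, num_relations, (batch, 1 + num_context))
t = torch.randint(0, num_entities, (batch, 1 + num_context))
labels = torch.randint(0, 2, (batch,))                # label truth values (assumed binary)

integration_vectors = integrator(h, r, t)             # (batch, 1 + num_context, dim)
structural_vector = structural(integration_vectors)   # (batch, dim)
logits = task(structural_vector)                      # label prediction values
loss = nn.functional.cross_entropy(logits, labels)    # cross-entropy vs. truth labels
```

The Transformer encoder is only one plausible choice for the structural information module; any sequence encoder that pools the context vector sequence into a single representation would fit the same pipeline.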
Potential Applications
The technology can be applied in various fields such as natural language processing, information retrieval, and knowledge graph analysis.
Problems Solved
1. Enhanced representation of target triples in knowledge graphs
2. Improved accuracy of label predictions for triples
Benefits
1. Better understanding of relationships in knowledge graphs
2. Increased efficiency in information retrieval tasks
Potential Commercial Applications
The technology can be utilized in search engines, recommendation systems, and data analytics platforms.
Possible Prior Art
Prior art may include methods for knowledge graph embedding and pre-training techniques in machine learning.
Unanswered Questions
How does this method compare to existing knowledge graph pre-training approaches?
The article does not provide a direct comparison with other pre-training methods in the field.
What are the specific use cases where this technology can outperform traditional methods?
The article does not delve into specific scenarios where this method can demonstrate superior performance compared to conventional approaches.
Original Abstract Submitted
Disclosed in the present invention is a knowledge graph pre-training method based on structural context information, the method comprising: for a target triple, constructing an instance comprising context triples, and adopting a triple integration module to encode each of the context triples in the instance to obtain an integration vector; combining the integration vectors for all context triples in the instance into a context vector sequence, and adopting a structural information module to encode the context vector sequence to obtain a structural representation vector for the triple; adopting a general task module to calculate the structural representation vector for the triple, and obtaining a label prediction value for the triples, updating the structural representation vector for the triple based on cross-entropy loss of the label prediction value for the triple and a label truth value for the triple until the completion of the training, so as to obtain an optimized structural representation vector for the target triple. The structural representation vector for the triple obtained by this method incorporates the context information.
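The abstract's final step, updating the structural representation vector "until the completion of the training", could look like the loop below. This continues the hypothetical sketch earlier in the article and reuses its modules and tensors; the optimizer choice, learning rate, and epoch count are assumptions.

```python
# Continuing the illustrative sketch above: repeat the cross-entropy update until
# training completes. Optimizer, learning rate, and epoch count are assumed.
params = list(integrator.parameters()) + list(structural.parameters()) + list(task.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)

for epoch in range(100):
    optimizer.zero_grad()
    structural_vector = structural(integrator(h, r, t))  # structural representation vector
    logits = task(structural_vector)                     # label prediction value
    loss = nn.functional.cross_entropy(logits, labels)   # vs. label truth value
    loss.backward()
    optimizer.step()

# After training, encoding the target triple again yields the optimized,
# context-aware structural representation vector described in the abstract.
optimized_vector = structural(integrator(h, r, t)).detach()
```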