18356958. DIRECTED GRAPH AUTOENCODER DEVICES AND METHODS simplified abstract (INTERNATIONAL BUSINESS MACHINES CORPORATION)

From WikiPatents

DIRECTED GRAPH AUTOENCODER DEVICES AND METHODS

Organization Name

INTERNATIONAL BUSINESS MACHINES CORPORATION

Inventor(s)

Georgios Kollias of White Plains NY (US)

Vasileios Kalantzis of White Plains NY (US)

Tsuyoshi Ide of Harrison NY (US)

Aurelie Chloe Lozano of Scarsdale NY (US)

Naoki Abe of Rye NY (US)

DIRECTED GRAPH AUTOENCODER DEVICES AND METHODS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18356958, titled 'DIRECTED GRAPH AUTOENCODER DEVICES AND METHODS'.

The abstract describes a directed graph autoencoder device that utilizes a graph convolutional layer to generate transformed dual vector representations of nodes in a graph.

  • The device includes memories and a processor implementing the graph convolutional layer.
  • The layer applies source and target weight matrices to input dual vector representations of nodes to generate transformed dual vector representations.
  • Each node's input dual vector representation consists of a source vector representation (the node in its role as an edge source) and a target vector representation (the node in its role as an edge target).
  • The layer scales the transformed dual vector representations and performs message passing using the scaled representations.

Potential Applications

  • Network analysis
  • Social network modeling
  • Recommendation systems

Problems Solved

  • Efficient representation learning on graph data
  • Improved message passing in graph neural networks

Benefits

  • Enhanced graph data processing
  • Better node representation learning
  • Increased accuracy in graph-based tasks

Commercial Applications

This technology can be applied in industries such as finance, healthcare, and e-commerce for tasks like fraud detection, patient diagnosis, and personalized recommendations.

Questions About the Technology

  1. How does the graph convolutional layer improve message passing in graph data?
  2. What are the key advantages of using dual vector representations in node transformation?


Original Abstract Submitted

A directed graph autoencoder device includes one or more memories and a processor coupled to the one or more memories and configured to implement a graph convolutional layer. The graph convolutional layer comprises a plurality of nodes and is configured to generate transformed dual vector representations by applying a source weight matrix and a target weight matrix to input dual vector representations of the plurality of nodes. The input dual vector representations comprise, for each node of the plurality of nodes, a source vector representation that corresponds to the node in its role as a source and a target vector representation that corresponds to the node in its role as a target. The graph convolutional layer is further configured to scale the transformed dual vector representations to generate scaled transformed dual vector representations. The graph convolutional layer is further configured to perform message passing using the scaled transformed dual vector representations.
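The three operations named in the abstract (transform with source/target weight matrices, scale, message passing) can be sketched as a single layer in NumPy. This is an illustrative reading, not the patented implementation: the abstract does not specify the scaling scheme or aggregation rule, so the degree-based normalization, the tanh nonlinearity, and the function name `directed_gcn_layer` below are assumptions chosen for the sketch.

```python
import numpy as np

def directed_gcn_layer(A, S, T, Ws, Wt):
    """One dual-representation graph convolutional layer (illustrative sketch).

    A  : (n, n) adjacency matrix of the directed graph, A[i, j] = 1 for edge i -> j
    S  : (n, d) source vector representations, one row per node
    T  : (n, d) target vector representations, one row per node
    Ws : (d, k) source weight matrix
    Wt : (d, k) target weight matrix
    """
    # 1. Transform: apply the source and target weight matrices to the
    #    input dual vector representations.
    S_t = S @ Ws
    T_t = T @ Wt

    # 2. Scale: degree-based normalization (an assumed choice; the
    #    abstract only says the transformed representations are scaled).
    out_deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    in_deg = np.maximum(A.sum(axis=0).reshape(-1, 1), 1.0)
    S_s = S_t / np.sqrt(out_deg)
    T_s = T_t / np.sqrt(in_deg)

    # 3. Message passing with the scaled representations: here each
    #    node's new source vector aggregates the scaled target vectors
    #    of its out-neighbors, and vice versa along reversed edges.
    S_new = np.tanh(A @ T_s)
    T_new = np.tanh(A.T @ S_s)
    return S_new, T_new

# Usage on a 3-node directed cycle with 4-dimensional inputs.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=float)
S, T = rng.standard_normal((3, 4)), rng.standard_normal((3, 4))
Ws, Wt = rng.standard_normal((4, 2)), rng.standard_normal((4, 2))
S_new, T_new = directed_gcn_layer(A, S, T, Ws, Wt)
```

Keeping separate source and target vectors lets an edge i -> j be scored asymmetrically (e.g. via the inner product of node i's source vector with node j's target vector), which is what distinguishes a directed autoencoder from an undirected one.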