Gransformer: Transformer-based Graph Generation

arXiv:2203.13655
Published Mar 25, 2022 in cs.LG

Abstract

Transformers have become widely used in modern models for tasks such as natural language processing and machine vision. This paper proposes Gransformer, an algorithm based on the Transformer for generating graphs. We extend a simple autoregressive Transformer encoder to exploit the structural information of the given graph through efficient modifications. The attention mechanism is modified to account for the presence or absence of an edge between each pair of nodes. We also introduce a graph-based familiarity measure between node pairs that applies to both the attention mechanism and the positional encoding. This familiarity measure is based on message-passing algorithms and encodes structural information about the graph. Furthermore, the proposed measure is autoregressive, which allows our model to compute the necessary conditional probabilities in a single forward pass. In the output layer, we use a masked autoencoder for density estimation to efficiently model the sequential generation of dependent edges. Moreover, since we use BFS node orderings, we propose a technique to prevent the model from generating isolated nodes with no connection to preceding nodes. We evaluate this method on two real-world datasets and compare it with other state-of-the-art autoregressive graph generation methods. Experimental results show that the proposed method performs comparably to these methods, including recurrent models and graph convolutional networks.
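To make the edge-conditioned attention idea concrete, here is a minimal NumPy sketch of a single causal self-attention head whose scores receive an additive bias depending on whether an edge exists between the two nodes. This is an illustrative simplification, not the paper's exact formulation: the function name `edge_biased_attention` and the scalar bias parameters `b_edge` / `b_noedge` are assumptions for the sketch, and the paper's familiarity measure and positional encoding are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def edge_biased_attention(X, A, Wq, Wk, Wv, b_edge=1.0, b_noedge=-1.0):
    """Causal self-attention over node embeddings X (n x d), with an
    additive score bias determined by edge presence in adjacency A.

    Simplified sketch of the idea in the abstract; the scalar biases
    b_edge / b_noedge are hypothetical parameters, not from the paper.
    """
    n, d = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    # bias each pairwise score by whether an edge is present or absent
    scores = scores + np.where(A > 0, b_edge, b_noedge)
    # autoregressive (causal) mask: node i attends only to nodes j <= i
    causal_mask = np.triu(np.ones((n, n), dtype=bool), k=1)
    scores = np.where(causal_mask, -np.inf, scores)
    return softmax(scores, axis=-1) @ V
```

Because of the causal mask, the first node attends only to itself, so its output row is exactly its value projection; later rows mix value vectors of preceding nodes, with edge-connected pairs weighted up and unconnected pairs weighted down.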
