Abstract

The message-passing scheme is the core of graph representation learning. While most existing message-passing graph neural networks (MPNNs) are permutation-invariant in graph-level representation learning and permutation-equivariant in node- and edge-level representation learning, their expressive power is commonly upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test. Recently proposed expressive graph neural networks (GNNs) rely on specially designed, complex message-passing mechanisms and are therefore not practical. To bridge the gap, we propose a plug-in Equivariant Distance ENcoding (EDEN) for MPNNs. EDEN is derived from a series of interpretable transformations of the graph's distance matrix. We theoretically prove that EDEN is permutation-equivariant for graph representation learning at all levels (graph, node, and edge), and we empirically show that EDEN's expressive power can reach up to the 3-WL test. Extensive experiments on real-world datasets show that combining EDEN with conventional GNNs outperforms recent advanced GNNs.
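
The abstract does not spell out EDEN's exact transformations, so the sketch below only illustrates the general recipe it describes: derive a permutation-equivariant, per-node encoding from the graph's shortest-path distance matrix and concatenate it with the input node features before running any standard MPNN. The functions shortest_path_matrix and distance_encoding, and the histogram-of-distances feature, are illustrative assumptions, not the authors' construction.

```python
# Minimal sketch of a distance-matrix-based node encoding (illustrative only;
# the concrete EDEN transformations are defined in the paper, not here).
import numpy as np


def shortest_path_matrix(adj: np.ndarray) -> np.ndarray:
    """All-pairs shortest-path distances of an unweighted graph (Floyd-Warshall)."""
    n = adj.shape[0]
    dist = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):
        dist = np.minimum(dist, dist[:, [k]] + dist[[k], :])
    return dist


def distance_encoding(adj: np.ndarray, num_bins: int = 4) -> np.ndarray:
    """Per-node encoding: histogram of shortest-path distances to all other nodes.

    The encoding is computed row-wise from the distance matrix, so permuting the
    nodes permutes the rows of the output in the same way (permutation equivariance).
    """
    dist = shortest_path_matrix(adj)
    capped = np.where(np.isfinite(dist), dist, num_bins)  # cap unreachable pairs
    bins = np.clip(capped.astype(int), 0, num_bins)
    # One row per node, one column per distance value 0..num_bins.
    enc = np.stack([(bins == b).sum(axis=1) for b in range(num_bins + 1)], axis=1)
    return enc.astype(np.float32)


if __name__ == "__main__":
    # 4-cycle 0-1-2-3-0 with one-hot toy node features.
    adj = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)
    x = np.eye(4, dtype=np.float32)
    x_aug = np.concatenate([x, distance_encoding(adj)], axis=1)
    print(x_aug.shape)  # (4, 9): augmented features ready for any standard MPNN
```

In this plug-in style, the augmented features x_aug are fed to an unmodified MPNN, which is what lets a distance encoding raise expressive power without changing the message-passing mechanism itself.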
