On dimensionality of feature vectors in MPNNs

(2402.03966)
Published Feb 6, 2024 in cs.LG

Abstract

We revisit the classical result of Morris et al. (AAAI'19) that message-passing graph neural networks (MPNNs) are equal in their distinguishing power to the Weisfeiler--Leman (WL) isomorphism test. Morris et al. prove their simulation result with the ReLU activation function and $O(n)$-dimensional feature vectors, where $n$ is the number of nodes of the graph. By introducing randomness into the architecture, Aamand et al. (NeurIPS'22) improved this bound to $O(\log n)$-dimensional feature vectors, again for the ReLU activation, although at the expense of guaranteeing perfect simulation only with high probability. Recently, Amir et al. (NeurIPS'23) showed that for any non-polynomial analytic activation function, 1-dimensional feature vectors already suffice. In this paper, we give a simple proof of the result of Amir et al. and provide an independent experimental validation of it.
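To make the equivalence in the abstract concrete, here is a minimal Python sketch of the 1-WL color refinement procedure that MPNNs are measured against. The function name `wl_refinement` and the adjacency-list encoding of graphs are illustrative choices, not taken from the paper.

```python
def wl_refinement(adj, num_rounds=None):
    """Run 1-WL color refinement on a graph given as {node: [neighbors]}.

    Returns an integer color per node. If two graphs end up with different
    color multisets, they are non-isomorphic; by Morris et al. (AAAI'19),
    MPNNs can distinguish exactly the same pairs of graphs.
    """
    colors = {v: 0 for v in adj}                 # uniform initial coloring
    rounds = num_rounds if num_rounds is not None else len(adj)
    for _ in range(rounds):
        # New signature = (own color, sorted multiset of neighbor colors).
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures to small integer color ids.
        palette = {}
        new_colors = {}
        for v, sig in signatures.items():
            if sig not in palette:
                palette[sig] = len(palette)
            new_colors[v] = palette[sig]
        # Refinement only splits color classes; if the count is unchanged,
        # the partition is stable and we can stop early.
        stable = len(palette) == len(set(colors.values()))
        colors = new_colors
        if stable:
            break
    return colors
```

A standard pair of graphs illustrating the limits of this test: a 6-cycle versus two disjoint triangles. Both are 2-regular, so refinement stabilizes immediately with identical color multisets, and by the equivalence above no MPNN can tell them apart either, regardless of feature dimension.

```python
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(sorted(wl_refinement(cycle6).values()))         # [0, 0, 0, 0, 0, 0]
print(sorted(wl_refinement(two_triangles).values()))  # [0, 0, 0, 0, 0, 0]
```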
