Context-Aware Graph Attention Networks (1910.01736v1)

Published 4 Oct 2019 in cs.LG, cs.SI, eess.IV, eess.SP, and stat.ML

Abstract: Graph Neural Networks (GNNs) have been widely studied for graph data representation and learning. However, existing GNNs generally conduct context-aware learning on node feature representations only, which usually ignores the learning of edge (weight) representations. In this paper, we propose a novel unified GNN model, named Context-aware Adaptive Graph Attention Network (CaGAT). CaGAT aims to learn a context-aware attention representation for each graph edge by further exploiting the context relationships among different edges. In particular, CaGAT conducts context-aware learning on both node feature representations and edge (weight) representations simultaneously and cooperatively in a unified manner, which can boost their respective performance in network training. We apply CaGAT to semi-supervised learning tasks. Promising experimental results on several benchmark datasets demonstrate the effectiveness and benefits of CaGAT.
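
The abstract describes the mechanism only at a high level. Below is a minimal, hypothetical PyTorch sketch of a single layer that updates edge (weight) representations and node features together, in the spirit of the joint node/edge learning described above. The class name JointNodeEdgeAttention, the learnable mixing parameter, and the dense-adjacency formulation are illustrative assumptions, not the authors' CaGAT implementation.

```python
# Hypothetical sketch (not the authors' code): one layer that refreshes edge
# attention weights and node features in a single forward pass.
# Shapes: x is (N, F) node features, adj is a dense (N, N) weighted adjacency.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointNodeEdgeAttention(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)  # node feature projection
        self.att_src = nn.Linear(out_dim, 1, bias=False)    # attention score from the source node
        self.att_dst = nn.Linear(out_dim, 1, bias=False)    # attention score from the target node
        self.mix = nn.Parameter(torch.tensor(0.5))          # blends learned attention with prior edge weights

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        h = self.proj(x)                                     # (N, out_dim)
        # GAT-style pairwise logits e_ij = a_src(h_i) + a_dst(h_j), masked to existing edges.
        logits = self.att_src(h) + self.att_dst(h).T         # (N, N) via broadcasting
        logits = F.leaky_relu(logits).masked_fill(adj == 0, -1e9)
        att = torch.softmax(logits, dim=-1)                  # attention over each node's neighbors
        # Edge update: blend learned attention with the current edge weights and
        # renormalize each row so it stays a distribution over neighbors.
        new_adj = self.mix * att + (1.0 - self.mix) * adj
        new_adj = new_adj / new_adj.sum(dim=-1, keepdim=True).clamp(min=1e-9)
        # Node update: aggregate neighbor features with the refreshed edge weights.
        new_x = F.elu(new_adj @ h)
        return new_x, new_adj


# Tiny usage example on a random 5-node graph with self-loops.
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj.fill_diagonal_(1.0)
layer = JointNodeEdgeAttention(8, 16)
new_x, new_adj = layer(x, adj)
```

Per the abstract, CaGAT's edge update additionally exploits context relationships among the edges themselves; the sketch above only illustrates the general pattern of updating node and edge representations cooperatively within one layer.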

Citations (2)
