
Large Language Models on Graphs: A Comprehensive Survey

(2312.02783)
Published Dec 5, 2023 in cs.CL and cs.LG

Abstract

LLMs, such as GPT-4 and LLaMA, are driving significant advances in natural language processing, thanks to their strong text encoding/decoding abilities and newly found emergent capabilities (e.g., reasoning). While LLMs are mainly designed to process pure text, there are many real-world scenarios where text data is associated with rich structural information in the form of graphs (e.g., academic networks and e-commerce networks), or where graph data is paired with rich textual information (e.g., molecules with descriptions). Moreover, although LLMs have shown pure text-based reasoning ability, it remains underexplored whether such ability generalizes to graphs (i.e., graph-based reasoning). In this paper, we provide a systematic review of scenarios and techniques related to LLMs on graphs. We first summarize potential scenarios for adopting LLMs on graphs into three categories, namely pure graphs, text-attributed graphs, and text-paired graphs. We then discuss detailed techniques for utilizing LLMs on graphs, including LLM as Predictor, LLM as Encoder, and LLM as Aligner, and compare the advantages and disadvantages of these schools of models. Furthermore, we discuss real-world applications of such methods and summarize open-source code and benchmark datasets. Finally, we conclude with potential future research directions in this fast-growing field. Related resources can be found at https://github.com/PeterGriffinJin/Awesome-Language-Model-on-Graphs.

Categorizes LLM-on-graph scenarios into pure graphs, text-attributed graphs, and text-paired graphs, and techniques by the LLM's role: Predictor, Encoder, or Aligner.

Overview

  • The paper reviews LLMs applied to graph data structures, examining their strengths and limitations.

  • It categorizes the application scenarios for LLMs on graphs into pure graphs, text-attributed graphs, and text-paired graphs.

  • Techniques involving LLMs on graphs are divided into three roles: predictor, encoder, and aligner.

  • LLMs on graphs hold promise for scientific discovery, computational social science, and other domain applications.

  • The paper identifies future research opportunities including better benchmarks, task exploration, and design of robust, generalizable models.

LLMs on Graphs

Introduction

LLMs have excelled in natural language processing by effectively encoding and decoding text. However, real-world data often comes in structured forms such as graphs, including academic and e-commerce networks, or as graphs paired with text, such as molecules with descriptions. While LLMs have showcased reasoning capabilities in text-based scenarios, their application to graph settings remains underexplored. By reviewing LLM techniques on graphs, this paper presents a systematic overview, categorizing potential scenarios and techniques and suggesting future research directions.

Categories of Graph Scenarios

The application scenarios for LLMs on graphs are identified as:

  • Pure Graphs involve data with little or no textual information, testing LLMs' reasoning abilities on graph-theory tasks (a minimal prompt-construction sketch follows this list).
  • Text-Attributed Graphs present scenarios where nodes or edges carry rich text, demanding models that understand both textual information and graph structure.
  • Text-Paired Graphs feature graphs with overarching text descriptions, as in molecules accompanied by textual annotations, requiring models to jointly leverage molecular structures and the associated text.
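
To make the pure-graph setting concrete, the following is a minimal sketch of how a graph can be verbalized into a prompt for an LLM. The edge-list template, question wording, and `graph_to_prompt` helper are illustrative assumptions, not prompts taken from the survey.

```python
# Minimal sketch: verbalize a pure graph as text so an LLM can attempt
# a graph-theory question (here, connectivity). The edge-list format
# and question template are illustrative assumptions.

def graph_to_prompt(edges, source, target):
    """Flatten an undirected graph into a natural-language question."""
    edge_text = ", ".join(f"({u}, {v})" for u, v in edges)
    return (
        f"You are given an undirected graph with edges: {edge_text}. "
        f"Is there a path from node {source} to node {target}? "
        "Answer yes or no, then explain your reasoning step by step."
    )

edges = [(0, 1), (1, 2), (3, 4)]
print(graph_to_prompt(edges, source=0, target=2))
# The resulting prompt would then be sent to an LLM of choice.
```

In practice, the choice of verbalization (edge lists, adjacency lists, or natural-language edge descriptions) can noticeably affect an LLM's reasoning performance on such tasks.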

LLM Techniques on Graphs

Based on the role of LLMs, techniques on graphs are categorized as follows:

  • LLM as Predictor covers methods that transform graphs into text sequences or employ LLM architectures that encode text and graph information jointly; some techniques also fine-tune LLMs with graph-structure supervision.
  • LLM as Encoder uses LLMs for initial text encoding before graph neural networks (GNNs) encode the structure. Challenges include convergence and data-sparsity issues, addressed through optimization strategies, data augmentation, and knowledge distillation (a minimal pipeline sketch follows this list).
  • LLM as Aligner concerns models that align LLMs with GNNs for mutual enhancement, employing methods such as prediction alignment and latent-space alignment.
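
To illustrate the encoder and aligner roles (the predictor role resembles the graph-to-prompt flattening sketched earlier), below is a minimal, self-contained sketch. `llm_encode` is a hypothetical stand-in for a real LLM text encoder, and the hand-rolled GCN layer and InfoNCE-style loss are generic choices under stated assumptions, not a specific model from the survey.

```python
# "LLM as Encoder": an LLM embeds each node's text, then a GNN
# propagates those features over the graph structure.
# "LLM as Aligner": LLM and GNN embeddings of the same nodes are
# pulled together with a contrastive (InfoNCE-style) loss.

import torch
import torch.nn.functional as F

def llm_encode(texts, dim=16):
    """Hypothetical stand-in for a frozen LLM text encoder."""
    torch.manual_seed(0)                      # deterministic toy features
    return torch.randn(len(texts), dim)

def gcn_layer(x, adj, weight):
    """One symmetric-normalized GCN step: relu(D^-1/2 (A+I) D^-1/2 X W)."""
    adj_hat = adj + torch.eye(adj.size(0))    # add self-loops
    d_inv_sqrt = adj_hat.sum(dim=1).pow(-0.5)
    norm = d_inv_sqrt.unsqueeze(1) * adj_hat * d_inv_sqrt.unsqueeze(0)
    return torch.relu(norm @ x @ weight)

node_texts = ["paper on GNNs", "paper on LLMs", "survey paper"]
adj = torch.tensor([[0., 1., 1.],
                    [1., 0., 0.],
                    [1., 0., 0.]])            # toy citation edges

# Encoder role: text features first, structure encoding second.
text_emb = llm_encode(node_texts)                         # (3, 16)
node_emb = gcn_layer(text_emb, adj, torch.randn(16, 16))  # (3, 16)

# Aligner role: match each node's GNN embedding to its own text
# embedding; positives sit on the diagonal of the similarity matrix.
logits = F.normalize(node_emb, dim=1) @ F.normalize(text_emb, dim=1).T
labels = torch.arange(len(node_texts))
align_loss = F.cross_entropy(logits / 0.07, labels)
print(align_loss.item())
```

A real pipeline would replace `llm_encode` with hidden states from a frozen or fine-tuned LLM and backpropagate the alignment loss into one or both encoders.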

Applications and Benchmarks

LLMs on graphs are applicable in domains such as scientific discovery and computational social science, supporting tasks like virtual screening, optimizing scientific hypotheses, and synthesis planning. Benchmark datasets cover pure graphs, text-attributed graphs, and text-paired graphs. Evaluation metrics vary across tasks, often focusing on accuracy, precision, or domain-specific criteria.

Future Research Directions

Areas for future investigation encompass the creation of better benchmark datasets, exploration of broader task spaces with LLMs, design of multi-modal foundation models, efficient optimization of LLMs on graphs, and development of generalizable and robust LLMs for graph environments.

Conclusion

Despite strides in applying LLMs to graphs, significant challenges and questions remain. Addressing these can unlock the potential for LLMs to enhance our understanding across diverse graph scenarios, contributing to more advanced problem-solving and decision-making processes.
