
LLM and GNN are Complementary: Distilling LLM for Multimodal Graph Learning (2406.01032v1)

Published 3 Jun 2024 in cs.LG and cs.AI

Abstract: Recent progress in Graph Neural Networks (GNNs) has greatly enhanced the ability to model complex molecular structures for predicting properties. Nevertheless, molecular data encompasses more than just graph structures, including textual and visual information that GNNs do not handle well. To bridge this gap, we present an innovative framework that utilizes multimodal molecular data to extract insights from LLMs. We introduce GALLON (Graph Learning from LLM Distillation), a framework that synergizes the capabilities of LLMs and GNNs by distilling multimodal knowledge into a unified Multilayer Perceptron (MLP). This method integrates the rich textual and visual data of molecules with the structural analysis power of GNNs. Extensive experiments reveal that our distilled MLP model notably improves the accuracy and efficiency of molecular property predictions.
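The abstract describes distilling the combined knowledge of LLMs and GNNs into a single MLP student. The paper's actual architecture and losses are not given here, so the following is only a minimal, generic knowledge-distillation sketch: a toy teacher (a stand-in for the LLM/GNN ensemble) produces temperature-softened predictions, and a small one-hidden-layer MLP is trained to match them. All shapes, names, and hyperparameters are illustrative assumptions, not GALLON's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy data: 128 "molecules" with 16-dim features, 2 property classes.
X = rng.normal(size=(128, 16))
teacher_logits = X @ rng.normal(size=(16, 2))   # stand-in for LLM+GNN teacher outputs
T = 2.0                                         # distillation temperature
soft_targets = softmax(teacher_logits / T)      # softened teacher predictions

# Student: a one-hidden-layer MLP trained to match the teacher's soft targets.
W1 = rng.normal(scale=0.1, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 2));  b2 = np.zeros(2)

lr = 0.5
for _ in range(200):
    h = np.maximum(X @ W1 + b1, 0.0)            # ReLU hidden layer
    p = softmax((h @ W2 + b2) / T)
    # Gradient of the cross-entropy to soft targets w.r.t. student logits,
    # averaged over the batch (the constant 1/T factor is folded into lr).
    g = (p - soft_targets) / len(X)
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = (g @ W2.T) * (h > 0)
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# After training, the cheap MLP alone approximates the teacher's decisions.
h = np.maximum(X @ W1 + b1, 0.0)
student = softmax((h @ W2 + b2) / T)
agreement = (student.argmax(1) == soft_targets.argmax(1)).mean()
print(f"student/teacher agreement: {agreement:.2f}")
```

At inference time only the distilled MLP is run, which is the efficiency gain the abstract claims: the expensive LLM and GNN teachers are needed only during training.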

Authors (5)
  1. Junjie Xu (23 papers)
  2. Zongyu Wu (15 papers)
  3. Minhua Lin (15 papers)
  4. Xiang Zhang (395 papers)
  5. Suhang Wang (118 papers)
Citations (7)
