Quadratic Graph Attention Network (Q-GAT) for Robust Construction of Gene Regulatory Networks (2303.14193v2)

Published 24 Mar 2023 in q-bio.MN and cs.CE

Abstract: Gene regulatory relationships can be abstracted as a gene regulatory network (GRN), which plays a key role in characterizing complex cellular processes and pathways. Recently, graph neural networks (GNNs), a class of deep learning models, have emerged as a useful tool for inferring gene regulatory relationships from gene expression data. However, deep learning models are known to be vulnerable to noise, which greatly hinders their adoption for constructing GRNs, because high noise is often unavoidable in gene expression measurement. Can we design a GNN that constructs GRNs robustly? In this paper, we give a positive answer by proposing a Quadratic Graph Attention Network (Q-GAT) with a dual attention mechanism. We study how the predictive accuracy of Q-GAT and 9 state-of-the-art baselines changes as different levels of adversarial perturbation are introduced. Experiments on the E. coli and S. cerevisiae datasets suggest that Q-GAT outperforms the state-of-the-art models in robustness. Lastly, we dissect why Q-GAT is robust through signal-to-noise ratio (SNR) and interpretability analyses. The former shows that the nonlinear aggregation of quadratic neurons can amplify useful signals and suppress unwanted noise, thereby facilitating robustness, while the latter reveals that Q-GAT can leverage more features in prediction thanks to the dual attention mechanism, which endows it with the ability to withstand adversarial perturbation. Our code is available at https://github.com/Minorway/Q-GAT_for_Robust_Construction_of_GRN for readers' evaluation.
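
The abstract credits robustness to two ingredients: quadratic neurons in place of conventional linear neurons, and a dual attention mechanism in the graph layer. The sketch below illustrates how such a layer might be composed in PyTorch; the specific quadratic form, the two-head attention combination, and all names and shapes are assumptions for illustration, not the authors' implementation (see the linked repository for the actual Q-GAT code).

```python
# Minimal, hypothetical sketch of a quadratic-neuron graph attention layer.
# The quadratic form, the "dual attention" combination, and all names/shapes
# are assumptions for illustration; see the authors' repository for Q-GAT itself.
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuadraticLinear(nn.Module):
    """Quadratic neuron: (W_a x + b_a) * (W_b x + b_b) + W_c (x * x) + b_c."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin_a = nn.Linear(in_dim, out_dim)
        self.lin_b = nn.Linear(in_dim, out_dim)
        self.lin_c = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.lin_a(x) * self.lin_b(x) + self.lin_c(x * x)


class DualAttentionQuadraticLayer(nn.Module):
    """One message-passing layer: quadratic projection + two attention scorers."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = QuadraticLinear(in_dim, out_dim)
        self.att1 = nn.Linear(2 * out_dim, 1)  # first attention head
        self.att2 = nn.Linear(2 * out_dim, 1)  # second attention head ("dual")

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [N, in_dim] gene expression features; edge_index: [2, E] candidate edges.
        h = self.proj(x)
        src, dst = edge_index
        pair = torch.cat([h[src], h[dst]], dim=-1)            # [E, 2*out_dim]
        # Sum the two attention scores, then softmax-normalise per target gene.
        score = F.leaky_relu(self.att1(pair)) + F.leaky_relu(self.att2(pair))
        alpha = torch.exp(score - score.max())                 # [E, 1]
        denom = torch.zeros(h.size(0), 1).index_add_(0, dst, alpha) + 1e-9
        out = torch.zeros_like(h).index_add_(0, dst, alpha * h[src])
        return out / denom                                     # [N, out_dim]


if __name__ == "__main__":
    # Toy usage: 5 genes with 8-dimensional expression features, 6 candidate edges.
    x = torch.randn(5, 8)
    edge_index = torch.tensor([[0, 1, 2, 3, 4, 0],
                               [1, 2, 3, 4, 0, 2]])
    layer = DualAttentionQuadraticLayer(8, 16)
    print(layer(x, edge_index).shape)  # torch.Size([5, 16])
```

In a GRN-construction setting, a regulatory edge between two genes would then be scored from the learned node embeddings or attention weights by a downstream edge classifier; the exact readout used by Q-GAT may differ from this sketch.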

Citations (1)
