Nonlinear Correct and Smooth for Semi-Supervised Learning (2310.05757v1)
Abstract: Graph-based semi-supervised learning (GSSL) has been applied successfully in various domains. Existing methods leverage the graph structure and labeled samples for classification. Label Propagation (LP) and Graph Neural Networks (GNNs) both iteratively pass messages on graphs: LP propagates node labels along edges, while GNNs aggregate node features from neighborhoods. Recently, combining LP and GNNs has led to improved performance. However, jointly exploiting labels and features in higher-order graphs has not been explored. We therefore propose Nonlinear Correct and Smooth (NLCS), which improves the existing post-processing approach by incorporating nonlinearity and higher-order representations into the residual propagation, enabling it to handle intricate node relationships effectively. Systematic evaluations show that our method achieves average improvements of 13.71% over the base prediction and 2.16% over the state-of-the-art post-processing method on six commonly used datasets. Comparisons and analyses show that our method effectively utilizes labels and features jointly in higher-order graphs to resolve challenging graph relationships.
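To make the "correct and smooth" post-processing idea concrete, here is a minimal NumPy sketch of the two-stage residual-propagation scheme the abstract builds on: a "correct" step that spreads the residual error of a base classifier over the graph, and a "smooth" step that propagates the corrected predictions. The tanh nonlinearity in the correct step stands in for the paper's nonlinear mapping and is purely illustrative; the toy graph, base predictions `Z`, and mixing coefficient `alpha` are all assumed values, not taken from the paper.

```python
import numpy as np

# Toy undirected graph on 4 nodes (symmetric adjacency matrix).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetrically normalized propagation matrix S = D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
S = A / np.sqrt(np.outer(d, d))

# Hypothetical base predictions (e.g., from an MLP) over 2 classes.
Z = np.array([[0.6, 0.4],
              [0.55, 0.45],
              [0.5, 0.5],
              [0.3, 0.7]])

# One labeled node (node 0, class 0); the rest are unlabeled.
Y = np.zeros_like(Z)
Y[0] = [1.0, 0.0]
train_mask = np.array([True, False, False, False])

alpha = 0.8  # mixing coefficient (assumed)

# "Correct" step: propagate the residual error on labeled nodes,
# applying a tanh nonlinearity each iteration (illustrative stand-in
# for the paper's nonlinear residual propagation).
E = np.zeros_like(Z)
E[train_mask] = Y[train_mask] - Z[train_mask]
for _ in range(10):
    E = np.tanh(alpha * (S @ E) + (1 - alpha) * E)
Z_corrected = Z + E

# "Smooth" step: standard label spreading from the corrected
# predictions, clamping the labeled nodes to their true labels.
F = Z_corrected.copy()
F[train_mask] = Y[train_mask]
for _ in range(10):
    F = alpha * (S @ F) + (1 - alpha) * Z_corrected
    F[train_mask] = Y[train_mask]

pred = F.argmax(axis=1)
```

The two loops mirror the structure of Correct and Smooth: errors are assumed to be positively correlated along edges, so diffusing the labeled residual adjusts nearby unlabeled predictions before the final smoothing pass.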