Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs (2305.18467v2)

Published 29 May 2023 in cs.LG and eess.SP

Abstract: This paper studies the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold, thus encoding geometric information. We consider convolutional MNNs and GNNs where the manifold and the graph convolutions are respectively defined in terms of the Laplace-Beltrami operator and the graph Laplacian. Using the appropriate kernels, we analyze both dense and moderately sparse graphs. We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold. As a byproduct of this analysis, we observe an important trade-off between the discriminability of graph filters and their ability to approximate the desired behavior of manifold filters. We then discuss how this trade-off is ameliorated in neural networks due to the frequency mixing property of nonlinearities. We further derive a transferability corollary for geometric graphs sampled from the same manifold. We validate our results numerically on a navigation control problem and a point cloud classification task.
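The abstract's central object, a convolutional graph filter defined through the graph Laplacian of a geometric graph sampled from a manifold, can be sketched in a few lines. The construction below is a hedged illustration, not the paper's exact setup: the sphere sampling, Gaussian kernel bandwidth `eps`, and filter taps `h` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Sample n points approximately uniformly from the 2-sphere in R^3.
pts = rng.normal(size=(n, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)

# Dense geometric graph: Gaussian-kernel weights on pairwise squared distances.
eps = 0.5
d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
W = np.exp(-d2 / eps)
np.fill_diagonal(W, 0.0)

# Graph Laplacian, the discrete analogue of the Laplace-Beltrami operator.
L = np.diag(W.sum(axis=1)) - W

def graph_filter(L, x, h):
    """Polynomial (convolutional) graph filter: y = sum_k h[k] * L^k @ x."""
    y = np.zeros_like(x)
    Lx = x.copy()
    for hk in h:
        y += hk * Lx
        Lx = L @ Lx
    return y

# Filter a smooth manifold signal (here, the first embedding coordinate).
x = pts[:, 0]
y = graph_filter(L, x, h=[1.0, -0.01, 0.0001])
```

In the paper's regime, as `n` grows and the kernel bandwidth shrinks appropriately, the (suitably rescaled) graph Laplacian converges to the Laplace-Beltrami operator, so filters of this polynomial form converge to the corresponding manifold convolutional filters.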

Authors (3)
  1. Zhiyang Wang (32 papers)
  2. Luana Ruiz (34 papers)
  3. Alejandro Ribeiro (281 papers)
Citations (5)
