Abstract
Many real-world heterophilic networks model complex systems in which similar entities tend to repel each other while dissimilar entities tend to attract. For the node classification problem on such networks, a plethora of heterophilic Graph Neural Networks (GNNs) has emerged. However, these GNNs demand extensive hyperparameter tuning, activation-function selection, parameter initialization, and other configuration choices, particularly when dealing with diverse heterophilic networks under resource constraints. This raises a fundamental question: can heterophilic networks be preprocessed directly, so that already-trained models can be reused in network representation learning systems? In this paper, we propose a novel approach for transforming heterophilic network structures. Specifically, we train an edge classifier and then use it to transform a heterophilic network into a homophilic counterpart. Finally, we conduct experiments on heterophilic network datasets of varying sizes, demonstrating the effectiveness of our approach. The code and datasets are publicly available at https://github.com/xueyanfeng/D_c_GNNs.
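The edge-classifier transformation described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it assumes a synthetic heterophilic graph, uses a scikit-learn logistic regression on absolute node-feature differences as the edge classifier, and rewires by keeping only edges predicted to be homophilic (same-label endpoints). The actual classifier, features, and rewiring rule in the paper may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic heterophilic graph: two node classes with class-informative
# features; edges are drawn at random, so many connect dissimilar nodes.
n = 200
labels = rng.integers(0, 2, size=n)
feats = rng.normal(loc=labels[:, None], scale=0.5, size=(n, 4))

edges = []
for _ in range(600):
    u, v = rng.integers(0, n, size=2)
    if u != v:
        edges.append((u, v))
edges = np.array(edges)

# Edge label: 1 if the endpoints share a class (homophilic edge), else 0.
y_edge = (labels[edges[:, 0]] == labels[edges[:, 1]]).astype(int)

# Edge features: absolute difference of endpoint features, so that
# "small difference -> likely same class" is linearly learnable.
X_edge = np.abs(feats[edges[:, 0]] - feats[edges[:, 1]])

# Train the edge classifier on a "known" subset of edges.
train = rng.random(len(edges)) < 0.5
clf = LogisticRegression(max_iter=1000).fit(X_edge[train], y_edge[train])

# Rewire: keep only edges the classifier predicts as homophilic.
keep = clf.predict(X_edge).astype(bool)
homophilic_edges = edges[keep]

before = y_edge.mean()
after = y_edge[keep].mean()
print(f"edge homophily before: {before:.2f}, after rewiring: {after:.2f}")
```

The rewired edge set `homophilic_edges` would then be handed to a standard (homophily-assuming) GNN in place of the original graph, which is the reuse scenario the abstract motivates.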
Data availability
Data supporting this study are openly available at https://github.com/xueyanfeng/D_c_GNNs.
Notes
Scikit-learn is an open-source machine learning library that supports both supervised and unsupervised learning techniques.
Acknowledgements
This research is supported in part by the National Natural Science Foundation of China under Grant 12231012, the Research of Technological Important Programs in the city of Lüliang, China (No. 2022GXYF18), the Innovation Project of Higher Education Teaching Reform in Shanxi (No. J20221164, 2022YJJG310), and the Natural Science Foundation in Shanxi (No. 202203021221229).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Xue, Y., Jin, Z. & Gao, W. A Data-centric graph neural network for node classification of heterophilic networks. Int. J. Mach. Learn. & Cyber. 15, 3413–3423 (2024). https://doi.org/10.1007/s13042-024-02100-y