
A Data-centric graph neural network for node classification of heterophilic networks

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

In the real world, numerous heterophilic networks effectively model the tendency of similar entities to repel each other and dissimilar entities to attract each other within complex systems. For the node classification problem in heterophilic networks, a plethora of heterophilic Graph Neural Networks (GNNs) have emerged. However, these GNNs demand extensive hyperparameter tuning, activation function selection, parameter initialization, and other configuration settings, particularly when dealing with diverse heterophilic networks under resource constraints. This situation raises a fundamental question: Can a method be designed to directly preprocess heterophilic networks and then leverage the trained models in network representation learning systems? In this paper, we propose a novel approach to transform heterophilic network structures. Specifically, we train an edge classifier and subsequently employ this edge classifier to transform a heterophilic network into its corresponding homophilic counterpart. Finally, we conduct experiments on heterophilic network datasets of varying sizes, demonstrating the effectiveness of our approach. The code and datasets are publicly available at https://github.com/xueyanfeng/D_c_GNNs.
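The abstract describes a two-step, data-centric pipeline: first fit an edge classifier, then use it to rewrite the heterophilic graph into a (more) homophilic one before handing it to an ordinary GNN. The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation: it assumes the edge classifier is an MLP over concatenated endpoint features, that it is trained on edges whose two endpoints both carry training labels (an edge counts as "homophilic" when the labels match), and that the transformation simply drops edges predicted to be heterophilic. The paper's actual architecture, training signal, and rewiring rule may differ; the released code at https://github.com/xueyanfeng/D_c_GNNs is authoritative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeClassifier(nn.Module):
    """Scores an edge from the concatenated features of its two endpoints."""

    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x, edge_index):
        src, dst = edge_index                      # edge_index: [2, num_edges]
        pair = torch.cat([x[src], x[dst]], dim=-1)
        return self.mlp(pair).squeeze(-1)          # logit > 0 -> "same class"


def train_edge_classifier(x, edge_index, y, train_mask, epochs=200, lr=0.01):
    """Fit on edges whose endpoints are both labelled training nodes."""
    src, dst = edge_index
    seen = train_mask[src] & train_mask[dst]
    train_edges = edge_index[:, seen]
    target = (y[train_edges[0]] == y[train_edges[1]]).float()

    clf = EdgeClassifier(x.size(1))
    opt = torch.optim.Adam(clf.parameters(), lr=lr, weight_decay=5e-4)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.binary_cross_entropy_with_logits(clf(x, train_edges), target)
        loss.backward()
        opt.step()
    return clf


def homophilize(clf, x, edge_index, threshold=0.5):
    """Keep only edges the classifier predicts to be intra-class."""
    with torch.no_grad():
        keep = torch.sigmoid(clf(x, edge_index)) >= threshold
    return edge_index[:, keep]


# Usage with hypothetical tensors: x [N, F] float, y [N] long,
# edge_index [2, E] long, train_mask [N] bool.
# clf = train_edge_classifier(x, edge_index, y, train_mask)
# homophilic_edge_index = homophilize(clf, x, edge_index)
# The rewired graph can then be fed to any off-the-shelf GNN, e.g. a GCN.
```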


(The full article contains Figs. 1–4 and Algorithms 1 and 2.)


Data availability

Data supporting this study are openly available at https://github.com/xueyanfeng/D_c_GNNs.

Notes

  1. https://docs.dgl.ai/tutorials/blitz/index.html.

  2. https://pytorch-geometric.readthedocs.io/en/latest/.

  3. https://docs.cogdl.ai/en/latest/.

  4. https://graph-learn.readthedocs.io/en/latest/index_en.html.

  5. https://github.com/tencent/plato.

  6. https://pgl.readthedocs.io/en/latest/.

  7. https://github.com/facebookresearch/PyTorch-BigGraph.

  8. Scikit-learn is an open-source machine learning library that supports both supervised and unsupervised learning techniques.

  9. https://github.com/ChandlerBang/SimP-GCN.

  10. https://github.com/bdy9527/FAGCN.


Acknowledgements

This research was supported in part by the National Natural Science Foundation of China under Grant 12231012, the Research of Technological Important Programs in the city of Lüliang, China (No. 2022GXYF18), the Innovation Project of Higher Education Teaching Reform in Shanxi (Nos. J20221164 and 2022YJJG310), and the Natural Science Foundation in Shanxi (No. 202203021221229).

Author information

Corresponding author

Correspondence to Zhen Jin.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Xue, Y., Jin, Z. & Gao, W. A Data-centric graph neural network for node classification of heterophilic networks. Int. J. Mach. Learn. & Cyber. 15, 3413–3423 (2024). https://doi.org/10.1007/s13042-024-02100-y

