CLC number: TP391

On-line Access: 2020-12-10

Received: 2019-11-30

Revision Accepted: 2020-04-21

Crosschecked: 2020-09-24

 ORCID:

Hao Wang

https://orcid.org/0000-0002-9613-6169

Tie-hu Fan

https://orcid.org/0000-0003-1496-9464

Frontiers of Information Technology & Electronic Engineering  2020 Vol.21 No.12 P.1795-1803

http://doi.org/10.1631/FITEE.1900663


A local density optimization method based on a graph convolutional network


Author(s):  Hao Wang, Li-yan Dong, Tie-hu Fan, Ming-hui Sun

Affiliation(s):  College of Computer Science and Technology, Jilin University, Changchun 130012, China; Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China; College of Instrumentation and Electrical Engineering, Jilin University, Changchun 130012, China

Corresponding email(s):   wanghao18@mails.jlu.edu.cn, dongly@jlu.edu.cn, fth@jlu.edu.cn, smh@jlu.edu.cn

Key Words:  Semi-supervised learning, Graph convolutional network, Graph embedding, Local density


Hao Wang, Li-yan Dong, Tie-hu Fan, Ming-hui Sun. A local density optimization method based on a graph convolutional network[J]. Frontiers of Information Technology & Electronic Engineering, 2020, 21(12): 1795-1803.

@article{title="A local density optimization method based on a graph convolutional network",
author="Hao Wang, Li-yan Dong, Tie-hu Fan, Ming-hui Sun",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="21",
number="12",
pages="1795-1803",
year="2020",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.1900663"
}

%0 Journal Article
%T A local density optimization method based on a graph convolutional network
%A Hao Wang
%A Li-yan Dong
%A Tie-hu Fan
%A Ming-hui Sun
%J Frontiers of Information Technology & Electronic Engineering
%V 21
%N 12
%P 1795-1803
%@ 2095-9184
%D 2020
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.1900663

TY - JOUR
T1 - A local density optimization method based on a graph convolutional network
A1 - Hao Wang
A1 - Li-yan Dong
A1 - Tie-hu Fan
A1 - Ming-hui Sun
JO - Frontiers of Information Technology & Electronic Engineering
VL - 21
IS - 12
SP - 1795
EP - 1803
SN - 2095-9184
Y1 - 2020
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.1900663
ER -


Abstract: 
Semi-supervised graph analysis methods based on the graph convolutional network (GCN) have achieved considerable success. However, the GCN ignores some of the local information at each node in the graph, so its data preprocessing is incomplete and the resulting model is not accurate enough. As numerous unsupervised models based on graph embedding technology show, local node information is important. In this paper, we apply a local analysis method based on the similar-neighbor hypothesis to the GCN and propose a definition of local density; we call this method LDGCN. The LDGCN algorithm processes the input data of the GCN in two ways, namely the unbalanced and balanced methods, so that the optimized input data contain detailed local node information and the trained model is more accurate. We also describe the implementation of the LDGCN algorithm in terms of the GCN principle, and use three mainstream graph datasets (Cora, Citeseer, and Pubmed) to verify its effectiveness. Finally, we compare the performance of several mainstream graph analysis algorithms with that of the LDGCN algorithm. Experimental results show that the LDGCN algorithm performs better in node classification tasks.
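
The LDGCN preprocessing itself is not given on this page, so the Python sketch below only illustrates the general idea under stated assumptions: local density is approximated by the average feature similarity between a node and its neighbors, and the resulting scores reweight the adjacency matrix before the standard GCN propagation rule H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W). The function names and the cosine-similarity density are hypothetical, not the authors' definitions, and the unbalanced/balanced variants mentioned in the abstract are not reproduced.

import numpy as np

def normalize_adjacency(adj):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, the standard GCN propagation matrix.
    a_hat = adj + np.eye(adj.shape[0])
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

def local_density(adj, features):
    # Hypothetical local-density score: average cosine similarity between a node
    # and its graph neighbors (the paper's exact definition is not reproduced here).
    unit = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = unit @ unit.T
    return (adj * sim).sum(axis=1) / (adj.sum(axis=1) + 1e-12)

def density_weighted_gcn_layer(adj, features, weight):
    # One GCN-style layer in which each neighbor's contribution is rescaled by
    # its local-density score before the usual normalized propagation.
    rho = local_density(adj, features)
    a_norm = normalize_adjacency(adj * rho[np.newaxis, :])
    return np.maximum(a_norm @ features @ weight, 0.0)  # ReLU(A_hat X W)

# Toy usage: a 4-node graph with 3-dimensional features and 2 output channels.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
features = rng.random((4, 3))
weight = rng.random((3, 2))
print(density_weighted_gcn_layer(adj, features, weight))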

A local density optimization method based on a graph convolutional network

Hao Wang 1,2, Li-yan Dong 1,2, Tie-hu Fan 3, Ming-hui Sun 1,2
1 College of Computer Science and Technology, Jilin University, Changchun 130012, China
2 Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
3 College of Instrumentation and Electrical Engineering, Jilin University, Changchun 130012, China

Abstract: Semi-supervised graph analysis methods based on the graph convolutional network (GCN) have achieved success. However, this approach ignores some local information of the nodes in the graph, which indicates that GCN data preprocessing is not complete enough and the trained model is not accurate enough. Therefore, in many unsupervised methods based on graph embedding technology, collecting the local information of the input data is very important. In this paper, a local analysis method based on the similar-neighbor hypothesis is applied to the graph convolutional network, and a definition of local density is given; the method is called LDGCN. LDGCN processes the input data of the graph convolutional network in two different ways, namely the unbalanced and balanced methods. The processed input data contain more detailed local node information, and the trained model is more accurate. The implementation of LDGCN is introduced through the GCN principle, and then three mainstream graph datasets (Cora, Citeseer, and Pubmed) are used to verify its effectiveness. Finally, node classification experiments compare LDGCN with several mainstream graph analysis methods, and the results show that the LDGCN algorithm performs better.

Key words: Semi-supervised learning; Graph convolutional network; Graph embedding; Local density
