DeepWalk Paper Close Reading: (1) Problem & Related Work

The DeepWalk close-reading series:
(1) Problem & related work: https://www.cnblogs.com/ZhaoLX/p/DeepWalk1.html
(2) Core algorithm: https://www.cnblogs.com/ZhaoLX/p/DeepWalk2.html
(3) Experiments: https://www.cnblogs.com/ZhaoLX/p/DeepWalk3.html
(4) Summary and limitations: https://www.cnblogs.com/ZhaoLX/p/DeepWalk4.html

Module 1

1. Research Background

With the growth of the Internet, social networks have become increasingly complex and diverse. A social network contains many different types of users, who interact with one another in various ways and form communities of different sizes. To analyze such a network, we need to classify its nodes (users).

2. Problem Statement

We are given a social network represented as a graph $G_L=(V,E,X,Y)$, where $X \in \mathbb{R}^{|V| \times S}$ ($S$ is the size of the feature space of each node's attributes) and $Y \in \mathbb{R}^{|V| \times |\mathcal{Y}|}$ ($\mathcal{Y}$ is the set of labels). $X$ is a (number of nodes) × (feature-space dimension) matrix: each node in the graph is represented by an $S$-dimensional vector. $Y$ is a (number of nodes) × (number of labels) matrix indicating which labels each node carries.
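To make the notation concrete, here is a minimal sketch of these objects; all sizes are made up for illustration:

```python
import numpy as np

V = 100  # number of nodes |V| (illustrative)
S = 50   # dimensionality of the node attribute feature space
K = 5    # number of labels |Y| (illustrative)

X = np.random.rand(V, S)         # one S-dimensional attribute vector per node
Y = np.zeros((V, K), dtype=int)  # Y[i, k] = 1 iff node i carries label k
Y[0, 2] = 1                      # e.g., node 0 carries label 2
```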

The task is to classify the nodes of the social network, but this problem differs from ordinary classification. An ordinary classifier learns a function $H$ mapping $X$ to $Y$. For a graph, however, we must also exploit information about how each node is embedded in the graph, such as its neighbors. This information is usually implicit and not reflected in $X$, so classification has to incorporate the network structure [37]: classifying a single node is not an independent task but is correlated with the classification of other nodes. This is known as collective classification or relational classification.

DeepWalk extracts an upstream subtask from the relational classification task: learning an embedding $X_E \in \mathbb{R}^{|V| \times d}$ for all nodes, where $d$ is a small number of dimensions. $X_E$ captures the structural information of the nodes and should satisfy the following properties (a sketch of how such embeddings feed a downstream classifier follows the list):

  1. Adaptability. It should adapt to a constantly changing social network: adding a new edge should not require relearning from scratch.
  2. Community awareness. The distance between two embeddings should measure the similarity of the corresponding nodes; this property supports generalization on homogeneous graphs.
  3. Low dimensionality. Low-dimensional representations generalize better when labeled data is scarce and also speed up convergence.
  4. Continuity. Continuous representations make the decision boundaries between communities smoother, which makes classification more robust.
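As promised above, here is a minimal sketch of the downstream use of $X_E$: an ordinary classifier trained on the embeddings of a few labeled nodes. All names, sizes, and labels are hypothetical stand-ins, not learned values:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

V, d = 100, 16              # |V| nodes, small embedding dimension d
X_E = np.random.rand(V, d)  # stand-in for learned node embeddings
y = np.arange(V) % 2        # stand-in binary node labels (illustrative)

labeled = np.arange(10)     # only a handful of labeled nodes
clf = LogisticRegression().fit(X_E[labeled], y[labeled])
pred = clf.predict(X_E)     # classify every node from its embedding
```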

3. Related Work

3.1 Generating Network Features

Before DeepWalk, network feature representations were mostly very sparse. Sparsity makes statistical learning models hard to generalize and complicates tasks such as network classification [15, 37], content recommendation [11], anomaly detection [5], and missing-link prediction [22]. DeepWalk's key contribution is representing network nodes in a dense, low-dimensional form.
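To make the sparsity problem concrete, compare a node's raw adjacency row with a dense embedding; the numbers below are purely illustrative:

```python
import numpy as np

V = 100_000                  # nodes in a large graph
adj_row = np.zeros(V)        # node 0's row of the adjacency matrix
adj_row[[3, 42, 999]] = 1.0  # node 0 has only three neighbors

print(adj_row.mean())           # fraction of nonzeros: 3e-05, extremely sparse
embedding = np.random.rand(64)  # dense, low-dimensional alternative (stand-in)
```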

3.2 Relational Classification

Traditional relational classification algorithms cast the problem as inference in an undirected Markov network and, given the network structure, use iterative approximate inference algorithms, such as iterative classification [31], Gibbs sampling [14], and label relaxation [18], to compute the posterior distribution over labels. DeepWalk instead takes an unsupervised approach that does not rely on the labeled data; incorporating some labeled data can improve its results further.
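As a rough illustration of this family of iterative inference methods, here is a simplified, label-propagation-style sketch; it is my own toy construction, not the exact algorithm of [31]:

```python
# Toy graph as an adjacency dict; nodes 0 and 3 have observed labels.
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
observed = {0: 1, 3: 0}
est = {n: observed.get(n, 0) for n in graph}  # initial guesses

for _ in range(10):  # a few sweeps suffice on this toy graph
    new = {}
    for n, nbrs in graph.items():
        if n in observed:
            new[n] = observed[n]  # keep observed labels fixed
        else:
            votes = [est[m] for m in nbrs]  # neighbors' current labels
            new[n] = max(set(votes), key=votes.count)  # majority vote
    if new == est:  # stop once the assignment stabilizes
        break
    est = new
```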

In addition, traditional relational classification algorithms emphasize exploiting community information for classification, e.g., learning latent clusters [32], adding "ghost edges" between nearby nodes [13], using PageRank [23], or using textual information [43]. These methods concentrate on the classification task itself, trying to squeeze the available information for accurate classification. DeepWalk, by contrast, turns node classification into node representation learning: it uses the information to learn node embeddings, and if the embeddings are good, classification becomes much easier.

3.3 Random Walks and Language Models

DeepWalk fuses models and algorithms from two previously unrelated fields. The first is the random walk, used to capture a node's local structure; it had previously been applied to content recommendation [11] and community detection [1, 38]. The second is language modeling, in particular the skip-gram algorithm, which predicts a word's context from the word itself.
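A minimal sketch of this fusion, assuming gensim (version 4 or later) for the skip-gram model; the toy graph, walk lengths, and parameters are illustrative, not the paper's settings:

```python
import random
from gensim.models import Word2Vec  # provides a skip-gram implementation

graph = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"], "d": ["c"]}

def random_walk(start, length):
    """Uniform random walk: each step jumps to a random neighbor."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(graph[walk[-1]]))
    return walk

# Treat walks as "sentences" and nodes as "words", then run skip-gram (sg=1).
walks = [random_walk(n, 10) for n in graph for _ in range(5)]
model = Word2Vec(walks, vector_size=16, window=3, sg=1, min_count=0)
print(model.wv["a"])  # learned 16-dimensional embedding of node "a"
```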

The authors observed that the frequency with which nodes appear in random-walk sequences follows a power law, just like word frequencies in natural language. DeepWalk therefore runs a number of short random walks and applies optimization methods originally designed for language models to learn node representations.
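This observation can be checked with a simple frequency count, reusing the `walks` list from the previous sketch:

```python
from collections import Counter

# Count how often each node appears across all walks.
freq = Counter(node for walk in walks for node in walk)
ranked = sorted(freq.values(), reverse=True)
print(ranked)  # on scale-free graphs this rank-frequency curve is
               # roughly linear on log-log axes, i.e., a power law
```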

References

Sparsity of network representations:

[15] L. Getoor and B. Taskar. Introduction to statistical relational learning. MIT press, 2007.

[37] P. Sen, G. Namata, M. Bilgic, L. Getoor, B. Galligher, and T. Eliassi-Rad. Collective classification in network data. AI magazine, 29(3):93, 2008.

[11] F. Fouss, A. Pirotte, J.-M. Renders, and M. Saerens. Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation. Knowledge and Data Engineering, IEEE Transactions on, 19(3):355–369, 2007.

[5] V. Chandola, A. Banerjee, and V. Kumar. Anomaly detection: A survey. ACM Computing Surveys (CSUR), 41(3):15, 2009.

[22] D. Liben-Nowell and J. Kleinberg. The link-prediction problem for social networks. Journal of the American society for information science and technology, 58(7):1019–1031, 2007.

[2] Y. Bengio, A. Courville, and P. Vincent. Representation learning: A review and new perspectives. 2013.

Collective classification (inference):

[31] J. Neville and D. Jensen. Iterative classification in relational data. In Proc. AAAI-2000 Workshop on Learning Statistical Models from Relational Data, pages 13–20, 2000.

[14] S. Geman and D. Geman. Stochastic relaxation, gibbs distributions, and the bayesian restoration of images. Pattern Analysis and Machine Intelligence, IEEE Transactions on, (6):721–741, 1984.

[18] R. A. Hummel and S. W. Zucker. On the foundations of relaxation labeling processes. Pattern Analysis and Machine Intelligence, IEEE Transactions on, (3):267–287, 1983.

[33] J. Neville and D. Jensen. A bias/variance decomposition for models using collective inference. Machine Learning, 73(1):87–106, 2008.

Collective classification (classification with community information):

[32] J. Neville and D. Jensen. Leveraging relational autocorrelation with latent group models. In Proceedings of the 4th International Workshop on Multi relational Mining, MRDM’05, pages 49–55, New York, NY, USA, 2005. ACM.

[13] B. Gallagher, H. Tong, T. Eliassi-Rad, and C. Faloutsos. Using ghost edges for classification in sparsely labeled networks. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’08, pages 256–264, New York, NY, USA, 2008. ACM.

[23] F. Lin and W. Cohen. Semi-supervised classification of network data using very few labels. In Advances in Social Networks Analysis and Mining (ASONAM), 2010 International Conference on, pages 192–199, Aug 2010.

[43] X. Wang and G. Sukthankar. Multi-label relational neighbor classification using social context features. In Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 464–472. ACM, 2013.

Random walks:

[11] F. Fouss, A. Pirotte, J.-M. Renders, and M. Saerens. Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation. Knowledge and Data Engineering, IEEE Transactions on, 19(3):355–369, 2007.

[1] R. Andersen, F. Chung, and K. Lang. Local graph partitioning using pagerank vectors. In Foundations of Computer Science, 2006. FOCS’06. 47th Annual IEEE Symposium on, pages 475–486. IEEE, 2006.

[38] D. A. Spielman and S.-H. Teng. Nearly-linear time algorithms for graph partitioning, graph sparsification, and solving linear systems. In Proceedings of the thirty-sixth annual ACM symposium on Theory of computing, pages 81–90. ACM, 2004.

Language modeling:

[6] R. Collobert and J. Weston. A unified architecture for natural language processing: Deep neural networks with multitask learning. In Proceedings of the 25th international conference on Machine learning, pages 160–167. ACM, 2008.

[28] T. Mikolov, W.-t. Yih, and G. Zweig. Linguistic regularities in continuous space word representations. In Proceedings of NAACL-HLT, pages 746–751, 2013.

[26] T. Mikolov, K. Chen, G. Corrado, and J. Dean. Efficient estimation of word representations in vector space. CoRR, abs/1301.3781, 2013.

[27] T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems 26, pages 3111–3119. 2013.

Original post: https://www.cnblogs.com/ZhaoLX/p/DeepWalk1.html