
Student: Hsiang-Jen Li (李享紝)
Thesis Title: TGMK: Temporal Graph Embedding incorporating with Memory Mechanism and k-hop Selection (結合記憶機制以及多躍式選擇於時序圖嵌入)
Advisors: Shan-Hsiang Shen (沈上翔); Chao-Lung Yang (楊朝龍)
Committee Members: Chao-Lung Yang (楊朝龍); Shan-Hsiang Shen (沈上翔); Yi-Ling Chen (陳怡伶)
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Computer Science and Information Engineering
Year of Publication: 2024
Graduation Academic Year: 112
Language: English
Pages: 83
Keywords (Chinese): 網路科學、複雜系統、圖神經網路、圖嵌入、時序圖
Keywords (English): Network Science, Complex System, Graph Neural Network, Graph Embedding, Temporal Graph
Usage: 142 views, 0 downloads

Abstract in Chinese (translated to English):
A social network graph describes the interactions among people. Unlike traditional tabular data, it can expand without bound, and its particular graph structure means such data requires specialized processing methods. Work with network graphs commonly runs into cold start, massive data volumes, and task generalization. Graph Neural Networks (GNNs) are a common solution: by aggregating the features of adjacent nodes, a GNN collects the information of the nodes connected to each node, effectively shrinking and simplifying the whole network structure so that it can be used with all kinds of machine learning models. However, most GNNs assume the network structure is fixed and ignore the temporal factor; even when time is taken into account, the models usually become complex and hard to understand. To address these problems, this study proposes TGMK (Temporal Graph Embedding incorporating with Memory Mechanism and k-hop Selection), a new framework based on TGN (Temporal Graph Network) [20]. Through its modular design, TGMK can adapt the contents of each module to different data, improving the model's flexibility and adaptability. To verify the effectiveness of the TGMK framework, this study ran link-prediction experiments on three datasets and evaluated TGMK under different module configurations. The results show that with suitable module configurations, TGMK obtained improvements of 24% and 38% on two of the datasets; on some datasets, however, there were only slight improvements or even performance drops. This variability highlights the need to tune the model configuration to the characteristics of the data, provides a guideline for module selection, and demonstrates TGMK's value in diverse scenarios. Enterprises developing similar model applications in the future can adopt this framework directly; its modular design can be conveniently adjusted to the data, accelerating development.
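To make the aggregation step above concrete, here is a minimal, illustrative sketch of one common neighbor-aggregation update (the mean over a node's own feature and its neighbors' features) in plain Python. It is not code from the thesis; the function name and data layout are hypothetical.

```python
# Hypothetical toy illustration of the neighbor-aggregation idea described
# above: each node's new representation is the mean of its own feature
# vector and its neighbors' feature vectors. Not code from the thesis.

def mean_aggregate(features, adjacency):
    """features: {node: [floats]}; adjacency: {node: [neighbor nodes]}."""
    new_features = {}
    for node, neighbors in adjacency.items():
        group = [features[node]] + [features[n] for n in neighbors]
        new_features[node] = [sum(vals) / len(group) for vals in zip(*group)]
    return new_features

# Toy graph: a path 0 - 1 - 2, with 2-dimensional node features.
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
adj = {0: [1], 1: [0, 2], 2: [1]}
print(mean_aggregate(feats, adj))  # node 1 now mixes features from 0 and 2
```

After one such round, every node's vector summarizes its 1-hop neighborhood; stacking k rounds mixes in information from up to k hops away, which is presumably what the k-hop selection in the title controls.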


Abstract in English:
A social network graph is used to describe the interactions and relationships between individuals. Unlike traditional tabular data, it can expand without bound, and its distinctive graph structure requires specialized processing methods. Common challenges when dealing with network graphs include cold start, scalability, and generalization. Graph Neural Networks (GNNs) have become a popular solution: a GNN aggregates the features of neighboring nodes to collect information from connected nodes, effectively reducing and simplifying the entire network structure for use in general machine learning models. However, most GNNs assume a fixed network structure, either neglecting the temporal factor or becoming overly complex when they account for it. To address these issues, the TGMK (Temporal Graph Embedding incorporating with Memory Mechanism and k-hop Selection) framework is proposed, built on TGN (Temporal Graph Network) [20]. TGMK's modular design allows module contents to be adjusted to different data, enhancing the model's flexibility and adaptability. Experiments were conducted on link prediction tasks using three different datasets, and TGMK's performance was evaluated under various module configurations. The results indicate that with appropriate module configurations, TGMK achieved improvements of 24% and 38% on two of the datasets, though some datasets showed only minor improvements or even performance declines. This variability underscores the necessity of adjusting the model configuration to the characteristics of each dataset, provides guidelines for module selection, and demonstrates TGMK's applicability in diverse scenarios. Enterprises developing similar models in the future can adopt this framework directly, leveraging its modular design to adjust conveniently to their data and accelerate development.
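As a rough sketch of how a modular pipeline of this shape (memory mechanism, k-hop selection, GNN encoder, link-prediction decoder) could be wired together, the following assumes a TGN-style GRU memory and mean-aggregation message passing; every class, function, and interface here is a hypothetical illustration, not the thesis implementation.

```python
import torch
import torch.nn as nn

# Minimal, self-contained sketch of a pipeline with the same shape as the
# framework described above: per-node memory (GRU cell), a k-hop neighbor
# selector, a mean-aggregation GNN layer, and a dot-product link predictor.
# All names and interfaces are hypothetical, under TGN-style assumptions.

class Memory(nn.Module):
    """Keeps one state vector per node, updated by a GRU cell on each event."""
    def __init__(self, num_nodes, dim):
        super().__init__()
        self.register_buffer("state", torch.zeros(num_nodes, dim))
        self.cell = nn.GRUCell(dim, dim)

    def update(self, nodes, messages):
        # Fold a batch of event messages into the touched nodes' memories.
        self.state[nodes] = self.cell(messages, self.state[nodes]).detach()

def k_hop_neighbors(edge_index, seeds, k):
    """Collect all nodes within k hops of the seed nodes."""
    nodes = set(seeds.tolist())
    src, dst = edge_index
    for _ in range(k):
        mask = torch.tensor([d.item() in nodes for d in dst])
        nodes |= set(src[mask].tolist())
    return torch.tensor(sorted(nodes))

class MeanGNN(nn.Module):
    """One message-passing layer: mean over self and in-neighbors, then linear."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, edge_index):
        src, dst = edge_index
        agg, cnt = x.clone(), torch.ones(x.size(0), 1)
        agg.index_add_(0, dst, x[src])                    # sum neighbor feats
        cnt.index_add_(0, dst, torch.ones(src.numel(), 1))
        return torch.relu(self.lin(agg / cnt))            # mean, then project

def link_score(h, u, v):
    """Simplest possible decoder: dot product of the two node embeddings."""
    return (h[u] * h[v]).sum(-1)

# Toy usage: 4 nodes on a path 0-1-2-3; score the candidate link (0, 3).
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
mem, gnn = Memory(4, 8), MeanGNN(8)
mem.update(torch.arange(4), torch.randn(4, 8))      # fold in past events
hops = k_hop_neighbors(edge_index, torch.tensor([0, 3]), k=2)
print("2-hop neighborhood of {0, 3}:", hops.tolist())
h = gnn(mem.state, edge_index)                      # embed the memory states
print("p(link 0-3) =", float(torch.sigmoid(link_score(h, 0, 3))))
```

The point of such a modular layout is that any piece (the memory cell, the hop depth k, the GNN layer, or the decoder) can be swapped per dataset, which is the flexibility the experiments above evaluate.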

Recommendation Letter ... i
Approval Letter ... ii
Abstract in Chinese ... iii
Abstract in English ... iv
Acknowledgment ... v
List of Figures ... ix
List of Tables ... x
List of Algorithms ... xiii
1 Introduction ... 1
  1.1 Research Background and Statement of the Problem ... 1
  1.2 Summary of the Contribution ... 2
  1.3 Organization of the Thesis ... 2
2 Related Work ... 4
  2.1 Graph Embedding / Representation ... 4
    2.1.1 First-order Proximity ... 5
    2.1.2 Second-order and Higher-order Proximity ... 5
    2.1.3 Combination of Two Proximities or Arbitrary-order Proximity ... 6
    2.1.4 Summary ... 6
  2.2 Graph Neural Network ... 7
    2.2.1 Addressing Conventional Graph Embedding Challenges ... 7
    2.2.2 Theoretical Foundation ... 8
    2.2.3 Existing GNN Libraries ... 9
    2.2.4 Summary ... 9
  2.3 Temporal Graph Neural Network ... 10
    2.3.1 Discrete-time Dynamic Graphs ... 11
    2.3.2 Continuous-time Dynamic Graphs ... 11
    2.3.3 Summary ... 12
  2.4 Behavioral and Temporal Encoding Techniques ... 14
    2.4.1 Behavioral Encoding ... 15
    2.4.2 Time Encoding ... 18
    2.4.3 Summary ... 19
  2.5 Meta-heuristic Approaches for Feature Selection ... 19
    2.5.1 Feature Selection ... 19
    2.5.2 Meta-heuristic Algorithms ... 20
    2.5.3 Summary ... 22
3 Methodology ... 24
  3.1 Preliminaries ... 24
  3.2 Encoder ... 27
    3.2.1 Memory Mechanism ... 28
    3.2.2 k-hop Selection ... 30
    3.2.3 Graph Neural Networks ... 33
  3.3 Decoder ... 34
    3.3.1 Time-Aware Linear Link Predictor ... 36
    3.3.2 Time-Enhanced Element-Wise Link Predictor ... 37
    3.3.3 Concatenated Time-Conditioned Link Predictor ... 38
    3.3.4 Summary ... 39
  3.4 Loss Function ... 39
4 Experiment and Result ... 41
  4.1 Environment Setup ... 41
  4.2 Datasets ... 41
    4.2.1 Enron Email Dataset ... 42
    4.2.2 DBLP Citation Network ... 42
    4.2.3 GDELTLite ... 43
  4.3 Experiment Design ... 45
    4.3.1 Modified Split Ratios for Better Evaluation ... 45
    4.3.2 Evaluation Metrics ... 45
    4.3.3 The Design of the Control and Experimental Groups ... 47
  4.4 Result ... 48
    4.4.1 Legacy vs. New Train and Test Split Ratios ... 49
    4.4.2 Static vs. TGMK Framework ... 50
    4.4.3 TGN vs. TGMK Framework ... 50
    4.4.4 Ablation Study on the Proposed Framework ... 51
5 Conclusion ... 55
  5.1 Conclusion ... 55
  5.2 Future Work and Discussion ... 56
    5.2.1 Explainable AI (XAI) ... 56
    5.2.2 HyperGraphs ... 56
Reference ... 56
Appendix ... 60
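Sections 3.3.1 through 3.3.3 name three time-conditioned link-prediction decoders. Their definitions are not reproduced in this record, so the sketch below is only a speculative reading of the three section titles; every layer choice and interface is an assumption.

```python
import torch
import torch.nn as nn

class TimeAwareLinear(nn.Module):
    """Linear head over [h_u, h_v, time encoding] (assumed reading of 3.3.1)."""
    def __init__(self, dim, time_dim):
        super().__init__()
        self.out = nn.Linear(2 * dim + time_dim, 1)

    def forward(self, h_u, h_v, t_enc):
        return self.out(torch.cat([h_u, h_v, t_enc], dim=-1)).squeeze(-1)

class TimeEnhancedElementWise(nn.Module):
    """Element-wise (Hadamard) interaction h_u * h_v, gated by a learned
    transform of the time encoding (assumed reading of 3.3.2)."""
    def __init__(self, dim, time_dim):
        super().__init__()
        self.gate = nn.Linear(time_dim, dim)
        self.out = nn.Linear(dim, 1)

    def forward(self, h_u, h_v, t_enc):
        return self.out(h_u * h_v * torch.sigmoid(self.gate(t_enc))).squeeze(-1)

class ConcatTimeConditioned(nn.Module):
    """MLP over the concatenated pair, with the time encoding concatenated
    into the hidden layer as a conditioning signal (assumed reading of 3.3.3)."""
    def __init__(self, dim, time_dim, hidden=64):
        super().__init__()
        self.fc1 = nn.Linear(2 * dim, hidden)
        self.fc2 = nn.Linear(hidden + time_dim, 1)

    def forward(self, h_u, h_v, t_enc):
        z = torch.relu(self.fc1(torch.cat([h_u, h_v], dim=-1)))
        return self.fc2(torch.cat([z, t_enc], dim=-1)).squeeze(-1)

# Toy comparison on random embeddings and time encodings.
h_u, h_v, t_enc = torch.randn(5, 16), torch.randn(5, 16), torch.randn(5, 8)
for head in (TimeAwareLinear(16, 8), TimeEnhancedElementWise(16, 8),
             ConcatTimeConditioned(16, 8)):
    print(head.__class__.__name__, torch.sigmoid(head(h_u, h_v, t_enc)).shape)
```

All three heads consume a pair of node embeddings plus a time encoding and emit a scalar link logit; they would differ only in how the time signal is combined with the node pair.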

[1] Mikhail Belkin and Partha Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. Advances in neural information processing systems, 14, 2001.
[2] Kyunghyun Cho, Bart Van Merriënboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, and Yoshua Bengio. Learning phrase representations using rnn encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078, 2014.
[3] Weilin Cong, Si Zhang, Jian Kang, Baichuan Yuan, Hao Wu, Xin Zhou, Hanghang Tong, and Mehrdad Mahdavi. Do we really need complicated model architectures for temporal networks? arXiv preprint arXiv:2302.11636, 2023.
[4] Marco Dorigo, Vittorio Maniezzo, and Alberto Colorni. Ant system: Optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B, 26(1):29–41, 1996.
[5] Matthias Fey and Jan E. Lenssen. Fast graph representation learning with PyTorch Geometric. In ICLR Workshop on Representation Learning on Graphs and Manifolds, 2019.
[6] Martin Grohe. word2vec, node2vec, graph2vec, x2vec: Towards a theory of vector embeddings of structured data. In Proceedings of the 39th ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems, PODS'20, pages 1–16, New York, NY, USA, 2020. Association for Computing Machinery. ISBN 9781450371087. doi: 10.1145/3375395.3387641. URL https://doi.org/10.1145/3375395.3387641.
[7] William L Hamilton, Rex Ying, and Jure Leskovec. Representation learning on graphs: Methods and applications. arXiv preprint arXiv:1709.05584, 2017.
[8] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural computation, 9(8):1735–1780, 1997.
[9] Ningyuan Teresa Huang and Soledad Villar. A short tutorial on the weisfeiler-lehman test and its variants. In ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pages 8533–8537. IEEE, 2021.
[10] Shenyang Huang, Farimah Poursafaei, Jacob Danovitch, Matthias Fey, Weihua Hu, Emanuele Rossi, Jure Leskovec, Michael Bronstein, Guillaume Rabusseau, and Reihaneh Rabbany. Temporal graph benchmark for machine learning on temporal graphs. Advances in Neural Information Processing Systems, 2023.
[11] Kashif Hussain, Mohd Najib Mohd Salleh, Shi Cheng, and Yuhui Shi. Metaheuristic research: a comprehensive survey. Artificial intelligence review, 52:2191–2233, 2019.
[12] Dervis Karaboga. An idea based on honey bee swarm for numerical optimization. Technical Report TR06, Erciyes University, Engineering Faculty, Computer Engineering Department, 2005.
[13] Seyed Mehran Kazemi, Rishab Goel, Sepehr Eghbali, Janahan Ramanan, Jaspreet Sahota, Sanjay Thakur, Stella Wu, Cathal Smyth, Pascal Poupart, and Marcus Brubaker. Time2vec: Learning a vector representation of time. arXiv preprint arXiv:1907.05321, 2019.
[14] Seyed Mehran Kazemi, Rishab Goel, Kshitij Jain, Ivan Kobyzev, Akshay Sethi, Peter Forsyth, and Pascal Poupart. Representation learning for dynamic graphs: A survey. Journal of Machine Learning Research, 21(70):1–73, 2020.
[15] James Kennedy and Russell Eberhart. Particle swarm optimization. In Proceedings of ICNN'95 - International Conference on Neural Networks, volume 4, pages 1942–1948. IEEE, 1995.
[16] Thomas N Kipf and Max Welling. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907, 2016.
[17] Srijan Kumar, Xikun Zhang, and Jure Leskovec. Predicting dynamic embedding trajectory in temporal interaction networks. In Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery & data mining, pages 1269–1278, 2019.
[18] Annamalai Narayanan, Mahinthan Chandramohan, Rajasekar Venkatesan, Lihui Chen, Yang Liu, and Shantanu Jaiswal. graph2vec: Learning distributed representations of graphs. arXiv preprint arXiv:1707.05005, 2017.
[19] Lawrence Page, Sergey Brin, Rajeev Motwani, and Terry Winograd. The PageRank citation ranking: Bringing order to the web. Technical report, Stanford InfoLab, 1999.
[20] Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard, Federico Monti, and Michael Bronstein. Temporal graph networks for deep learning on dynamic graphs. In ICML 2020 Workshop on Graph Representation Learning, 2020.
[21] Benedek Rozemberczki, Paul Scherer, Yixuan He, George Panagopoulos, Alexander Riedel, Maria Astefanoaei, Oliver Kiss, Ferenc Beres, Guzman Lopez, Nicolas Collignon, and Rik Sarkar. PyTorch Geometric Temporal: Spatiotemporal signal processing with neural machine learning models. In Proceedings of the 30th ACM International Conference on Information and Knowledge Management, pages 4564–4573, 2021.
[22] David E Rumelhart, Geoffrey E Hinton, and Ronald J Williams. Learning representations by back-propagating errors. Nature, 323(6088):533–536, 1986.
[23] Jeffrey R Sampson. Adaptation in Natural and Artificial Systems (John H. Holland), 1976.
[24] Aravind Sankar, Yanhong Wu, Liang Gou, Wei Zhang, and Hao Yang. Dysat: Deep neural representation learning on dynamic graphs via self-attention networks. In Proceedings of the 13th international conference on web search and data mining, pages 519–527, 2020.
[25] Manik Sharma and Prableen Kaur. A comprehensive analysis of nature-inspired meta-heuristic techniques for feature selection problem. Archives of Computational Methods in Engineering, 28:1103–1127, 2021.
[26] Apeksha Shewalkar, Deepika Nyavanandi, and Simone A Ludwig. Performance evaluation of deep neural networks applied to speech recognition: RNN, LSTM and GRU. Journal of Artificial Intelligence and Soft Computing Research, 9(4):235–245, 2019.
[27] Yunsheng Shi, Zhengjie Huang, Shikun Feng, Hui Zhong, Wenjin Wang, and Yu Sun. Masked label prediction: Unified message passing model for semi-supervised classification. arXiv preprint arXiv:2009.03509, 2020.
[28] Jian Tang, Meng Qu, Mingzhe Wang, Ming Zhang, Jun Yan, and Qiaozhu Mei. Line: Large-scale information network embedding. In Proceedings of the 24th international conference on world wide web, pages 1067–1077, 2015.
[29] Joshua B Tenenbaum, Vin de Silva, and John C Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500):2319–2323, 2000.
[30] Riku Togashi, Mayu Otani, and Shin’ichi Satoh. Alleviating cold-start problems in recommendation through pseudo-labelling over knowledge graph. In Proceedings of the 14th ACM international conference on web search and data mining, pages 931–939, 2021.
[31] Minjie Wang, Da Zheng, Zihao Ye, Quan Gan, Mufei Li, Xiang Song, Jinjing Zhou, Chao Ma, Lingfan Yu, Yu Gai, Tianjun Xiao, Tong He, George Karypis, Jinyang Li, and Zheng Zhang. Deep graph library: A graph-centric, highly-performant package for graph neural networks. arXiv preprint arXiv:1909.01315, 2019.
[32] Boris Weisfeiler and Andrei Leman. The reduction of a graph to canonical form and the algebra which appears therein. NTI, Series 2, 9:12–16, 1968.
[33] Yubao Wu, Xiang Zhang, Yuchen Bian, Zhipeng Cai, Xiang Lian, Xueting Liao, and Fengpan Zhao. Second-order random walk-based proximity measures in graph analysis: formulations and algorithms. The VLDB Journal, 27:127–152, 2018.
[34] Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, and Kannan Achan. Inductive representation learning on temporal graphs. arXiv preprint arXiv:2002.07962, 2020.
[35] Mengjia Xu. Understanding graph embedding methods and their applications. SIAM Review, 63(4):825–853, 2021.
[36] Jheng-Hong Yang, Chih-Ming Chen, Chuan-Ju Wang, and Ming-Feng Tsai. Hop-rec: high-order proximity for implicit recommendation. In Proceedings of the 12th ACM conference on recommender systems, pages 140–144, 2018.
[37] Bing Yu, Haoteng Yin, and Zhanxing Zhu. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. arXiv preprint arXiv:1709.04875, 2017.
[38] Ziwei Zhang, Peng Cui, Xiao Wang, Jian Pei, Xuanrong Yao, and Wenwu Zhu. Arbitrary-order proximity preserved network embedding. In Proceedings of the 24th ACM SIGKDD international conference on knowledge discovery & data mining, pages 2778–2786, 2018.
[39] Rong Zhu, Kun Zhao, Hongxia Yang, Wei Lin, Chang Zhou, Baole Ai, Yong Li, and Jingren Zhou. AliGraph: A comprehensive graph neural network platform. Proceedings of the VLDB Endowment, 12(12):2094–2105, 2019.

Full text not currently downloadable. Full-text release date: 2026/07/26 (campus network); 2026/07/26 (off-campus network); 2026/07/26 (National Central Library: Taiwan NDLTD system).