
Author: 施宜廷
Thesis Title: 具快取之邊緣及雲端運算中增益基礎的卸載
Gain-based Offloading in Edge and Cloud Computing with Caching
Advisor: 賴源正 (Yuan-Cheng Lai)
Committee: 賴源正 (Yuan-Cheng Lai), Arthur Chang, Chin-Ya Huang
Degree: Master
Department: Department of Information Management, School of Management
Thesis Publication Year: 2019
Graduation Academic Year: 108
Language: English
Pages: 36
Keywords (in Chinese): 行動邊緣運算、具快取之行動邊緣運算、卸載決策、基於增益之卸載
Keywords (in other languages): MEC, Cache-enabled MEC, Offloading decision, Gain-based Offloading
  • To meet the ultra-low-latency requirements of 5G applications, architectures based on Mobile Edge Computing (MEC) have been proposed, consisting of User Equipment (UE) and edge nodes. Because a cache can reduce the time spent recomputing repeated tasks, caching has been integrated with MEC into a field known as cache-enabled MEC. Many studies have examined task-offloading decisions under a two-tier cache-enabled MEC architecture. Although these works make offloading decisions in a cached environment, they choose the target node solely by the delay or energy consumption that maximizes the requesting device's own benefit, without considering how data already cached at a node should influence the decision.
    Based on a realistic network design, the three-tier cache-enabled architecture (UE/edge/cloud), this thesis proposes a gain-based offloading decision method called Gain-based Offloading (GO). Because different UEs may need to compute the same task, GO considers the overall benefit: using the probability that a task recurs, it estimates the total time the system spends serving that task. If the task is executed at an upper-tier node (edge/cloud), the cached result can be provided quickly to other UEs. The gain is defined as the time difference between executing a task repeatedly on each UE and offloading it to the edge/cloud so that the covered UEs can fetch the cached result directly. GO then uses the magnitude of the gain to decide which node to offload to so as to obtain the lowest average delay.
    Simulation results show that, compared with offloading decisions that consider only a device's own benefit, GO improves the average delay by 42.56%. Because it accounts for the benefit of the whole system, nearly 55% of tasks are offloaded, after their gain is computed, to nodes covering more UEs (edge/cloud); that is, the benefit of letting more UEs fetch the cached result directly from an upper-tier node outweighs executing the task on each UE individually, which reduces the time spent recomputing identical tasks.

    To meet the 5G demand for ultra-low-latency applications, a Mobile Edge Computing (MEC)-based architecture has been proposed, which includes User Equipment (UE) and edges. In addition, a cache can reduce the time needed to recompute repeated tasks, and its integration with MEC forms a new field, cache-enabled MEC. There is already research on offloading under a two-tier MEC-based architecture combined with a cache environment. However, the offloading decisions in those works depend on execution delay and energy consumption, considering only the benefit of the UE itself.
    This thesis assumes a three-tier cache-enabled architecture (UE/edge/cloud) and proposes an offloading-decision approach called Gain-based Offloading (GO). Because tasks may be requested repeatedly, GO considers the overall benefit: according to the probability of repeated tasks, it estimates the total task duration in the system. If a task is offloaded to the edge/cloud, those nodes can quickly provide the cached result to other UEs. The gain is the difference in task duration between computing the task many times on individual UEs and offloading it to the edge/cloud so that the covered UEs can take the cached result. GO then uses the value of the gain to decide whether and where to offload, benefiting the whole system and achieving the lowest average delay.
    The simulation results show that the GO algorithm improves the average delay by 42.56% over decisions made without considering gain, because about 55% of tasks are offloaded to nodes that cover more UEs (edge/cloud); offloading to an upper-tier node lets more UEs take the cached results directly, greatly reducing the time spent recomputing repeated tasks.
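    The gain comparison described above can be sketched in code. This is a minimal illustrative sketch, not the thesis's actual GO formulation: the simplified timing model and the parameter names (`t_local`, `t_offload`, `t_exec_remote`, `t_fetch_cache`, and the expected request counts) are assumptions introduced here for illustration.

```python
def gain(n_requests, t_local, t_offload, t_exec_remote, t_fetch_cache):
    """Time saved by offloading a task once and letting later requesters
    fetch the cached result, versus every covered UE computing it locally."""
    # Every covered UE computes the task on its own device.
    local_total = n_requests * t_local
    # The first requester offloads (upload + remote execution);
    # the remaining requesters fetch the cached result.
    remote_total = t_offload + t_exec_remote + (n_requests - 1) * t_fetch_cache
    return local_total - remote_total


def decide(n_edge, n_cloud, t_local, params_edge, params_cloud):
    """Pick the target with the largest positive gain; otherwise stay local.
    params_* = (t_offload, t_exec_remote, t_fetch_cache) for that tier."""
    g_edge = gain(n_edge, t_local, *params_edge)
    g_cloud = gain(n_cloud, t_local, *params_cloud)
    target, best = max(("edge", g_edge), ("cloud", g_cloud), key=lambda x: x[1])
    return target if best > 0 else "local"


# The cloud covers more UEs than a single edge, so a frequently repeated
# task can gain more from cloud caching despite slower access.
print(decide(n_edge=4, n_cloud=20, t_local=10.0,
             params_edge=(2.0, 1.0, 0.5),
             params_cloud=(5.0, 0.5, 1.0)))  # "cloud"
```

Note the trade-off the sketch captures: an upper-tier node is slower to reach, but its cached result is reusable by every UE it covers, so the gain grows with the number of repeated requests.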

    摘要
    Abstract
    List of Tables
    List of Figures
    Chapter 1 Introduction
    Chapter 2 Related work
      2.1 Offloading in Basic MEC
      2.2 Offloading in cache-enabled MEC
    Chapter 3 System model and problem formulation
      3.1 Used notations
      3.2 System model
      3.3 Problem statement
    Chapter 4 Gain-based Offloading (GO)
      4.1 Concept
      4.2 Duration calculation module
      4.3 Gain calculation module
    Chapter 5 Evaluation
      5.1 Scenarios and parameters
      5.2 The effects of average request size
      5.3 The effects of average workload
      5.4 The effects of number of UEs in an edge
    Chapter 6 Conclusion
    Reference


    Full text public date: 2024/09/02 (Intranet)
    Full text public date: this full text is not authorized to be published (Internet).
    Full text public date: this full text is not authorized to be published (National Library).