
Author: 王煜祥
Yu-Hsiang Wang
Thesis Title: 具快取之行動邊緣運算中的合作運算卸載
Collaborative Computation Offloading in Mobile Edge Computing with Caching
Advisor: 賴源正
Yuan-Cheng Lai
Committee: 賴源正
Yuan-Cheng Lai
沈上翔
Shan-Hsiang Shen
張榮昇
Arthur Chang
Degree: 碩士
Master
Department: 管理學院 - 資訊管理系
Department of Information Management
Thesis Publication Year: 2019
Graduation Academic Year: 107
Language: 英文
English
Pages: 40
Keywords (in Chinese): 5G行動網路、行動邊緣運算、任務卸載、增益基礎合作卸載決策
Keywords (in other languages): 5G, MEC, Offloading, Gain-based Collaborative Offloading
Reference times: Clicks: 534, Downloads: 0
  • 行動邊緣運算(MEC)為下一代行動網路5G的主要功能之一,5G行動基地台將搭配具備運算能力的邊緣伺服器(Edge server),其主要目的為提供極低延遲的應用服務,屆時某些需低延遲的運算任務將不再傳到距離較遠的雲端伺服器(Cloud server)做運算,而是傳到距離較近的邊緣伺服器做運算。
    現有的文獻皆以兩層為主要探討對象,且在進行任務卸載時(Offloading)只考慮自身的利益做決策,但相同的任務在不同的裝置可能會重複發生,若只單看自身利益將可能造成相同的任務在不同的裝置上重複運算,導致延遲時間更長。基於上述原因,我們在有快取及合作的邊緣伺服器架構下提出一個以全系統利益的增益基礎合作卸載決策(Gain-based Collaborative Offloading, GCO),裝置將任務卸載至邊緣伺服器且邊緣伺服器快取其任務結果,當有相同任務請求時,邊緣伺服器可以直接將快取資料回傳,降低延遲時間。我們的增益概念為評估卸載到邊緣伺服器是否可以減少相同任務重複計算的次數藉以降低平均延遲。
    GCO在卸載時,會考量整個系統的利益決定卸載與否,若在同一台或鄰邊的邊緣伺服器所屬的裝置對相同的任務有高存取機率時,卸載至邊緣伺服器可以降低計算相同任務的次數,進而大幅降低重複任務的計算時間。模擬結果顯示GCO比傳統的兩層式卸載決策可以減少40%的延遲時間。


    Mobile Edge Computing (MEC) is one of the main functions of the next-generation 5G mobile network. 5G base stations will be equipped with edge servers that have computing power, whose main purpose is to provide application services with ultra-low latency. Computation tasks that require low latency will then no longer be transmitted to the distant cloud server for computation but to the closer edge server.
    Existing studies mainly consider a two-tier architecture, and each device makes its offloading decision based only on its own gain. However, the same task may occur repeatedly on different UEs (user equipments). If each UE considers only its own gain when offloading, the same task may be recomputed on different UEs, resulting in a longer delay for the whole system. For these reasons, we propose Gain-based Collaborative Offloading (GCO) for cache-enabled collaborative MEC, which makes decisions based on the gain of the whole system. When a task is offloaded to an edge server, the edge server caches the task result; when the same task is requested again, the cached result can be returned directly, reducing the delay. Our gain concept evaluates whether offloading to the edge server can reduce the number of repeated computations of the same task and thereby reduce the average delay.
    When offloading, GCO considers the gain of the whole system to decide whether to offload. If the UEs covered by the same edge server or a neighboring edge server have a high probability of accessing the same task, offloading it to the edge server reduces repeated computation of that task and thus substantially reduces the overall delay. Simulation results show that GCO reduces the delay by 35% compared with the traditional two-tier offloading decision.
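
    The gain comparison described above can be sketched as follows. This is a minimal illustration, not the thesis's actual formulation (which is given in Chapter 4): the single-parameter delay model, the uniform access probability, and all function and parameter names are assumptions introduced here for clarity.

    ```python
    def gco_offload_decision(local_delay, edge_delay, cache_hit_delay,
                             access_prob, num_ues):
        """Sketch of a gain-based offloading decision.

        Compares the expected system-wide delay of computing a task
        locally on every UE against offloading it once to the edge
        server and serving later requests from the cached result.
        All delays are in the same (arbitrary) time unit.
        """
        # Expected number of other UEs that will request the same task.
        expected_repeats = access_prob * (num_ues - 1)

        # System-wide delay if every UE computes the task locally:
        # the first UE plus each expected repeat recomputes it.
        delay_no_offload = local_delay * (1 + expected_repeats)

        # System-wide delay if the first request is offloaded and cached:
        # one edge computation, then cheap cache hits for the repeats.
        delay_offload = edge_delay + cache_hit_delay * expected_repeats

        gain = delay_no_offload - delay_offload
        return gain > 0, gain
    ```

    Under this toy model, offloading wins whenever enough other UEs are expected to request the same task: with a high access probability the repeated local computations outweigh the one-time edge computation, while with a low access probability the task stays local.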

    摘要 I
    Abstract II
    Contents III
    List of Tables IV
    List of Figures V
    Chapter 1 Introduction 1
    Chapter 2 Related Work 4
      2.1 Offloading in Basic MEC 4
      2.2 Offloading in Cache-enabled MEC 5
      2.3 Offloading in Collaborative MEC 6
      2.4 Summary of Existing Work 7
    Chapter 3 System Model and Problem Statement 9
      3.1 Notations 9
      3.2 System Model 10
      3.3 Problem Statement 11
    Chapter 4 Gain-based Collaborative Offloading (GCO) 13
      4.1 GCO Concept 13
      4.2 GCO Algorithm 15
        4.2.1 Flow Chart 15
        4.2.2 Formula 16
    Chapter 5 Performance Evaluation 19
      5.1 Scenario and Parameters 19
      5.2 The Effects of Request Size 21
      5.3 The Effects of Workload 24
      5.4 The Effects of the Number of UEs in an Edge 27
    Chapter 6 Conclusion and Future Work 30
    References 31


    Full text public date: 2024/08/15 (Intranet)
    Full text public date: not authorized for publication (Internet)
    Full text public date: not authorized for publication (National library)