Author: 謝政廷 Juen-Ting Shie
Thesis Title: 整合SLAM-based頭戴式MR與BIM之室內施工進度視覺化管理遠端協作系統 (Integration of SLAM-based Head-Mounted MR and BIM for a Visualized Construction Progress Management System with Remote Collaboration)
Advisor: 陳鴻銘 Hung-Ming Chen
Committee: 謝佑明 Yo-Ming Hsieh, 蔡孟涵 Meng-Han Tsai, 陳鴻銘 Hung-Ming Chen
Degree: Master
Department: College of Engineering - Department of Civil and Construction Engineering
Publication Year: 2021
Academic Year: 109
Language: Chinese
Pages: 98
Keywords: Simultaneous Localization and Mapping, Mixed Reality, HoloLens, Progress Management, Remote Collaboration
In recent years, augmented reality (AR) devices have developed rapidly, and mixed reality (MR), which evolved from AR, offers an even better visualization experience. During the construction phase, however, the difficulty of aligning the Building Information Model (BIM) with the construction environment has kept such applications scarce. This research therefore uses a SLAM-based head-mounted MR device to implement a BIM-based indoor construction progress visualization and management system for use on construction sites. It compares the head-mounted device with the handheld AR device used in this laboratory's previous research in terms of how well SLAM aligns the BIM with the construction site, and proposes a spatial feature-point scanning mode that achieves better device localization and model alignment on site. After the model is aligned, the construction schedule is planned with Oracle Primavera P6, a planning, management, and control tool commonly used in project management environments, bringing construction progress management concepts into the system; the head-mounted MR device then presents visual feedback through various visualization modes of construction progress data, such as single-component and overall progress. A remote collaboration mode is also incorporated, so that progress managers need not visit the site in person: as long as an on-site operator familiar with the head-mounted MR device uses the system, managers can oversee the construction progress of every site from a central control center.
With the development of Augmented Reality (AR), Mixed Reality (MR), which derived from AR, has brought a better visual experience than AR. However, in the construction phase, the difficulty of positioning the Building Information Model (BIM) against the construction environment has resulted in only a small number of such applications. Therefore, this research uses a SLAM-based head-mounted MR device to implement an indoor BIM-based visualized construction progress management system. It compares the SLAM positioning results of the MR device on the construction site with those of the handheld AR device used in this laboratory's previous research, and then proposes a spatial feature-point scanning mode that achieves better device positioning and model alignment. After the BIM model is positioned, the construction schedule is planned with Oracle Primavera P6, a commonly used planning, scheduling, and control tool, bringing the concept of construction schedule management into the system; the MR device then presents visual feedback through various visualization modes of construction progress data, such as single-component and overall progress. In addition, a remote collaboration mode is brought into the system, so that only an on-site operator familiar with using the MR device with this system is needed, and the schedule manager can manage each site's schedule from the central control center.
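To illustrate the kind of per-component progress visualization the abstract describes, the sketch below maps one BIM component's schedule data (as might be exported from Oracle Primavera P6) to a display state that could drive color-coding in the MR view. This is a minimal illustration, not the thesis implementation; the function name, parameters, and state labels are assumptions for the example.

```python
from datetime import date

def progress_state(planned_start: date, planned_finish: date,
                   percent_complete: float, today: date) -> str:
    """Map one component's schedule data to a display state.

    Illustrative only: the actual system's data fields and visualization
    categories may differ. Planned dates are assumed to come from a
    Primavera P6 schedule export; percent_complete is 0-100.
    """
    if percent_complete >= 100.0:
        return "completed"
    if today < planned_start:
        return "not_started"
    if today > planned_finish:
        # Past the planned finish date but still unfinished.
        return "behind_schedule"
    # In progress: compare actual completion with the elapsed share
    # of the planned duration.
    elapsed = (today - planned_start).days
    duration = max((planned_finish - planned_start).days, 1)
    expected = 100.0 * elapsed / duration
    return "on_schedule" if percent_complete >= expected else "behind_schedule"
```

Overall progress could then be summarized by counting components per state, and each state rendered as a distinct material or color on the aligned BIM model in the headset.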