
Graduate Student: Sheng-Kai Wang (王聖凱)
Thesis Title: A Construction Progress On-site Monitoring and Presentation System Based on the Integration of Augmented Reality and BIM (整合擴增實境與BIM之施工階段進度監控系統)
Advisor: Hung-Ming Chen (陳鴻銘)
Examination Committee: Yuan-Sen Yang (楊元森), Yo-Ming Hsieh (謝佑明)
Degree: Master
Department: College of Engineering – Department of Civil and Construction Engineering
Year of Publication: 2020
Graduation Academic Year: 109
Language: Chinese
Number of Pages: 76
Keywords: Augmented Reality, Indoor Positioning, Building Information Modeling (BIM), Construction Progress Management
Views: 328; Downloads: 0

In the construction phase of the building life cycle, the prevailing management practice has site personnel accumulate progress information by filling out forms, then manually collect data and drawings to track construction progress. This process is time-consuming and labor-intensive, and its degree of visualization is insufficient. In recent years, advances in augmented reality (AR) have made it possible to superimpose BIM models onto live site imagery and display the corresponding building information in the real scene; however, indoor positioning difficulties on construction sites have limited its application during the construction phase. This study therefore proposes a conceptual framework for a progress management system that integrates AR with BIM and is suited to indoor construction. The system adopts an AR technique based on visual-inertial simultaneous localization and mapping (SLAM) with point-cloud map construction. By scanning environmental feature points and applying plane detection, a virtual alignment component preset on the centerline is fitted to the corresponding real object on site, initializing the indoor positioning of the BIM model; position-offset parameters are then calibrated against the finished surfaces of the placed real objects at different construction stages, allowing the model to be quickly registered to the site. Thereafter, the recorded feature-point map allows the BIM model to be overlaid on the site for information display. The system also incorporates the concept of 4D construction management: by interacting with the BIM model, users can automatically collect progress information for construction work items, transmit that information, and receive visualized progress feedback. Integrating time, the BIM model, and the physical site space makes it possible to understand spatial constraints and explore construction alternatives before work begins, facilitates subsequent progress control and scheduling, enables real-time monitoring of each work item's progress on site, and remedies the limited visualization of current construction management practice.
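The stage-dependent calibration described above — aligning a virtual reference component on the centerline, then offsetting the model to the finished surface of the current construction stage — can be sketched as a small transform correction. This is an illustrative sketch only, not the thesis implementation; the stage names and offset values are hypothetical, and the vertical axis is assumed to be y as in Unity.

```python
# Sketch of the stage-dependent offset correction used to register a BIM
# model to the as-built finished surface. All names/values are hypothetical.

# Offset (metres) from the structural reference to the finished surface at
# each construction stage; values are illustrative only.
STAGE_SURFACE_OFFSETS = {
    "structure": 0.0,   # bare structural frame: align directly to centerline
    "screed": 0.05,     # floor screed raises the finished surface
    "finishes": 0.08,   # tiling/flooring adds a further offset
}

def align_model_origin(anchor_position, stage):
    """Shift the BIM model origin so the virtual alignment component
    coincides with the real finished surface at the given stage.

    anchor_position: (x, y, z) of the scanned real-world reference point.
    Returns the corrected (x, y, z) origin for placing the BIM model.
    """
    x, y, z = anchor_position
    dz = STAGE_SURFACE_OFFSETS[stage]
    # Lower the model origin by the finished-surface thickness so the
    # model's centerline reference lands on the visible surface.
    return (x, y - dz, z)
```

Once the corrected origin is stored with the feature-point map, later sessions can relocalize against the map and reuse it without repeating the manual alignment.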


At present, in the construction phase of the building life cycle, field personnel collect progress data, drawings, and BIM information from the construction site at certain time intervals, then analyze them and deliver them to project management. Such discrete reports do not explicitly convey problems at the construction site, and combining all the data and calculating earned value for reporting is time-consuming. To address this problem, this research presents an augmented-reality-based construction progress monitoring and management system. The system uses AR based on visual-inertial Simultaneous Localization and Mapping (SLAM). Indoor positioning is first initialized by the user, who aligns a virtual component preset on the centerline of the 3D model with the corresponding real component, and then adjusts a position-offset parameter according to the surface completed at the current construction stage. The live scene can then be augmented with the BIM 3D model through the device's understanding of the environment. In addition, the system introduces a 4D construction management model for monitoring project progress on site. The system can assist resident engineers, construction managers, and site engineers with data collection, visual feedback on progress, progress monitoring, and evaluation of project performance. Based on these methods, this research has developed a way to quickly update indoor positioning in a changing construction environment and to monitor the progress of each construction activity in real time, making up for the lack of visualization in existing construction management practice. Results demonstrate high tracking accuracy, jitter-free augmentation, and a setup sufficiently portable for on-site use.
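The 4D monitoring idea above — linking each BIM component to a scheduled activity, comparing planned dates with on-site check-offs, and feeding the result back as color-coded status in the AR view — can be sketched as simple bookkeeping. This is a hypothetical schema for illustration, not the thesis's actual database design.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Sketch of 4D progress bookkeeping: each BIM component is linked to a
# scheduled activity; comparing the planned finish date with the recorded
# on-site completion yields a status for color-coded AR feedback.
# Field names are hypothetical, not the thesis's actual data schema.

@dataclass
class Activity:
    component_id: str                    # BIM element this activity builds
    planned_finish: date                 # finish date from imported schedule
    actual_finish: Optional[date] = None # set when checked off on-site

def status(activity: Activity, today: date) -> str:
    """Classify an activity for visual feedback in the AR view."""
    if activity.actual_finish is not None:
        return "completed"
    if today > activity.planned_finish:
        return "delayed"
    return "in_progress"

def progress_ratio(activities) -> float:
    """Overall share of completed activities (0.0 to 1.0)."""
    done = sum(1 for a in activities if a.actual_finish is not None)
    return done / len(activities)
```

In a system of this kind, `status` would drive the overlay color of each BIM component (e.g. green for completed, red for delayed), and `progress_ratio` a project-level summary, so that a walkthrough with the device doubles as progress data collection.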

Abstract (Chinese) I
Abstract (English) II
Acknowledgments III
Table of Contents IV
List of Figures VII
List of Tables X
Chapter 1 Introduction 1
  1.1 Research Motivation 1
  1.2 Research Objectives 5
  1.3 Research Scope 6
  1.4 Research Methods 6
  1.5 Thesis Organization 10
Chapter 2 Literature Review 11
  2.1 Applications of Augmented Reality in Construction Engineering 11
  2.2 Simultaneous Localization and Mapping 14
  2.3 BIM 4D Schedule Management and Data Collection Concepts 16
  2.4 System Development Technologies 17
    2.4.1 The ARKit Framework – SLAM-Based Positioning
    2.4.2 Horizontal and Vertical Plane Detection 20
  2.5 System Development Tools 20
    2.5.1 Unity
    2.5.2 Microsoft SQL Server 21
Chapter 3 Testing and Usage Patterns of SLAM on Indoor Construction Sites 22
  3.1 SLAM Testing on Indoor Construction Sites 22
    3.1.1 Feature-Point Scanning Behavior and Paths 22
    3.1.2 Movement Speed During Feature-Point Scanning 25
    3.1.3 Model Placement on Indoor Construction Sites 28
    3.1.4 Effect of Site Environment Changes on Map Construction 30
  3.2 SLAM Usage Patterns on Indoor Construction Sites 30
    3.2.1 Feature-Point Scanning Behavior and Closed-Path Planning 31
    3.2.2 Optimal Average Movement Speed for Feature-Point Scanning 32
    3.2.3 Reference Datum for On-Site Model Placement 34
    3.2.4 Update Frequency of the Feature-Point Map 36
Chapter 4 System Architecture and Operating Mechanisms 38
  4.1 System Architecture 38
    4.1.1 System Preprocessing 38
    4.1.2 System Workflow 39
  4.2 System Operating Mechanisms 42
    4.2.1 BIM Component Models and Information 42
    4.2.2 BIM Model Partitioning 43
    4.2.3 Model Data Export and Database Writing 43
    4.2.4 Work Schedule Import 46
    4.2.5 Model Alignment Mechanism 47
    4.2.6 Indoor Construction Site Positioning Mechanism 53
    4.2.7 Visual Feedback of System Messages 55
Chapter 5 System Usage Scenarios 65
  5.1 Presentation of BIM Objects and Progress Data 65
  5.2 Progress Checklists for Completed Components 66
  5.3 Switching and Comparing Schedule Variances 67
  5.4 Presentation of Component Impact Status 68
  5.5 4D On-Site Construction Simulation 69
Chapter 6 Conclusions and Future Work 71
  6.1 Conclusions 71
  6.2 Future Work 73
References 74


Full-text release date: 2025/10/27 (campus network, off-campus network, and the National Digital Library of Theses and Dissertations in Taiwan)