
Graduate student: 林芳宇 (Fang-yu Lin)
Thesis title: 飛時測距深度圖的雜訊去除及品質改善之研究
(A Study of Time-of-Flight Depth Maps Denoising and Quality Improvement)
Advisor: 吳怡樂 (Yi-leh Wu)
Committee members: 陳建中 (Jiann-jone Chen), 唐政元 (Cheng-yuan Tang)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of publication: 2013
Academic year of graduation: 101
Language: English
Number of pages: 29
Keywords (Chinese): Time-of-Flight camera, 3D image information, median, noise, denoising, quality improvement
Keywords (English): random noise

Beyond academic research, three-dimensional image information is now widely used in consumer applications, so the execution speed of related applications and the accuracy of 3D data have become increasingly important. Among the many types of 3D cameras, Time-of-Flight (TOF) cameras have the advantages of being easy to use and increasingly affordable to the general public. In addition, TOF cameras acquire depth maps very quickly, at rates close to those of ordinary video cameras. In recent years, many studies have applied TOF cameras to a variety of tasks, especially 3D reconstruction [2].
However, due to hardware limitations, TOF cameras still have some deficiencies; low resolution and noise are well-known issues. We believe that reducing noise to improve the quality of the depth maps can greatly increase the accuracy of subsequent processing. In this thesis, we propose a method for removing noise from Time-of-Flight depth maps. For the depth map whose quality is to be improved, we use the depth information in that map and in several temporally consecutive depth maps before and after it: we collect the depth values of the corresponding points and replace the depth value in the middle depth map with their median in order to remove the noise. The TOF datasets we use are provided by Cui et al. [18, 19]; they were originally intended for 3D reconstruction, with the TOF camera rotating around the object at a fixed distance. When searching for the points in the neighbouring depth maps that correspond to a point in the noisy depth map, we must first align the depth maps in order to find the correct correspondences. Our method uses only the depth information captured by a single TOF camera, without any additional information such as shading or geometric shapes, and it requires no extra parameter tuning. Experimental results show that denoising with the depth information from several consecutive depth maps achieves a more effective quality improvement than denoising with the depth information of neighbouring points within a single depth map.
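Since the full thesis text is not part of this record, the following is only a minimal sketch of the temporal-median step described above, assuming the neighbouring depth maps have already been aligned to the noisy frame; the function name, the window parameter, and the NumPy-based implementation are illustrative assumptions rather than the author's actual code.

import numpy as np

def temporal_median_denoise(depth_maps, center_index, window=2):
    # depth_maps: list of HxW arrays, assumed already aligned to the centre frame.
    # center_index: index of the noisy depth map to denoise.
    # window: number of consecutive frames taken on each side of the centre frame.
    lo = max(0, center_index - window)
    hi = min(len(depth_maps), center_index + window + 1)
    # Stack the centre map and its aligned temporal neighbours along a new axis.
    stack = np.stack(depth_maps[lo:hi], axis=0)
    # The per-pixel median over the temporal axis suppresses impulsive noise
    # while preserving depth discontinuities better than spatial averaging would.
    return np.median(stack, axis=0)

# Hypothetical usage (load_depth_map is assumed, not part of the record):
# frames = [load_depth_map(i) for i in range(10)]   # HxW float arrays
# clean = temporal_median_denoise(frames, center_index=5, window=2)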


Three-dimensional information is widely employed not only in academic research but also in the development of consumer applications. Therefore, in 3D-related applications, the execution speed and the accuracy of depth values have become increasingly important. Among the many types of 3D cameras, Time-of-Flight (TOF) cameras have the advantages of simplicity of use and a price affordable to the general public, and they can acquire depth maps at a high frame rate.
TOF cameras, however, have some limitations, such as low resolution and susceptibility to random noise. In this thesis, we propose a method, called Temporal-Median, for removing random noise from TOF depth maps in order to improve their accuracy. For each point in a noisy TOF depth map, we replace its depth value with the median of the depth values of the corresponding points in the temporally consecutive depth maps. To locate the exact corresponding point in each consecutive depth map for each point in the noisy depth map, we perform an alignment step. We use only the data captured by the TOF camera, without any extra information such as illumination or geometric shapes, and the method requires no complex parameter tuning. Experimental results suggest that the proposed temporal denoising method can reduce the noise in TOF depth maps by up to 44 percent.
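The alignment step itself is not detailed in this record. As a rough, hypothetical illustration of how a corresponding point can be looked up in a neighbouring depth map, the sketch below assumes a pinhole camera model with a known intrinsic matrix K and a known rigid transform between the two camera poses (for example, from the turntable-like motion of the Cui et al. dataset); none of these names or parameters come from the thesis itself.

import numpy as np

def corresponding_depth(center_depth, neighbor_depth, K, T_center_to_neighbor):
    # For every pixel of the centre depth map, look up the depth value of its
    # corresponding pixel in a neighbouring depth map.
    # K: 3x3 pinhole intrinsic matrix assumed shared by both frames.
    # T_center_to_neighbor: assumed 4x4 rigid transform between camera frames.
    # Pixels with zero or invalid depth are not handled in this sketch.
    h, w = center_depth.shape
    # Back-project every centre pixel to a 3D point in the centre camera frame.
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = center_depth
    x = (us - K[0, 2]) * z / K[0, 0]
    y = (vs - K[1, 2]) * z / K[1, 1]
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    # Transform the points into the neighbour camera frame and project with K.
    pts_n = (T_center_to_neighbor @ pts.T).T
    u_n = np.round(K[0, 0] * pts_n[:, 0] / pts_n[:, 2] + K[0, 2]).astype(int)
    v_n = np.round(K[1, 1] * pts_n[:, 1] / pts_n[:, 2] + K[1, 2]).astype(int)
    # Read the neighbour depth where the projection lands inside the image.
    out = np.full(h * w, np.nan)
    valid = (u_n >= 0) & (u_n < w) & (v_n >= 0) & (v_n < h)
    out[valid] = neighbor_depth[v_n[valid], u_n[valid]]
    return out.reshape(h, w)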

Abstract (Chinese)
Abstract
Contents
List of Figures
List of Tables
Chapter 1. Introduction
    Time of Flight Camera
Chapter 2. TOF Depth Map Denoising Methods
    2.1 TOF Datasets
    2.2 TOF Depth Map Denoising Processes
    2.3 TOF Depth Maps Alignment
    2.4 Denoising Process
Chapter 3. Experiments
    3.1 TOF Denoised Depth Maps and Performance
    3.2 Without Alignment Process
Chapter 4. Conclusions and Future Works
References

[1] A. K. Jain. “Fundamentals of Digital Image Processing”. Prentice-Hall, New York, 1989.
[2] A. Kolb, E. Barth, R. Koch, and R. Larsen. “Time-of-Flight Sensors in Computer Graphics”. Eurographics 2009.
[3] A. Rajagopalan, A. Bhavsar, F. Wallhoff, and G. Rigoll. “Resolution Enhancement of PMD Range Maps”. Lecture Notes in Computer Science, 5096:304-313, 2008.
[4] C. Schaller. “Time-of-Flight – A New Modality for Radiotherapy”. PhD Thesis, 2011.
[5] G. R. Arce. “Nonlinear Signal Processing: A Statistical Approach”. Wiley, New Jersey, USA, 2005.
[6] J. Diebel and S. Thrun. “An Application of Markov Random Fields to Range Sensing”. In Advances in Neural Information Processing Systems 18, pages 291-298, 2006.
[7] J. Kopf, M. Cohen, D. Lischinski, and M. Uyttendaele. “Joint Bilateral Upsampling”. ACM Transactions on Graphics, 26(3), 2007.
[8] MESA Swissranger SR4000 TOF camera, http://www.mesa-imaging.ch/index.php.
[9] P. Henry, M. Krainin, E. Herbst, X. Ren, and D. Fox. “RGB-D Mapping: Using Depth Cameras for Dense 3D Modeling of Indoor Environments”. Proceedings of the International Symposium on Experimental Robotics (ISER), 2010.
[10] P. S. Windyga. “Fast Impulsive Noise Removal”. IEEE Transactions on Image Processing, 10(1), pp. 173-179, 2001.
[11] Q. Yang, R. Yang, J. Davis, and D. Nister. “Spatial-Depth Super Resolution for Range Images”. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2007.
[12] R. C. Gonzalez and R. E. Woods. “Digital Image Processing”. Addison-Wesley, New York, 1992.
[13] R. Lange. “3D time-of-flight distance measurement with custom solid-state image sensors in CMOS/CCD-technology”, Dissertation, University of Siegen, 2000.
[14] S. Fuchs and S. May. “Calibration and registration for precise surface reconstruction with ToF cameras”. Proceedings of the Dynamic 3D Imaging Workshop in Conjunction with DAGM (Dyn3D), VOL. I, 2007.
[15] S. Fuchs and G. Hirzinger. “Extrinsic and Depth Calibration of TOF Cameras”. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2008.
[16] S. Schuon, C. Theobalt, J. Davis, and S. Thrun. “LidarBoost: Depth Superresolution for ToF 3D Shape Scanning”. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009.
[17] S. Vaddadi, L. Zhang, H. Jin, and S. K. Nayar. “Multiple View Image Denoising”. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009.
[18] Y. Cui, S. Schuon, D. Chan, S. Thrun and C. Theobalt. TOF Datasets and Laser Scanning Datasets [Online], http://www.mpi-inf.mpg.de/~theobalt/tof/.
[19] Y. Cui, S. Schuon, D. Chan, S. Thrun and C. Theobalt. “3D Shape Scanning with a Time-of-Flight Camera”. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010.
