
Author: Bo-Shen Jhou (周柏伸)
Thesis title: A Study of RGB-Depth Camera Calibration with Direct Linear Transformation (結合直接線性轉換於彩色-深度攝影機校正之研究)
Advisor: Yi-Leh Wu (吳怡樂)
Committee members: Jiann-Jone Chen (陳建中), Cheng-Yuan Tang (唐政元), Li-Kang Yen (閻立剛)
Degree: Master
Department: Department of Computer Science and Information Engineering, College of Electrical Engineering and Computer Science
Year of publication: 2015
Academic year of graduation: 103 (ROC calendar, 2014-2015)
Language: English
Number of pages: 39
Keywords (Chinese): 電腦視覺、彩色-深度攝影機、攝影機校正、Kinect、直接線性變換
Keywords (English): Computer Vision, RGB-Depth Cameras, Camera Calibration, Kinect, Direct Linear Transformation

In recent years, cameras equipped with depth sensors have become a trend. Adding depth information to traditional color-only images enables the development of richer interactive applications. For an RGB-Depth camera, which combines a color sensor with a depth sensor, camera calibration is a necessary step if the depth information is to be combined effectively with the color images. This thesis proposes a calibration method based on the Direct Linear Transformation (DLT) algorithm. The advantage of the DLT is that it avoids the tedious procedures required by traditional calibration methods: it uses the linear relationship between object points and image points to obtain the camera parameters, and then computes the relationship between the color camera and the depth camera, which consists of a rotation matrix and a translation vector. Finally, the acquired parameters are used to align the RGB frames with the depth frames, completing the calibration. The experimental results show that our method is feasible; in future work we will take lens distortion into account to make the calibration more accurate.


In recent years, cameras equipped with depth sensors have become a trend. The added depth information enables the development of richer interactive applications. To effectively combine the depth information with the color images produced by this kind of RGB-Depth camera, which consists of an RGB sensor and a depth sensor, calibration is a must. This thesis presents a novel calibration method based on the Direct Linear Transformation (DLT). The advantage of the DLT method is that it avoids the complicated calibration procedures of traditional methods: it uses the linear relationship between object points and image points to obtain the camera parameters and to estimate the relationship between the RGB camera and the depth camera, which consists of a rotation matrix and a translation vector. Finally, we use the acquired parameters to align the RGB frames with the depth frames. The results of our experiments show that the proposed calibration method is practical. In future work we will further consider lens distortion to make the calibration more precise.
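The DLT step described above can be illustrated with a minimal sketch: each 3D-2D point correspondence contributes two homogeneous linear equations in the twelve entries of the 3x4 projection matrix, and the matrix is recovered from the null space of the stacked system. The Python/NumPy code below is a generic illustration under these standard assumptions; the function name and interface are ours and are not taken from the thesis.

```python
import numpy as np

def dlt_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 pinhole projection matrix P from at least six
    3D-2D point correspondences via Direct Linear Transformation.

    points_3d -- (N, 3) array of object-space coordinates
    points_2d -- (N, 2) array of the corresponding pixel coordinates
    """
    points_3d = np.asarray(points_3d, dtype=float)
    points_2d = np.asarray(points_2d, dtype=float)
    assert len(points_3d) >= 6, "DLT needs at least 6 correspondences"

    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Two linear equations per correspondence in the 12 entries of P
        # (P is defined only up to an arbitrary scale).
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows)

    # The least-squares solution of A p = 0 with ||p|| = 1 is the right
    # singular vector associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    P = vt[-1].reshape(3, 4)
    return P / np.linalg.norm(P)  # fix the arbitrary scale
```

Once a projection matrix is available for each camera, the intrinsic matrix and pose can be factored out of it (for example by an RQ decomposition of its left 3x3 block), which is the standard route to the rotation and translation relating the two cameras.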

Table of Contents:
論文摘要 (Chinese Abstract)
Abstract
Contents
List of Figures
List of Tables
Chapter 1. Introduction
Chapter 2. Basic Camera Calibration
  2.1 Camera intrinsic parameters
  2.2 Camera extrinsic parameters
Chapter 3. Proposed Method
  3.1 DLT camera calibration
  3.2 Stereo calibration
  3.3 Pixel mapping
Chapter 4. Experiments and Results
  4.1 Pre-work
  4.2 Evaluations
  4.3 Proposed method calibration results
    4.3.1 Calibration results with different board sizes
    4.3.2 Calibration results with different square sizes
    4.3.3 Comparison
  4.4 Overview of other calibration methods
  4.5 Discussion
Chapter 5. Conclusions and Future Work
References
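The pixel-mapping step listed above (Section 3.3) corresponds to the alignment of depth frames to RGB frames mentioned at the end of the abstract. A minimal sketch of that mapping, assuming 3x3 intrinsic matrices K_depth and K_rgb and an estimated rotation R and translation t from the depth camera to the RGB camera (all variable names here are illustrative, not the thesis's own):

```python
import numpy as np

def map_depth_pixel_to_rgb(u_d, v_d, depth, K_depth, K_rgb, R, t):
    """Map a depth-image pixel (u_d, v_d) with metric depth value `depth`
    into the RGB image, given both 3x3 intrinsic matrices and the
    depth-to-RGB extrinsics (R, t). Returns the RGB pixel coordinates."""
    # Back-project the pixel to a 3D point in the depth-camera frame.
    p_depth = depth * (np.linalg.inv(K_depth) @ np.array([u_d, v_d, 1.0]))
    # Bring the point into the RGB-camera frame.
    p_rgb = R @ p_depth + t
    # Project onto the RGB image plane and dehomogenize.
    u, v, w = K_rgb @ p_rgb
    return u / w, v / w
```

Applying this mapping to every pixel of a depth frame produces depth values registered to the RGB image grid, which is the alignment the calibration is meant to enable.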

Full text available on the campus network from 2020/07/30. Full text not authorized for public access off campus or via the National Central Library (Taiwan NDLTD system).