
Author: Yin-Xi Huang (黃胤璽)
Thesis Title: Development of Binocular Vision Control of Ball Batting Robot by Digital Signal Processor on Embedded System
Advisor: Shih-Hsuan Chiu (邱士軒)
Committee: Hsien-Tang Chiu (邱顯堂), Chang-Chiun Huang (黃昌群), Chyi-Yeu Lin (林其禹), Che-yen Wen (溫哲彥)
Degree: Master
Department: School of Engineering - Department of Materials Science and Engineering
Publication Year: 2013
Graduation Academic Year: 101 (ROC calendar, 2012-2013)
Language: English
Pages: 70
Keywords (Chinese): image servo, digital signal processor, embedded
Keywords (English): Image Servo System, DSP, Embedded System
Hits: 330; Downloads: 0
    Most image servo systems have been developed on personal computers, which makes them costly, bulky, inefficient, and ill-suited to standalone operation. With advances in technology, however, embedded digital signal processors (DSPs) have become far more capable: they offer higher accuracy and performance in complex computation and can meet the real-time requirements of most systems. They can therefore replace the PC as the basis of an image servo system.
    This thesis uses a TMS320DM648 as the image-processing platform and a TMS320F2812 as the robot-arm control core. The DM648 runs the vision system, computing the spatial position of a moving object and predicting its trajectory; the F2812 then receives control commands over UART, solves the inverse kinematics for the required joint rotation angles, and drives the arm to respond accurately.


    Many image servo systems have been developed on PC-based platforms, which leads to high system cost, low efficiency, large hardware size, and unsuitability for standalone operation. In recent years, advances in technology have increased the processing speed of the digital signal processor (DSP) and expanded its functionality. In an embedded system, a DSP delivers high accuracy and high performance in complex arithmetic, so it can satisfy most systems' real-time requirements and replace the PC in a visual servo system.
    In this study, a TMS320DM648 and a TMS320F2812 are utilized. The DM648 serves as the vision system, computing the position of a moving object and predicting its trajectory. The F2812 serves as the robot control core and operates in the following sequence: first, the control commands are sent over UART to the robot control kernel; second, the joint rotation angles are obtained by inverse kinematics; finally, the robot arm responds to the target, i.e., hits the ball.

    Abstract (Chinese)
    Abstract
    Acknowledgements
    Contents
    Figures Index
    Tables Index
    CHAPTER 1 Introduction
      1.1 Introduction
      1.2 Literature Review
      1.3 Research Motivation and Purpose
      1.4 Thesis Structure
    CHAPTER 2 System Architecture
      2.1 Summary of DSP Embedded System Development
      2.2 Hardware Framework
      2.3 TMS320DM648
      2.4 TMS320F2812
      2.5 Three-axis Robot Arm
    CHAPTER 3 Proposed Method
      3.1 Image Processing Subsystem
        3.1.1 Color Space
        3.1.2 Camera Calibration
        3.1.3 Codebook
        3.1.4 Center Point Calculation
        3.1.5 Binocular Vision System
        3.1.6 Image Algorithm Performance Assessment
      3.2 Motion Control Subsystem
        3.2.1 Robot Kinematics
        3.2.2 Trajectory Planning
        3.2.3 Double Buffer System
        3.2.4 Robot Arm Control Performance Assessment
      3.3 Communication Module
    CHAPTER 4 System Implementation and Performance
      4.1 Image Servo Batting System Experiment Profile
      4.2 Image Servo Batting System Experimental Environment
      4.3 Trajectory Prediction
      4.4 Tactics
      4.5 The Experimental Result
    CHAPTER 5 Conclusions and Future Work
    References


    Full Text Release Date: 2018/07/30 (campus network)
    Full Text Release Date: not authorized for public access (off-campus network)
    Full Text Release Date: not authorized for public access (National Central Library: Taiwan NDLTD system)