
Student: Yu-Sheng Zeng (曾郁升)
Thesis Title: Visual Servoing of Automatic Alignment System Using Model Predictive Control (應用模型預測控制於自動對位系統之影像伺服研究)
Advisor: Chi-Ying Lin (林紀穎)
Committee Members: Liang-Kuang Chen (陳亮光), Chia-Jui Chiang (姜嘉瑞)
Degree: Master
Department: College of Engineering, Department of Mechanical Engineering
Year of Publication: 2013
Graduation Academic Year: 101 (2012-2013)
Language: Chinese
Number of Pages: 114
Chinese Keywords: 自動對位, 影像伺服, 模型預測控制, 攝影機模型, 限制處理
English Keywords: Automatic alignment, visual servoing, model predictive control, camera model, constraint handling
    Automatic alignment using visual servoing, a technique that combines machine vision and motion control, is a key technology in the precision measurement and manufacturing industries. Because its system assembly and technology integration are straightforward, industry commonly adopts position-error-feedback-based visual servoing architectures for precision positioning; however, inaccurate calibration causes positioning errors to accumulate. Many researchers have therefore developed image-plane-error-based visual servoing architectures to address this problem, but derived issues such as a small workspace and singularities bring their own application difficulties. With microprocessor performance continually improving and performance requirements growing ever stricter, deriving visual servo control laws with advanced control methods capable of constraint handling, such as model predictive control, has become a well-regarded research topic in automation. This thesis investigates the feasibility and application potential of model predictive controllers for visual servoing in automatic alignment systems. Taking the peg-and-hole alignment and assembly task common in optomechatronic systems as an example, it first derives a mathematical model combining the camera and the system dynamics, and from this model builds the predicted output and cost function required for model predictive control design. Image-side constraints are incorporated into the cost function at the design stage, and a quadratic programming problem is solved at every sampling period to obtain the optimal control command for each motor axis. Two markers of different colors represent the peg and the hole: one marker is fixed in the spatial coordinate frame while the other is placed on the stage, and the designed control law drives them into alignment. Simulations and experiments show that the proposed vision-based model predictive control not only achieves precise positioning but also improves overall system performance through constraint handling.
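The modeling steps described above (stage and camera dynamics lifted into a predicted output and a quadratic cost, minimized over a receding horizon) can be sketched as follows. The model matrices, horizons, and weights below are illustrative assumptions, not the thesis's identified values:

```python
import numpy as np

# Hypothetical discrete-time model of one stage axis observed through the
# camera (state = [pixel position, pixel velocity]); numbers are invented
# for illustration, not taken from the thesis's system identification.
A = np.array([[1.0, 0.01],
              [0.0, 0.95]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])   # measured output: image-plane position (pixels)

Np, Nc = 10, 4               # prediction and control horizons (assumed)

def prediction_matrices(A, B, C, Np, Nc):
    """Build F and Phi so that the stacked prediction is Y = F x0 + Phi U."""
    F = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(1, Np + 1)])
    Phi = np.zeros((Np, Nc))
    for i in range(Np):
        for j in range(min(i + 1, Nc)):
            Phi[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()
    return F, Phi

F, Phi = prediction_matrices(A, B, C, Np, Nc)

# Quadratic cost J = ||R - Y||^2 + lam * ||U||^2; its unconstrained
# minimizer is the solution of (Phi'Phi + lam I) U = Phi'(R - F x0).
x0 = np.array([[50.0], [0.0]])   # 50-pixel alignment error
R = np.zeros((Np, 1))            # drive the image-plane error to zero
lam = 0.1
U = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Nc), Phi.T @ (R - F @ x0))
u0 = U[0, 0]                     # receding horizon: apply the first move only
```

With a positive pixel error, the first control move is negative, pushing the marker back toward the reference.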


    As a key technology combining machine vision and motion control, visual servoing has been widely applied to precise alignment tasks in precision measurement and manufacturing. Because of its easy integration and quick assembly, position-based visual servoing, which reduces the position error fed back after image processing, is a well-known solution among industrial practitioners. Because position-based visual servoing is sensitive to camera calibration errors, image-based visual servoing, which performs the tracking task in the image plane, has become an effective alternative for researchers. However, image-based visual servoing suffers from a limited workspace and singularities, motivating the use of advanced control methods to achieve precise positioning. This thesis develops a visual servoing method for automatic alignment systems using model predictive control, a practical optimal control strategy from the process industries that can handle constraints. For illustration purposes, this study adopts a micro peg-and-hole assembly system commonly seen in optomechatronic systems. The geometric relationship between two image feature points defining the target and moving positions, together with a system model containing the stage dynamics and camera parameters, is first derived to build the prediction model and cost function for the optimization problem. At each sampling step, the control command is obtained by solving a quadratic programming problem. Simulations and experiments provide preliminary results confirming that the method is feasible for automatic alignment applications.
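The quadratic program solved at each sampling step can be illustrated with a minimal box-constrained solver. Projected gradient descent stands in here for a full QP solver, and the cost matrices are invented for illustration:

```python
import numpy as np

def solve_box_qp(H, f, u_min, u_max, iters=500):
    """Minimize 0.5 u'Hu + f'u subject to u_min <= u <= u_max using
    projected gradient descent (step size from the largest eigenvalue);
    a stand-in for the QP solver run at each sampling period."""
    u = np.zeros_like(f)
    step = 1.0 / np.linalg.eigvalsh(H).max()
    for _ in range(iters):
        u = np.clip(u - step * (H @ u + f), u_min, u_max)
    return u

# Illustrative 2-axis problem: H and f would come from the prediction
# matrices and the current image-plane error; the values are assumptions.
H = np.array([[2.0, 0.3],
              [0.3, 1.5]])
f = np.array([-4.0, 3.0])
u = solve_box_qp(H, f, u_min=-1.0, u_max=1.0)
# The unconstrained minimum lies outside the box, so the returned
# command saturates at the actuator limits instead of exceeding them.
```

This saturation behavior is the point of constraint handling: the optimizer respects actuator limits inside the optimization rather than clipping the command afterwards.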

    Table of Contents

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Nomenclature
    Chapter 1  Introduction
    Chapter 2  System Setup
      2.1 Multi-axis alignment stage
      2.2 Camera
      2.3 Motors and drivers
      2.4 Experimental hardware
      2.5 Image processing flow
    Chapter 3  Design of the Vision-Based Model Predictive Controller
      3.1 Camera imaging principle and coordinate transformation
      3.2 Vision-based model predictive controller (VMPC) design
        3.2.1 Stage dynamic model
        3.2.2 Camera model
        3.2.3 Geometric relationships in different quadrants
        3.2.4 Dynamic model of the visual alignment system
        3.2.5 Linearization
        3.2.6 Formulation of the vision-based model predictive controller
      3.3 Vision-based model predictive controller with a state estimator
      3.4 Vision-based model predictive controller with integral control
      3.5 Parameter tuning with the Nonlinear Control Design (NCD) toolbox
    Chapter 4  Image Processing
      4.1 Color space conversion
      4.2 Gaussian filtering
      4.3 Grayscale conversion and binarization
      4.4 Image morphology
      4.5 Centroid computation
      4.6 Camera calibration
    Chapter 5  Simulation and Experimental Results
      5.1 System identification
      5.2 Motor model verification
      5.3 Image model verification
      5.4 Discussion of simulation and experimental results
        5.4.1 VMPC simulation and experimental results
        5.4.2 VMPC with input constraints: simulation and experimental results
        5.4.3 VMPC dynamic positioning: simulation and experimental results
        5.4.4 IVMPC positioning: simulation and experimental results
        5.4.5 VMPC alignment with state constraints: simulation and experimental results
    Chapter 6  Conclusions and Future Work
      6.1 Conclusions
      6.2 Future research directions
    References
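Chapter 4's marker-detection pipeline (binarization of the colored marker followed by centroid computation) can be sketched with NumPy alone; Gaussian filtering and morphology are omitted for brevity, and the synthetic image is an assumption:

```python
import numpy as np

def marker_centroid(gray, thresh):
    """Binarize a grayscale image and return the centroid (row, col) of
    the bright region, mirroring the thesis's binarization and centroid
    steps (Gaussian filtering and morphological cleanup omitted)."""
    binary = (gray > thresh).astype(float)
    total = binary.sum()
    if total == 0:
        return None          # no marker pixels above the threshold
    rows, cols = np.indices(binary.shape)
    return (rows * binary).sum() / total, (cols * binary).sum() / total

# Synthetic 100x100 image with a bright square marker centered at (40, 60);
# in the real system this would be one color channel after conversion.
img = np.zeros((100, 100))
img[35:46, 55:66] = 255.0
cy, cx = marker_centroid(img, thresh=128)
```

The two marker centroids (peg and hole) obtained this way provide the image-plane error that the controller drives to zero.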

