
Graduate Student: QUAN NGOC NGUYEN
Thesis Title: Study on Human Tracking and Following Using an RGB-D Camera for Mobile Robots
Advisor: Shun-Feng Su (蘇順豐)
Committee Members: Wei-Yen Wang (王偉彥), Yih-Guang Leu (呂藝光), Sheng-Dong Xu (徐勝均)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Electrical Engineering
Year of Publication: 2017
Graduation Academic Year: 105 (ROC calendar)
Language: English
Number of Pages: 79
Keywords: Human tracking, RGB-D camera, Depth of Interest, CAM-shift, real time, Fuzzy Logic Controller, Kalman filter
Views: 260 / Downloads: 3
This study addresses human following for a mobile robot platform. Using an RGB-D camera, a novel method is proposed for tracking a human target under strong interference from the environment. The human target is detected automatically with the Histogram of Oriented Gradients (HOG) method. The idea of an extracted Depth of Interest (DOI) is then proposed to work together with a combination of the CAM-shift algorithm and a Kalman filter in the human tracking phase. The proposed DOI performs well under similar-color interference and partial occlusion because it removes any background lying outside a given depth range; a minimal sketch of this detection and background-removal stage is given below.
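The following sketch illustrates the detection and DOI idea, assuming OpenCV's stock HOG people detector and a depth map registered to the color image. The depth band (kDepthBand), the way the target's depth is sampled, and the zero-filled placeholder depth matrix are illustrative assumptions, not the thesis' actual values or data path.

    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
        cv::VideoCapture cap(0);                  // color stream (e.g. Kinect RGB)
        cv::HOGDescriptor hog;
        hog.setSVMDetector(cv::HOGDescriptor::getDefaultPeopleDetector());

        cv::Mat frame;
        while (cap.read(frame)) {
            // Detect people with the stock HOG pedestrian detector.
            std::vector<cv::Rect> people;
            hog.detectMultiScale(frame, people, 0, cv::Size(8, 8),
                                 cv::Size(32, 32), 1.05, 2);
            if (people.empty()) continue;
            cv::Rect target = people[0];          // first detection as the target

            // DOI: keep only pixels whose depth lies in a band around the
            // target's depth. A real system reads a registered depth map;
            // this zero-filled matrix is only a placeholder.
            cv::Mat depth(frame.size(), CV_16UC1, cv::Scalar(0));
            const int kDepthBand = 400;           // half-width in mm, illustrative
            ushort d = depth.at<ushort>(target.y + target.height / 2,
                                        target.x + target.width / 2);
            cv::Mat doiMask;
            cv::inRange(depth, cv::Scalar(d - kDepthBand),
                        cv::Scalar(d + kDepthBand), doiMask);
            cv::Mat foreground;
            frame.copyTo(foreground, doiMask);    // background removed
            cv::rectangle(foreground, target, cv::Scalar(0, 255, 0), 2);
            cv::imshow("DOI", foreground);
            if (cv::waitKey(1) == 27) break;      // Esc to quit
        }
        return 0;
    }

The masked frame would then feed the CAM-shift/Kalman tracking stage described above, with CAM-shift operating on the background-free color histogram.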
Under full occlusion, however, the tracking rate of this approach is only 85%. This motivates a new handler for the tracking phase when full occlusion occurs. In the proposed handler, corner features are detected automatically by the Harris corner detector and treated as good features of the target. These features are tracked in the next frame with the Lucas-Kanade algorithm, and the collected feature data are compared against the handler's conditions. The STATE_POINT, which holds the correction point of the Kalman filter, is then set to the saved position of the target. With this handler, the tracking rate increases to 98%. The average frame rate is approximately 24 fps, fast enough for real-time use. A sketch of the corner-tracking step follows this paragraph.
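This is a minimal sketch of Harris-scored corner features tracked with OpenCV's pyramidal Lucas-Kanade tracker. The survival-ratio test used here as the full-occlusion cue is an illustrative stand-in for the handler's actual conditions (Section 5.2.4), and corners are taken from the whole frame rather than the target patch for brevity.

    #include <opencv2/opencv.hpp>
    #include <vector>

    int main() {
        cv::VideoCapture cap(0);
        cv::Mat frame, gray, prevGray;
        if (!cap.read(frame)) return 1;
        cv::cvtColor(frame, prevGray, cv::COLOR_BGR2GRAY);

        // Harris-scored corners as the "good features" of the target.
        std::vector<cv::Point2f> prevPts, nextPts;
        cv::goodFeaturesToTrack(prevGray, prevPts, 100, 0.01, 10,
                                cv::noArray(), 3,
                                /*useHarrisDetector=*/true, 0.04);

        while (cap.read(frame) && !prevPts.empty()) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            std::vector<uchar> status;
            std::vector<float> err;
            // Track each corner into the new frame (pyramidal Lucas-Kanade).
            cv::calcOpticalFlowPyrLK(prevGray, gray, prevPts, nextPts,
                                     status, err);

            std::vector<cv::Point2f> kept;
            for (size_t i = 0; i < nextPts.size(); ++i)
                if (status[i]) kept.push_back(nextPts[i]);

            // Illustrative cue: if most features vanish at once, assume full
            // occlusion; the handler would then pin the Kalman filter's
            // correction point (STATE_POINT) to the saved target position.
            bool fullOcclusion = kept.size() < prevPts.size() / 4;
            if (fullOcclusion) { /* freeze STATE_POINT here */ }

            prevGray = gray.clone();
            prevPts = kept;
        }
        return 0;
    }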
In the human following phase, two simple fuzzy logic controllers are implemented to control the rotational and the translational movements of the mobile robot. In addition, a simple fuzzy logic controller for avoiding obstacles is also introduced. A sketch of such a controller for the rotational movement is given after this paragraph.
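To make the control idea concrete, here is a minimal sketch of a singleton-consequent (Sugeno-style) fuzzy controller mapping the target's normalized horizontal offset in the image to a rotational velocity. The membership functions, rule consequents, and output range are illustrative choices, not the thesis' rule base.

    #include <cstdio>

    // Triangular membership function with feet at a and c and peak at b.
    static double tri(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0.0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    // offset: normalized horizontal error in [-1, 1]
    // (negative = target left of center, positive = right).
    double fuzzyRotation(double offset) {
        // Fuzzification into three sets: Left, Center, Right.
        double muL = tri(offset, -2.0, -1.0, 0.0);
        double muC = tri(offset, -0.5,  0.0, 0.5);
        double muR = tri(offset,  0.0,  1.0, 2.0);
        // Rules with singleton consequents: Left -> turn left,
        // Center -> hold, Right -> turn right.
        const double wL = -0.6, wC = 0.0, wR = 0.6;  // rad/s, illustrative
        double num = muL * wL + muC * wC + muR * wR;
        double den = muL + muC + muR;
        return den > 0.0 ? num / den : 0.0;  // weighted-average defuzzification
    }

    int main() {
        for (double off : {-1.0, -0.3, 0.0, 0.4, 1.0})
            std::printf("offset %+.1f -> rotation %+.3f rad/s\n",
                        off, fuzzyRotation(off));
        return 0;
    }

The translational controller would have the same shape with the target's depth error as input; the obstacle-avoidance controller would take sonar or depth readings instead.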



ACKNOWLEDGMENTS
ABSTRACT
Chapter 1: INTRODUCTION
1.1. Background and Motivation
1.2. Thesis Contribution
1.3. Thesis Organization
Chapter 2: RELATED WORK
2.1. Human Tracking in Real-Time Applications
2.2. Human Following Application in Mobile Robot
Chapter 3: RELATED MATERIALS
3.1. An RGB-D Camera
3.1.1. Introduction to RGB-D Camera Technologies
3.1.2. Microsoft Kinect v1 Specification
3.1.3. Generate 3D Coordinates
3.1.4. Calibration of Depth and Color Cameras
3.2. Mobile Robot - Pioneer P3DX Model
3.2.1. Introduction to P3DX Model
3.2.2. Specification of P3DX Model
3.3. Integrated Development Environment (IDE)
Chapter 4: OVERVIEW OF SYSTEM
4.1. Hardware Architecture
4.2. Programming Environment of the System
4.3. Overview of Methodology
Chapter 5: HUMAN TRACKING PHASE
5.1. Proposed Human Tracking Method
5.1.1. Algorithm for Human Tracking Process
5.1.2. Removing Background Method
5.1.3. Identifying the Specific Human Target
5.1.4. Target Tracking Modeling
5.1.5. Extracting the Depth of Interest
5.1.6. CAM-shift Algorithm
5.1.7. Kalman Filter
5.2. Proposed Handler for Full Occlusion
5.2.1. Algorithm of Full Occlusion Handler
5.2.2. Automatically Detect the Corners
5.2.3. Tracking Features Using Lucas-Kanade Algorithm
5.2.4. Full Occlusion Handler Conditions
Chapter 6: HUMAN FOLLOWING PHASE
6.1. Human Following Method
6.2. Fuzzy Logic Controller
6.3. FLC for Rotational Movement
6.4. FLC for Translational Movement
6.5. Avoiding Obstacles Fuzzy Logic Controller
Chapter 7: EXPERIMENT RESULT
7.1. Results of Human Tracking Phase
7.2. Results of Kalman Filter Implementation
7.3. Results of Proposed Full Occlusion Handler
7.4. Results of Frame per Second Value
7.5. Results of Human Following Phase
7.6. Results of Avoiding Obstacles FLC
Chapter 8: CONCLUSIONS AND FUTURE WORKS
8.1. Conclusions
8.2. Future Works
REFERENCES

