
Author: 陳富強 (Richard Tondowidjojo)
Thesis Title: Kinect Based Real-Time Motion Comparison with Retargeting and Color-Code Feedback
Advisor: 楊傳凱 (Chuan-Kai Yang)
Committee Members: 賴源正 (Yu-Chi Lai), 姚智原 (Chih-Yuan Yao)
Degree: Master
Department: Department of Information Management, School of Management
Publication Year: 2016
Graduation Academic Year: 104
Language: English
Number of Pages: 53
Keywords: Kinect V2, Open-End Dynamic Time Warping, Retargeting, Real-Time, Motion Comparison, Color-Coded Feedback
Hits: 229 views, 0 downloads
Abstract: In this thesis, we propose a system to evaluate a user's performance in a real-time manner. By utilizing the ability of the Kinect V2 sensor to capture human motion, the system records a professional user's motion. This recording can serve as a reference motion for novice users to practice with, so the presence of the professional is not needed. The novice user can later replay the recording inside a 3-dimensional environment and follow along in order to practice the motion. In this 3-D environment, the system presents the motion through a 3-D character that is retargeted to the novice's body, allowing the novice to view the motion on a 3-D model whose size matches his/her own body proportions. The system also provides feedback by changing the color of the model to indicate the correctness of the user's motion, so the feedback can easily be recognized by the novice, making motion learning more effective.
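According to the keywords and the chapter list, the real-time comparison is based on Open-End Dynamic Time Warping and the result is presented as a color-coded 3-D model. The sketch below is a minimal, generic illustration of those two ideas only, not the thesis's actual implementation: the frame format (a list of 3-D joint positions), the joint-distance measure, the threshold values, and the color names are all illustrative assumptions.

```python
import math

def joint_distance(frame_a, frame_b):
    """Sum of Euclidean distances over corresponding joints of two skeleton frames."""
    return sum(math.dist(a, b) for a, b in zip(frame_a, frame_b))

def open_end_dtw(reference, live):
    """Open-End DTW: align the live motion captured so far against any prefix
    of the reference motion, so matching can happen while the motion is still
    in progress.  Returns (cost, end_index), where end_index is the reference
    frame at which the best alignment ends."""
    n, m = len(reference), len(live)
    INF = float("inf")
    # dp[i][j] = cost of aligning reference[:i] with live[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = joint_distance(reference[i - 1], live[j - 1])
            dp[i][j] = d + min(dp[i - 1][j],      # reference frame held
                               dp[i][j - 1],      # live frame held
                               dp[i - 1][j - 1])  # both advance
    # Open end: the live sequence is fully consumed (column m), but the
    # alignment may stop at any reference frame i.
    end = min(range(1, n + 1), key=lambda i: dp[i][m])
    return dp[end][m], end

def feedback_color(error, good=0.15, warn=0.35):
    """Map a joint (or frame) error to a color for the retargeted 3-D model.
    Threshold values here are placeholders, not the thesis's actual settings."""
    if error <= good:
        return "green"   # motion closely matches the reference
    if error <= warn:
        return "yellow"  # noticeable deviation
    return "red"         # large deviation
```

In a real-time setting, the comparison would be re-run (or updated incrementally) each time the Kinect delivers a new skeleton frame, and the per-joint errors along the best alignment would drive the coloring of the retargeted model.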

Table of Contents:
Abstract i
Acknowledgement ii
Contents iii
List of Figures v
List of Tables vii
Chapter 1 Introduction 1
  1.1 Motivation 1
  1.2 Contribution 2
  1.3 Thesis Organization 3
Chapter 2 Related Works 4
  2.1 Motion Retargeting 4
  2.2 Dynamic Time Warping 5
  2.3 Motion Comparison 7
Chapter 3 System Architecture 9
  3.1 Hardware Architecture 9
  3.2 Software Architecture 12
Chapter 4 Proposed Method 17
  4.1 Motion Retargeting 17
  4.2 Motion Signal Comparison 19
  4.3 Multi-Thread Processing 23
  4.4 Color-Coded Feedback 24
Chapter 5 Experiments and Results 28
  5.1 Hardware Specification 28
  5.2 Accuracy Test 30
  5.3 Speed Difference Result 36
  5.4 Usability Test 42
Chapter 6 Conclusions and Future Work 49
  6.1 Conclusion 49
  6.2 Future Work 50
References 51

