
Author: Yi-Fang Lee (李沂芳)
Thesis title: A Posture-Classification-Based Estimation System for Energy Expenditure of Physical Activities Captured by Depth Cameras
Advisor: Chin-Shyurng Fahn (范欽雄)
Committee members: Jiann-Der Lee (李建德), Jun-Wei Hsieh (謝君偉), Yi-Ling Chen (陳怡伶)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of publication: 2019
Academic year of graduation: 107 (2018-2019)
Language: English
Pages: 58
Keywords: depth camera, energy expenditure, physical activity, posture classification, multilayer perceptron, convolutional neural network


    Knowing the energy expenditure of different physical activities helps people make proper exercise plans. In this thesis, we develop a system that estimates the energy expenditure of physical activities captured by depth cameras. With this system, users need not wear any devices while working out. The system first classifies the postures of standing, walking, and running, and then estimates energy expenditure according to the classified posture.
    In this study, 21 subjects were recruited to participate in energy expenditure experiments on physical activities. The subjects' skeletal data acquired by the depth cameras are used to classify the different postures, and their metabolic equivalents obtained from a gas analyzer serve as the ground truth for building the estimation system. To estimate energy expenditure accurately, this study sets up three depth cameras: at the side, the rear right, and the rear of the subject. The energy expenditure of the standing, walking, and running postures is then estimated by various regression models combined with the data captured by the different cameras, and the results are compared.
    Over many experiments, the multilayer perceptron (MLP) proved the most suitable predictive model for estimating the energy expenditure of a combined physical activity; the lowest mean absolute error (MAE), 0.54, is obtained by an MLP with the side-camera data. When each of the three physical activities (standing, walking, and running) is estimated individually by our proposed system, the MLP with rear-camera data gives the best result for the running posture (MAE = 0.65), while a convolutional neural network with rear-camera data gives the best results for the standing posture (MAE = 0.14) and the walking posture (MAE = 0.46). According to the experimental results, our study provides a reliable system that allows subjects to monitor their energy expenditure during exercise in the most convenient and simple way.
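
    The estimation quality above is reported as the mean absolute error between measured and estimated metabolic equivalents (METs). As a minimal sketch of this metric (with hypothetical numbers, not the thesis data):

    ```python
    def mean_absolute_error(y_true, y_pred):
        """Average absolute difference between measured and estimated values."""
        if len(y_true) != len(y_pred):
            raise ValueError("sequences must have equal length")
        return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

    # Hypothetical measured vs. estimated METs for four activity samples
    measured = [1.2, 3.5, 7.8, 8.1]
    estimated = [1.3, 3.1, 7.2, 8.6]
    print(mean_absolute_error(measured, estimated))  # about 0.4
    ```

    An MAE of 0.54 MET (the combined-activity result above) thus means the estimate deviates from the gas-analyzer measurement by about half a metabolic equivalent on average.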

    Chinese Abstract
    Abstract
    Acknowledgments
    List of Figures
    List of Tables
    Chapter 1 Introduction
      1.1 Overview
      1.2 Motivation
      1.3 System Description
      1.4 Organization of Thesis
    Chapter 2 Related Work
      2.1 Different Types of Physical Activities Data Collection
        2.1.1 Heart Rate Monitor
        2.1.2 Accelerometer
        2.1.3 Kinect Sensor
      2.2 The Effect of Kinect Sensor
        2.2.1 3D Imaging
        2.2.2 Skeletal Tracking
    Chapter 3 Physical Activities Data Preprocessing
      3.1 Data Acquisition
      3.2 Feature Extraction
      3.3 Posture Classification
        3.3.1 Principal Component Analysis
        3.3.2 Support Vector Machines
    Chapter 4 Energy Expenditure Estimation
      4.1 Linear Regression Model
      4.2 Multilayer Perceptron Model
      4.3 Convolutional Neural Network Model
    Chapter 5 Experimental Results and Discussions
      5.1 Experimental Setup
      5.2 The Results of Feature Extraction and Selection
      5.3 The Results of Energy Expenditure Estimation
        5.3.1 Test on Linear Regression Model
        5.3.2 Test on Multilayer Perceptron Model
        5.3.3 Test on Convolutional Neural Network Model
        5.3.4 The Comparison of Three Models
    Chapter 6 Conclusions and Future Work
      6.1 Conclusions
      6.2 Future Work
    References


    Full text release date: 2024/07/21 (campus network)
    Full text release date: 2024/07/21 (off-campus network)
    Full text release date: 2029/07/21 (National Central Library: Taiwan NDLTD system)