Author: Shi-Feng Chen (陳士峰)
Thesis Title: Human Motion Recognition Based on 24GHz Hybrid Radar Systems (24GHz混合雷達系統之人體動作辨識)
Advisor: Wei-Mei Chen (陳維美)
Committee Members: Wei-Mei Chen (陳維美), Chang Hong Lin (林昌鴻), Yuan-Hsiang Lin (林淵翔), Shanq-Jang Ruan (阮聖彰)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Electronic and Computer Engineering
Publication Year: 2021
Graduation Academic Year: 109 (2020-2021)
Language: Chinese
Pages: 61
Keywords: human motion recognition, Doppler radar, support vector machine, continuous wave
    This thesis uses a 24 GHz Doppler radar that provides both continuous-wave (CW) and frequency-modulated continuous-wave (FMCW) operation. The CW mode transmits a signal at a fixed frequency, so it mainly detects the velocity of a motion through the rate of change of the received frequency; the FMCW mode varies the frequency of the transmitted signal over time and measures the frequency difference between the received and transmitted signals to determine target range. The relationship between the transmitted and received frequencies yields not only the target range but also the target's radial velocity. Combining these two modes, the system collects the characteristics of common human motions in a non-contact manner and feeds them to a support vector machine (SVM) to recognize seven common motions: falling, jumping in place, running, sitting down, bending over, squatting, and walking on crutches. The collected data are used to train the classifier, the relevant parameters are tuned to establish the final model, and human motion recognition experiments are conducted and analyzed. The experimental results show that the average recognition accuracy over the seven human motions reaches 96%, indicating that the proposed hybrid radar system has good capability for human motion recognition.
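    For reference, the textbook relations behind the two radar modes described above can be made explicit. The following is a sketch from standard radar theory, with the symbols (carrier frequency f_c, radial velocity v, chirp bandwidth B, chirp duration T_c, beat frequency f_b) introduced here for illustration rather than quoted from the thesis.

    % CW mode: a target moving with radial velocity v shifts the fixed
    % carrier f_c by the Doppler frequency f_d. At f_c = 24 GHz the
    % wavelength is lambda = c/f_c ~ 12.5 mm, so 1 m/s gives f_d = 160 Hz.
    \[
      f_d = \frac{2 v f_c}{c} = \frac{2v}{\lambda}
    \]
    % FMCW mode: a chirp of bandwidth B and duration T_c reflected by a
    % target at range R returns after a delay 2R/c, and the resulting beat
    % frequency f_b between transmitted and received signals encodes range:
    \[
      f_b = \frac{2RB}{c\,T_c}
      \quad\Longrightarrow\quad
      R = \frac{c\,T_c\,f_b}{2B}
    \]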


    This thesis focuses on devising a 24 GHz hybrid radar system for human motion recognition based on continuous-wave (CW) radar and frequency-modulated continuous-wave (FMCW) radar. The distance and velocity of targets can be estimated from the relationship between the transmitted frequency and the received frequency of the 24 GHz hybrid radar. The hybrid radar system first collects the characteristics of everyday human activities in a non-contact manner and feeds them to an algorithm based on the support vector machine to identify seven common human movements of daily life: falling, jumping in place, running, sitting, bending, squatting, and walking on crutches. Through analysis and evaluation of the experimental results, the system reaches an average accuracy of 96% across the seven kinds of human motion, indicating that the proposed hybrid radar system offers good recognition ability for practical applications.
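    The classification stage can be sketched as follows, assuming radar features have already been extracted into a matrix X with one row per recorded motion sample and integer labels 0-6 for the seven motions. The RBF kernel and the C/gamma grid below are illustrative assumptions standing in for the parameter tuning the abstract describes, not the thesis's reported settings.

    # Minimal sketch of the SVM classification stage (assumed setup, not
    # the thesis's exact pipeline). X: (n_samples, n_features) array of
    # extracted radar features; y: labels 0..6 for the seven motions.
    import numpy as np
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    MOTIONS = ["fall", "jump in place", "run", "sit",
               "bend", "squat", "walk on crutches"]

    def train_motion_classifier(X: np.ndarray, y: np.ndarray):
        """Grid-search an RBF-kernel SVM over C and gamma, then report
        held-out accuracy on the seven-motion recognition task."""
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.2, stratify=y, random_state=0)
        # Standardize features, then classify; the grid values are guesses.
        pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        grid = GridSearchCV(pipe,
                            param_grid={"svc__C": [1, 10, 100],
                                        "svc__gamma": ["scale", 0.01, 0.1]},
                            cv=5)
        grid.fit(X_tr, y_tr)
        print("best params:", grid.best_params_)
        print("test accuracy:", grid.score(X_te, y_te))
        return grid.best_estimator_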

    Abstract (Chinese)
    Abstract
    Acknowledgements
    Table of Contents
    List of Figures
    List of Tables
    1 Introduction
      1.1 Research Background
      1.2 Research Motivation
      1.3 Thesis Organization
    2 Literature Review
      2.1 Research Related to Human Motion Recognition
        2.1.1 Contact-Based Human Motion Recognition
        2.1.2 Camera-Based Non-Contact Human Motion Recognition
        2.1.3 Radar-Based Non-Contact Human Motion Recognition
      2.2 Introduction to Radar Systems
      2.3 Introduction to Support Vector Machines (SVM)
    3 Methodology
      3.1 Problem Description
      3.2 Radar System Hardware Architecture
      3.3 Data Collection Method for Radar Motion Recognition
      3.4 Feature Extraction Method
      3.5 SVM-Based Classification Algorithm for Human Motion Recognition
    4 Experimental Results and Analysis
      4.1 Experimental Environment
      4.2 Experimental Data and Settings
        4.2.1 Selection of Human Motions for Recognition
      4.3 Analysis of Experimental Results
        4.3.1 Accuracy Comparison under Optimal Parameter Tuning
        4.3.2 Accuracy Comparison between Continuous-Wave Radar and Hybrid Radar
        4.3.3 Confusion Matrices for the Seven Individual Motions
        4.3.4 Recognition Rates of the Seven Human Motions
    5 Conclusion
    References


    Full text available from 2026/08/27 (campus network)
    Full text not authorized for release (off-campus network)
    Full text not authorized for release (National Central Library: Taiwan NDLTD system)