
Author: Yu-Ming Lin (林昱明)
Title: Design and Implementation of a Hand Gesture Recognition System Based on Range-Doppler Map (基於距離-督普勒圖之手勢辨識系統設計與實作)
Advisor: Jenq-Shiou Leu (呂政修)
Committee members: 陳維美, 林昌鴻, 許德俊
Degree: Master
Department: Department of Electronic and Computer Engineering (電資學院)
Year of publication: 2021
Graduation academic year: 109
Language: English
Pages: 41
Chinese keywords: 手勢辨識 (hand gesture recognition), 頻率調變連續波雷達 (FMCW radar), 距離-督普勒圖 (range-Doppler map), 深度學習 (deep learning)
English keywords: hand gesture recognition, FMCW radar sensor, range-Doppler map, deep learning
There has been much prior research on hand gesture recognition for human-machine interfaces. Most early solutions were vision-based. Data captured by vision-based sensors usually raise privacy concerns, so such sensors cannot be used in every application scenario. To address this, a growing number of gesture recognition studies based on non-vision sensors have been proposed.

This work proposes a dynamic hand gesture recognition system for contactless device control, built on a 60 GHz frequency-modulated continuous-wave (FMCW) radar. We resolve the radar signals of hand gestures and transform them into human-interpretable physical quantities such as range, velocity, and angle; from these quantities, the system can be customized to different requirements. We propose an end-to-end trainable deep learning model (NN+LSTM) that extracts features from the transformed radar signals and recognizes gestures from those features. The proposed model can be deployed to deep learning frameworks optimized for embedded platforms, such as TensorFlow Lite.

During training data collection, a camera was used to assist in labeling. To handle gestures of differing durations, this work fixes the frame length of the radar data; experiments verify that the resulting accuracy loss is below 1%. Evaluated on an independent validation set, the proposed model reaches 98% accuracy.


There have been several studies of hand gesture recognition for human-machine interfaces. In early work, most solutions were vision-based. The data captured by vision-based sensors usually raise privacy concerns that make them unusable in some scenarios. To address these privacy issues, more and more studies on non-vision-based hand gesture recognition techniques have been proposed.

This paper proposes a dynamic hand gesture recognition system that can be used for non-contact device control. Our system is based on a 60 GHz FMCW radar. We resolve the radar signals of hand gestures and transform them into a human-understandable domain of range, velocity, and angle. With these signatures, we can customize our system for different scenarios. We propose an end-to-end trainable deep learning model (NN+LSTM) that extracts features from the transformed radar signals and classifies the extracted features into hand gesture labels; the model can be deployed with deep learning frameworks for embedded systems such as TensorFlow Lite.
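The standard way to obtain a range-Doppler map from one FMCW radar frame, as described above, is a 2-D FFT: a range FFT along fast time (samples within a chirp) followed by a Doppler FFT along slow time (across chirps). A minimal NumPy sketch follows; the frame sizes, window choice, and synthetic target are illustrative assumptions, not the thesis's actual radar parameters:

```python
import numpy as np

def range_doppler_map(frame):
    """Compute a range-Doppler magnitude map from one FMCW frame.

    frame: complex array of shape (num_chirps, num_samples),
    i.e. slow time (chirps) x fast time (ADC samples).
    """
    # Range FFT along fast time, with a Hanning window to reduce leakage.
    win = np.hanning(frame.shape[1])
    range_fft = np.fft.fft(frame * win, axis=1)
    # Doppler FFT along slow time; fftshift puts zero velocity in the
    # center row of the map.
    rd = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)
    return np.abs(rd)

# Synthetic single target (hypothetical values): the beat frequency sets
# the range bin, the chirp-to-chirp phase rotation sets the Doppler bin.
num_chirps, num_samples = 32, 64
beat_bin, doppler_bin = 10, 4
n = np.arange(num_samples)
m = np.arange(num_chirps)[:, None]
frame = np.exp(2j * np.pi * (beat_bin * n / num_samples
                             + doppler_bin * m / num_chirps))
rd = range_doppler_map(frame)
peak = np.unravel_index(np.argmax(rd), rd.shape)
```

After the Doppler-axis `fftshift`, zero velocity sits in the center row, so the synthetic target's peak appears at row `num_chirps // 2 + doppler_bin` and column `beat_bin`.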

In our training data collection work, a camera is used to support labeling the hand gesture data. To handle the varying frame lengths of different gestures, we fix the frame length of the radar data; our experiment shows that the resulting drop in accuracy is less than 1%. Using an external testing set, the accuracy of our model reaches 98%.
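One simple way to fix the frame length of variable-length gesture recordings is to zero-pad short sequences and truncate long ones to a common length. The sketch below assumes 2-D per-frame features and a target length of 30 frames; these are hypothetical values, and the thesis's exact frame-fixing scheme is not specified in the abstract:

```python
import numpy as np

def pad_to_fixed_length(frames, target_len):
    """Zero-pad or truncate a gesture sequence to exactly target_len frames.

    frames: array of shape (num_frames, ...), where num_frames varies
    per gesture and the trailing dims hold the per-frame radar feature.
    """
    num_frames = frames.shape[0]
    if num_frames >= target_len:
        return frames[:target_len]  # drop trailing frames of long gestures
    pad = np.zeros((target_len - num_frames,) + frames.shape[1:],
                   dtype=frames.dtype)
    return np.concatenate([frames, pad], axis=0)  # zero-pad short gestures

# A short 12-frame gesture and a long 40-frame gesture, each frame an
# 8x8 feature map (hypothetical sizes), normalized to 30 frames.
short_gesture = np.ones((12, 8, 8), dtype=np.float32)
long_gesture = np.ones((40, 8, 8), dtype=np.float32)
fixed_short = pad_to_fixed_length(short_gesture, 30)
fixed_long = pad_to_fixed_length(long_gesture, 30)
```

Fixing the sequence length this way lets every sample share one input shape, which is convenient for batching and for LSTM deployment on embedded frameworks.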

Contents
Abstract in Chinese
Abstract in English
Acknowledgments
Contents
List of Figures
List of Tables
1 Introduction
2 Background and Related Work
  2.1 Vision-Based Hand Gesture Recognition
  2.2 Radar System
    2.2.1 FMCW Radar
    2.2.2 Hand Gesture Recognition with Radar System
3 Proposed Method
  3.1 System Description
  3.2 Pre-processing Radar Data
    3.2.1 Processing Range-Doppler Images
    3.2.2 The Problems of Using Range-Doppler Map
    3.2.3 Using Range-Angle Map
    3.2.4 Calculate Range-Angle Feature
  3.3 Data Capturing
  3.4 Classification
    3.4.1 Padding Frames
    3.4.2 Gesture Recognition Model
4 Experiments
  4.1 Data Collection
    4.1.1 Experiment Setup
    4.1.2 Collecting and Labeling Data with an Image-Based Algorithm
    4.1.3 Hand Gesture Dataset
  4.2 Evaluation Metrics
    4.2.1 Confusion Matrix
    4.2.2 Accuracy
    4.2.3 Recall and Precision
  4.3 Normal and Bidirectional LSTM Layer
  4.4 Input Padding
  4.5 The Strategy of Data Collecting
5 Conclusions
References


Full text available from 2026/10/13 (campus network)
Full text available from 2026/10/13 (off-campus network)
Full text available from 2026/10/13 (National Central Library: NDLTD Taiwan)