
Graduate Student: Yueh-Se Li
Thesis Title: Real Time Pain Intensity Estimation Based on Facial Landmark Features
Advisor: Shi-Jinn Horng
Committee Members: Shi-Jinn Horng, Chao-Tung Yang, Wei-Hung Lin, Chang-Biau Yang, Yi-Leh Wu
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of Publication: 2022
Graduation Academic Year: 110 (ROC calendar, i.e., 2021-2022)
Language: Chinese
Number of Pages: 55
Keywords: Pain Intensity Estimation, UNBC Shoulder Pain Dataset, PSPI Pain Score, Pearson Correlation Coefficient
Access Count: 168 views, 2 downloads


    Abstract: Knowing whether a patient is currently in pain is crucial in the medical field; however, assessing pain is time-consuming and labor-intensive, and many unconscious patients cannot report their pain status to medical staff on their own, so automatic pain intensity estimation techniques are required to solve this problem. This study proposes a pain intensity estimation framework that uses a deep-learning tabular-data regression model, TabNet, as its backbone, identifies meaningful facial landmark combinations through the Pearson correlation coefficient, and uses the distances between these landmarks as model input. We conduct experiments on the UNBC shoulder pain dataset. The results show that the proposed method achieves a Pearson correlation coefficient of 0.545, a mean absolute error of 1.008, and a mean squared error of 2.176, and runs at more than 30 frames per second using only a CPU for computation.
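
    The thesis itself is in Chinese, so this record does not give implementation details. As a rough illustration of the feature-selection step described above, the following is a minimal sketch of ranking landmark-pair distances by their Pearson correlation with PSPI pain scores; the function names, array shapes, and the `top_k` parameter are assumptions for illustration, not the thesis's actual code.

    ```python
    import numpy as np

    def pairwise_distances(landmarks):
        """Euclidean distance between every pair of facial landmarks, per frame.

        landmarks: array of shape (n_frames, n_landmarks, 2).
        Returns an array of shape (n_frames, n_pairs).
        """
        n_frames, n_points, _ = landmarks.shape
        idx_a, idx_b = np.triu_indices(n_points, k=1)  # all unordered pairs
        diffs = landmarks[:, idx_a, :] - landmarks[:, idx_b, :]
        return np.linalg.norm(diffs, axis=2)

    def select_pairs_by_pearson(distances, pspi_scores, top_k=10):
        """Rank landmark-pair distance features by |Pearson r| against PSPI labels.

        distances: array of shape (n_frames, n_pairs).
        pspi_scores: array of shape (n_frames,).
        Returns (indices of the top_k features, Pearson r for every feature).
        """
        d = distances - distances.mean(axis=0)          # center each feature column
        y = pspi_scores - pspi_scores.mean()            # center the labels
        r = (d * y[:, None]).sum(axis=0) / (
            np.sqrt((d ** 2).sum(axis=0)) * np.sqrt((y ** 2).sum()) + 1e-12
        )
        order = np.argsort(-np.abs(r))                  # strongest correlation first
        return order[:top_k], r
    ```

    The selected distance columns would then form the tabular input to the regression model (TabNet in the thesis).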

    Table of Contents
    Abstract (Chinese)
    Abstract
    Acknowledgements
    Table of Contents
    List of Figures
    List of Tables
    1 Introduction
      1.1 Motivation and Goals
      1.2 Methodology
      1.3 Contributions
      1.4 Organization of This Thesis
    2 Literature Review
      2.1 Pain Assessment
      2.2 Pain Intensity Estimation Using Facial Expressions
      2.3 Related Work on Pain Intensity Estimation from Facial Expressions
      2.4 Pain Intensity Estimation Based on Facial Landmark Features
    3 Methodology
      3.1 UNBC Shoulder Pain Dataset
      3.2 Facial Landmark Detection
      3.3 Landmark Selection and Distance Normalization
      3.4 Feature Selection
        3.4.1 Feature Selection Based on the Facial Action Coding System
        3.4.2 Feature Selection Based on the Pearson Correlation Coefficient
      3.5 Comparison-Based Data Augmentation
      3.6 Regression Model
    4 Experimental Design
      4.1 System Setup
      4.2 Model Performance Evaluation
      4.3 Experimental Parameter Settings
      4.4 Regression Model Training Settings
      4.5 Real-Time Speed Test
      4.6 Real-Time Pain Alert System
    5 Experimental Results and Analysis
      5.1 Feature Selection Based on the Facial Action Coding System
      5.2 Feature Selection Based on the Pearson Correlation Coefficient
      5.3 Comparison with Related Work
      5.4 Real-Time Speed Experiment
      5.5 Real-Time Pain Alert System Demonstration
    6 Conclusion and Future Work
    References

