
Author: 鄭雅文 (Ya-Wen Zheng)
Title: Using CNN to Build a Facial Expression Model to Predict Website Satisfaction (利用臉部表情以CNN建置網站滿意度預測模型)
Advisor: Chiuhsiang Joe Lin (林久翔)
Committee Members: 林久翔 (Chiuhsiang Joe Lin), 林承哲, 許聿靈
Degree: Master
Department: College of Management - Department of Industrial Management
Year of Publication: 2022
Graduation Academic Year: 110 (ROC calendar, i.e., AY 2021-2022)
Language: Chinese
Pages: 76
Keywords: CNN, facial expressions, website satisfaction, user experience
Access counts: 137 views; 0 downloads
In an era in which websites and facial recognition are both widely used, predicting a user's satisfaction with a website from facial expressions provides user experience (UX) data: objective physiological measurements can serve as a basis for evaluating website satisfaction, and the time cost of collecting subjective data can be reduced, improving the accuracy of the analysis.
This study built a CNN model to predict participants' facial expressions and analyzed website satisfaction together with the System Usability Scale (SUS) questionnaire. Fifteen participants were recruited. Each participant operated three types of websites, completed the assigned tasks, and then filled out the SUS. Facial expression data collected throughout the experiment were analyzed to examine the relationship between facial expressions and website satisfaction.
The results show a significant positive correlation between facial expressions and website satisfaction. Pearson correlation analysis and one-way ANOVA further indicate that when users showed happy or angry facial expression features while operating a website, their satisfaction with that website was high; in contrast, when users showed sad facial expression features, their satisfaction with the website could not be determined. We hope these results contribute to the prediction of website satisfaction from facial expressions.
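For reference, the SUS questionnaire mentioned above is scored with Brooke's standard rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch with made-up responses (not the study's data):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    1-5 Likert responses, using Brooke's standard scoring rule."""
    assert len(responses) == 10, "SUS has exactly 10 items"
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical participant: agrees with positive items, disagrees with negative ones.
example = [4, 2, 5, 1, 4, 2, 4, 2, 5, 1]
score = sus_score(example)  # → 85.0, well above the ~68-point benchmark
```

Neutral responses (all 3s) yield exactly 50, which makes the formula easy to sanity-check.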


Nowadays, the application of face recognition is very extensive. A user's satisfaction with a website can potentially be predicted from facial expressions as feedback on user experience (UX). Not only can objective physiological data serve as a basis for evaluating website satisfaction, but the time and cost of collecting subjective data can also be reduced. This study built a CNN model to predict participants' facial expressions and conducted website satisfaction analysis with the System Usability Scale (SUS). A total of 15 participants were recruited for the website evaluation task. Participants were required to fill in the SUS after interacting with three types of websites and completing the assigned tasks. The participants' facial expression data were collected and analyzed to explore the relationship between facial expressions and website satisfaction. The results show that some facial expressions are significantly positively correlated with website satisfaction. Pearson correlation analysis and one-way analysis of variance (ANOVA) further show that if the user was happy or angry when interacting with the website, the facial expression features predicted a satisfied experience; if the user was sad, the facial expression features could not determine the user's satisfaction with the website. Further research is needed to develop a better prediction model based on the initial results revealed in the present facial expression prediction study.
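The Pearson correlation used in the analysis above measures the linear association between two variables. A minimal pure-Python sketch on hypothetical data (the variable names and numbers are illustrative, not the study's measurements):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: proportion of "happy" frames vs. SUS score per session.
happy_ratio = [0.10, 0.25, 0.40, 0.55, 0.70]
sus_scores = [45.0, 55.0, 62.5, 75.0, 87.5]
r = pearson_r(happy_ratio, sus_scores)  # close to +1: strong positive correlation
```

In practice one would also report the p-value (e.g., via `scipy.stats.pearsonr`) rather than the coefficient alone.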

Table of Contents:
Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
  1.1 Research Background and Motivation
  1.2 Research Objectives
Chapter 2 Literature Review
  2.1 Current Applications of Facial Recognition
  2.2 Emotion Categories in Facial Expression Recognition
  2.3 Supervised Learning
  2.4 Convolutional Neural Networks (CNN)
  2.5 ResNet50
  2.6 Transfer Learning
  2.7 Satisfaction
Chapter 3 Research Methods
  3.1 Experimental Design
  3.2 Participants
  3.3 Experimental Equipment
    3.3.1 Wireless Physiological Signal Collection System
    3.3.2 Camera
    3.3.3 Test Websites
    3.3.4 Subjective Questionnaire
    3.3.5 Desktop Computer
    3.3.6 Free Video to JPG Converter
  3.4 Experimental Tasks and Procedure
    3.4.1 Tasks
    3.4.2 Procedure
  3.5 Data Processing and Analysis
    3.5.1 Data Preprocessing
    3.5.2 Model Training
    3.5.3 Prediction Results
    3.5.4 Statistical Analysis
Chapter 4 Results
  4.1 Repeated Measures
  4.2 Facial Expression Classification Analysis
    4.2.1 Seven-Class Results
    4.2.2 Five-Class Results
    4.2.3 Three-Class Results
  4.3 Coding Analysis
    4.3.1 Seven-Code Results
    4.3.2 Five-Code Results
    4.3.3 Three-Code Results
  4.4 Follow-up Analysis
  4.5 Facial Expression Classification: Three-Way Split
    4.5.1 Coding Analysis (Three-Way)
    4.5.2 Facial Expressions and SUS Analysis (Three-Way)
    4.5.3 Item-by-Item SUS Analysis
    4.5.4 Positive-Item Results
    4.5.5 Negative-Item Results
  4.6 Facial Expression Classification: Two-Way Split
    4.6.1 Coding Analysis (Two-Way)
    4.6.2 Facial Expressions and SUS Analysis (Two-Way)
    4.6.3 Positive-Item Results
    4.6.4 Negative-Item Results
  4.7 Summary of Results
Chapter 5 Discussion
  5.1 Facial Expressions and SUS Analysis
  5.2 Multiple Regression Analysis
  5.3 Facial Expression Classification from 3D Landmark Coordinates
  5.4 Hypothesis Testing
Chapter 6 Conclusions and Future Work
  6.1 Conclusions
  6.2 Limitations and Future Work
References
Appendix 1 Informed Consent Form
Appendix 2 SUS Questionnaire
Appendix 3 Test Website Homepages
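The one-way ANOVA listed under Section 3.5.4 reduces to comparing between-group variance with within-group variance. A minimal F-statistic sketch on hypothetical per-website SUS scores (illustrative numbers, not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: the ratio of the between-group
    mean square to the within-group mean square."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)    # df between = k - 1
    ms_within = ss_within / (n - k)      # df within  = n - k
    return ms_between / ms_within

# Hypothetical SUS scores for three website types.
site_a = [72.5, 80.0, 77.5]
site_b = [55.0, 60.0, 52.5]
site_c = [65.0, 70.0, 62.5]
f_stat = one_way_anova_f([site_a, site_b, site_c])  # large F suggests group means differ
```

The F statistic would then be compared against the F distribution with (k − 1, n − k) degrees of freedom to obtain a p-value (e.g., via `scipy.stats.f_oneway`).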


Full-text release date: 2025/06/27 (campus network)
Full-text release date: 2120/06/27 (off-campus network)
Full-text release date: 2120/06/27 (National Central Library: Taiwan NDLTD system)