| Graduate Student | 鄭雅文 (Ya-Wen Zheng) |
|---|---|
| Thesis Title | 利用臉部表情以CNN建置網站滿意度預測模型 (Using CNN to Build a Facial Expression Model to Predict Website Satisfaction) |
| Advisor | 林久翔 (Chiuhsiang Joe Lin) |
| Committee Members | 林久翔 (Chiuhsiang Joe Lin), 林承哲, 許聿靈 |
| Degree | Master |
| Department | Department of Industrial Management, College of Management |
| Publication Year | 2022 |
| Academic Year | 110 |
| Language | Chinese |
| Pages | 76 |
| Keywords | CNN, facial expressions, website satisfaction, user experience |
In an era in which websites and facial-recognition applications are widespread, predicting a user's satisfaction with a website from facial expressions provides user-experience (UX) data: objective physiological measurements can serve as a basis for evaluating website satisfaction, while the time cost of collecting subjective data is reduced and the accuracy of the analysis improved.

This study built a CNN model to predict participants' facial expressions and analyzed website satisfaction together with the SUS questionnaire. Fifteen participants were recruited; each operated three types of websites, completed the assigned tasks, and then filled in the SUS. Facial-expression data were collected throughout the experiment and analyzed to examine the relationship between facial expressions and website satisfaction.

The results show that facial expressions are significantly positively correlated with website satisfaction. Pearson correlation analysis and one-way ANOVA further indicate that when users showed happy or angry facial-expression features while operating a website, their satisfaction with that website was good; conversely, when users showed sad facial-expression features, their satisfaction with the website could not be determined. We hope these results contribute to the prediction of website satisfaction from facial expressions.
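The SUS questionnaire used above is scored on a 0–100 scale by the standard rule (odd items score response − 1, even items score 5 − response, sum × 2.5). As a minimal illustration of that rule (not code from this thesis):

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100).

    responses: list of 10 Likert ratings (1-5). Odd-numbered items are
    positively worded (score = r - 1); even-numbered items are
    negatively worded (score = 5 - r). The sum is scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive respondent
print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))  # → 80.0
```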
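The two statistical tests named above (Pearson correlation, one-way ANOVA) can be sketched directly from their textbook definitions. The helpers below are illustrative, not the thesis's actual analysis code; in practice the per-participant inputs would be, e.g., the proportion of frames classified as a given expression and the corresponding SUS scores:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of sample groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (each group mean vs. grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (each value vs. its group mean)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

The F statistic is then compared against the F distribution with (k − 1, n − k) degrees of freedom to obtain a p-value.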
Nowadays, the application of face recognition is very extensive. A user's satisfaction with a website can possibly be predicted from facial expressions as feedback on the user experience (UX). Not only can objective physiological data be used as a basis for evaluating website satisfaction, but the time and cost of collecting respondent data can also be reduced. This study builds a CNN model to predict the participants' facial expressions and conducts website satisfaction analysis with the System Usability Scale. A total of 15 respondents were recruited to participate in the website evaluation task. Respondents were required to fill in the System Usability Scale after interacting with three types of websites and completing the assigned tasks. The facial expression data of the respondents were collected and analyzed to explore the relationship between facial expression and website satisfaction. The results show that some of the facial expressions are significantly positively correlated with website satisfaction. Pearson correlation analysis and one-way analysis of variance further show that if the user was happy or angry when interacting with the website, the facial expression features predicted that the user had a satisfying experience. If the user was sad, the facial expression features could not determine the user's satisfaction with the website. Further research is needed to develop a better prediction model based on the initial results revealed in the present facial expression prediction study.
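The abstract does not reproduce the CNN architecture itself. As a hedged sketch of the building blocks such a model stacks — convolution, ReLU activation, and max-pooling — implemented with NumPy (illustrative only; the thesis's actual network, layer counts, and input sizes are not specified here):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation), the core CNN operation."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation: negative responses are clamped to zero."""
    return np.maximum(x, 0)

def max_pool(x, size=2):
    """Non-overlapping max-pooling, downsampling each spatial dimension."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

# A single feature-extraction stage over a toy grayscale "face" patch
patch = np.arange(16.0).reshape(4, 4)
feature_map = max_pool(relu(conv2d(patch, np.array([[1.0, -1.0], [1.0, -1.0]]))))
```

A real expression classifier repeats such stages with learned kernels, then flattens the final feature maps into a softmax layer over the expression classes (e.g., happy, angry, sad).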