
Graduate Student: 許文正 (Wen-Cheng Hsu)
Thesis Title: 應用臉孔表情變化偵測預測消費者對有趣度感受程度之可行性研究
A feasibility study for using facial expression detection to forecast the consumers' degree of interestingness
Advisor: 周碩彥 (Shuo-Yan Chou)
Committee Members: 楊文鐸 (Wen-Dwo Yang), 張聖麟 (Sheng-Lin Chang)
Degree: Master
Department: School of Management - Department of Industrial Management
Year of Publication: 2005
Academic Year of Graduation: 93
Language: English
Pages: 65
Chinese Keywords: 臉孔偵測, 笑容強度
Keywords: face detection, smile magnitude
    With the wave of digital audio and video upon us, we hope to develop a tool that can detect a viewer's intensity of feeling while the viewer is watching an entertainment video; from that, we can learn how popular the video is among consumers. Questionnaires, interviews, and similar methods are commonly used to probe a subject's inner world, but they carry biases and limitations, so we hope for a tool that assists the measurement naturally.
    When free of social pressure and emotional suppression, the human face naturally reveals feelings and their intensity through expression. Face detection has grown increasingly mature, and we expect to apply it to learn what people feel inside.
    Because time and resources did not permit an exhaustive study, we begin with smiles and intensity of feeling, using an experiment to learn the correlation and tendency between smile magnitude and inner feeling.


    With the rise of digital video and audio, we wish to develop a tool that can infer a consumer's intensity of feeling while the consumer watches a movie, so that we can gauge how popular the movie is with consumers. Because a consumer's inner state is difficult to observe directly, questionnaires and interviews have traditionally been used to learn what and how strongly consumers feel. These methods, however, can introduce bias: the wording of the questions may influence subjects, and differences among investigators may undermine consistency. We therefore wish to develop a tool that can detect a consumer's inner state naturally.
    Face detection systems have been developed over the years and continue to improve. Since the face naturally expresses feeling and its intensity when free of social contact and conscious restraint, we wish to build a feeling-detection system on top of face detection to help us understand the consumer's intensity of feeling.
    Because time was limited, this study focuses on the relation between smile magnitude and intensity of feeling. We conducted an experiment to examine whether smile magnitude is related to the intensity of feeling and to the price consumers anticipate. We found significant relations between smile magnitude and intensity of feeling; these are presented in Chapter 4.
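The experiment above correlates measured smile magnitude with subjects' self-reported intensity of feeling. A minimal sketch of that analysis, using a hand-rolled Pearson correlation coefficient; the per-subject numbers below are purely illustrative assumptions, not the thesis's actual measurements:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data for eight subjects: normalized smile magnitude (0-1)
# and self-reported intensity of feeling on a 7-point scale.
smile = [0.12, 0.35, 0.50, 0.28, 0.71, 0.64, 0.20, 0.55]
feeling = [2, 4, 5, 3, 7, 6, 2, 5]

print(round(pearson_r(smile, feeling), 3))
```

A value of r near +1 would indicate that stronger smiles accompany stronger reported feelings; the same computation could be repeated against anticipated price as the second variable.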

    Abstract
    Acknowledgements
    Content
    List of Figures
    List of Tables
    Chapter 1 Introduction
    1.1 Motivation and Background
    1.2 Objective
    1.3 Experimental Limitation
    1.4 Organization of Thesis
    Chapter 2 Literature Review
    2.1 Facial Expressions and Emotion
    2.1.1 Facial Measurement
    2.1.1.1 Muscle Tonus Measurement
    2.1.1.2 Measurement of Visible Action
    2.1.2 Facial Action Coding System
    2.2 Consumer Emotion
    2.3 Automated Face Analysis
    2.4 Smile Magnitude
    Chapter 3 Research Method and Process
    3.1 Research Method
    3.1.1 Independent Variable
    3.1.2 Dependent Variable
    3.2 Subject Information
    3.3 Experimental Material and Equipment
    3.4 Experimental Environment
    3.5 Experimental Design and Process
    3.5.1 Experimental Design
    3.5.2 Experimental Process
    Chapter 4 Discussion of Analysis Results
    4.1 Discussion
    4.1.1 Subject's State
    4.1.2 Facial Variation
    4.1.3 Pricing and Subjective Estimate
    4.1.4 Relation Between Facial Variation and Subjective Estimate
    4.1.5 Relation Between Facial Variation and Pricing
    Chapter 5 Conclusion and Future Work
    Reference
    Appendix

