
Graduate Student: 陳美杏 (Mei-Xing Chen)
Thesis Title: 考量注視焦點之文字配樂系統 (A Gaze-Directed Music Composition System for Text Browsing)
Advisor: 楊傳凱 (Chuan-Kai Yang)
Oral Defense Committee: 林伯慎 (Bor-Shen Lin), 鮑興國 (Hsing-Kuo Pao)
Degree: Master
Department: Department of Information Management, School of Management
Year of Publication: 2011
Graduation Academic Year: 99
Language: Chinese
Number of Pages: 61
Chinese Keywords: eye-control mouse, eye tracking, text emotion analysis, music emotion analysis
English Keywords: eye-control-mouse, eye tracking, text emotion, music emotion
Access Count: 474 views, 3 downloads

Music can influence people's emotions, and many media, such as films and advertisements, rely on music to deepen the mood they wish to convey. In this thesis we move beyond the conventional practice of scoring only images and combine several techniques into a novel scoring scheme: while a user is reading, the system plays music that matches the mood of the sentences being read, improving reading quality and offering a different experience.
By taking the reader's gaze into account, the system can be used in a natural reading posture. Using a pre-built emotion word lexicon and an adverb lexicon, it analyzes the emotion of the text the user is looking at in real time and adjusts music attributes to match that emotion; the mapping between music attributes and emotions is compiled from prior literature. Apart from a computer, the whole system requires only an ordinary webcam. Although its accuracy does not match commercial eye trackers, which easily cost hundreds of thousands of NT dollars, our system can be built for a few hundred to a thousand NT dollars.
The three techniques involved, the eye-controlled mouse, text emotion analysis, and music emotion attribute mapping, offer many application possibilities whether used individually or in combination, for example interactive motion-sensing games, audiobooks, and interactive billboards; these are further applications worth continued study.
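The thesis does not reproduce its code here, but the pipeline described above can be illustrated with a minimal sketch: a toy emotion lexicon assigns each word a valence/arousal score, a toy adverb lexicon scales the following word, and the averaged sentence emotion is mapped to coarse music attributes. All lexicon entries, function names, and the tempo/mode mapping below are hypothetical placeholders, not the thesis's actual data or implementation.

```python
# Minimal sketch (assumed, not the thesis's code): lexicon-based emotion scoring
# of the sentence under the reader's gaze, with adverbs acting as intensity
# modifiers, mapped onto coarse music attributes.

# Toy emotion lexicon: word -> (valence, arousal), roughly in [-1, 1]
EMOTION_LEXICON = {
    "happy": (0.8, 0.6),
    "sad": (-0.7, -0.4),
    "angry": (-0.6, 0.8),
    "calm": (0.4, -0.6),
}

# Toy adverb lexicon: adverb -> intensity multiplier for the next word
ADVERB_LEXICON = {
    "very": 1.5,
    "slightly": 0.5,
    "extremely": 2.0,
}


def sentence_emotion(words):
    """Average (valence, arousal) of emotion words, scaled by preceding adverbs."""
    scores = []
    multiplier = 1.0
    for w in words:
        w = w.lower()
        if w in ADVERB_LEXICON:          # remember the modifier for the next word
            multiplier = ADVERB_LEXICON[w]
            continue
        if w in EMOTION_LEXICON:
            v, a = EMOTION_LEXICON[w]
            scores.append((v * multiplier, a * multiplier))
        multiplier = 1.0                 # a modifier applies to one word only
    if not scores:
        return (0.0, 0.0)                # neutral sentence
    n = len(scores)
    return (sum(v for v, _ in scores) / n, sum(a for _, a in scores) / n)


def music_attributes(valence, arousal):
    """Map the 2-D emotion onto simple music attributes (mode and tempo)."""
    mode = "major" if valence >= 0 else "minor"
    tempo = int(90 + 60 * arousal)       # faster tempo for higher arousal
    return {"mode": mode, "tempo_bpm": max(40, min(180, tempo))}


if __name__ == "__main__":
    emotion = sentence_emotion("she felt very happy today".split())
    print(emotion, music_attributes(*emotion))
```

In this sketch the emotion of the gazed sentence is reduced to a valence/arousal pair, which is the kind of two-dimensional representation commonly used when mapping text emotion to music attributes.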


Music can change our lives. Many media, such as movies and advertising, convey their ideas with the help of music. Instead of adopting the traditional scheme of matching images with music, we opt for a new paradigm: composing music that fits the "mood" of the text a user is reading, thus improving reading quality and offering a drastically different experience.

By taking gaze direction into consideration, readers can read text as usual while our system, based on a pre-built emotion and adverb word lexicon, generates music whose attributes align with the emotions present in the text. Compared with the expensive devices used in related research, such as commercial eye trackers costing up to a few hundred thousand dollars, our system relies only on a common webcam, which may cost less than a thousand.
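As a rough illustration of webcam-based gaze estimation (not necessarily the method used in the thesis), the sketch below detects an eye with an OpenCV Haar cascade, takes the darkest point of the eye patch as the pupil, and maps its normalized offset to screen coordinates through a two-point linear calibration. The calibration constants and screen size are placeholder assumptions.

```python
# Minimal sketch (assumed): webcam eye detection plus a linear gaze calibration.
import cv2
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080
# Pupil offsets (relative to the eye box) recorded while the user looked at the
# top-left and bottom-right screen corners during a hypothetical calibration step.
CALIB_TOP_LEFT = np.array([0.35, 0.40])
CALIB_BOTTOM_RIGHT = np.array([0.65, 0.60])

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")


def pupil_offset(gray_eye):
    """Return the pupil position inside the eye patch, normalized to [0, 1]."""
    blurred = cv2.GaussianBlur(gray_eye, (7, 7), 0)
    _, _, min_loc, _ = cv2.minMaxLoc(blurred)     # darkest pixel ~ pupil center
    h, w = gray_eye.shape
    return np.array([min_loc[0] / w, min_loc[1] / h])


def gaze_point(offset):
    """Linearly interpolate the normalized pupil offset to screen coordinates."""
    t = (offset - CALIB_TOP_LEFT) / (CALIB_BOTTOM_RIGHT - CALIB_TOP_LEFT)
    t = np.clip(t, 0.0, 1.0)
    return int(t[0] * SCREEN_W), int(t[1] * SCREEN_H)


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) > 0:
        x, y, w, h = eyes[0]
        print("gaze ~", gaze_point(pupil_offset(gray[y:y + h, x:x + w])))
    if cv2.waitKey(30) & 0xFF == 27:              # press Esc to quit
        break
cap.release()
```

The estimated gaze point would then select which portion of the displayed text is fed to the text emotion analysis stage.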

It is interesting to note that the three core techniques developed in this work, the eye-controlled mouse, text emotion analysis, and music emotion mapping, can be useful on their own, together, or in other combinations. For example, eye-driven mouse control, audiobooks, and interactive billboards are some interesting directions that are worthy of further research.

1. Introduction
1.1 Research Motivation and Purpose
1.2 Thesis Organization
1.3 Contributions
2. Related Work
2.1 Eye-Control Systems
2.2 Emotion Models
2.3 Text Emotion
2.4 Music Emotion
2.5 Combining Text with Music
3. Eye-Control Mouse
3.1 Eye Detection
3.2 Gaze Point Estimation
3.3 Translation and Rotation
4. Text Emotion Analysis
5. Music Emotion Mapping
6. System Implementation
7. Experimental Results
7.1 Gaze Estimation Accuracy
7.2 Gaze Trajectory Comparison
7.3 Text Emotion
7.4 Music Emotion
8. Conclusions
9. References
Appendix 1

[1] 郭中仁, "Gaze Direction Detection Using CCD Images," Master's thesis, Institute of Electrical Engineering, National Tsing Hua University, 1997
[2] 詹永舟, "Pupil Tracking for Eye-Control Systems and the Implementation and Analysis of an Eye-Movement Measurement Instrument," Master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 1999
[3] 莊英杰, "Development of a Pupil-Tracking System for Human-Computer Interface Applications for the Physically Disabled," Master's thesis, Institute of Computer Science and Information Engineering, National Central University, 2004
[4] J. A. Russell, "A circumplex model of affect," Journal of Personality and Social Psychology, 1980
[5] R. Plutchik, "A general psychoevolutionary theory of emotion," Emotion: Theory, Research, and Experience, 1980
[6] P. Ekman, "An argument for basic emotions," Cognition and Emotion, 1992
[7] M. M. Bradley, P. J. Lang, "Affective norms for English words (ANEW): Stimuli, instruction manual and affective ratings," Technical report, The Center for Research in Psychophysiology, University of Florida, 1999
[8] W. Paik, S. Yilmazel, E. Brown, M. Poulin, "Applying natural language processing (NLP) based metadata extraction to automatically acquire user preferences," International Conference on Knowledge Capture, 2001
[9] J. G. Wang, E. Sung, R. Venkateswarlu, "Eye Gaze Estimation from a Single Image of One Eye," International Conference on Computer Vision, 2003
[10] R. Valitutti, "WordNet-Affect: an Affective Extension of WordNet," International Conference on Language Resources and Evaluation, 2004
[11] D. Roth, "Emotions from text: machine learning for text-based emotion prediction," Human Language Technology and Empirical Methods in Natural Language Processing, 2005
[12] Z. Zhu, Q. Ji, "Eye Gaze Tracking Under Natural Head Movements," Computer Vision and Pattern Recognition, 2005
[13] K. Ishizuka, "Generation of Variations on Theme Music Based on Impressions of Story Scenes Considering Human's Feeling of Music and Stories," International Journal of Computer Games Technology, 2006
[14] T. L. Wu, "Automatic emotion classification of musical segments," International Conference on Music Perception and Cognition, 2006
[15] Y. H. Yang, "Music Emotion Classification: A Fuzzy Approach," International Conference on Multimedia, 2006
[16] C. Yang, K. H. Y. Lin, "Building emotion lexicon from weblog corpora," Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics, 2007
[17] A. P. Oliveira, A. Cardoso, "Affective-Driven Music Production: Selection and Transformation of Music," International Conference on Digital Arts, 2008
[18] A. Oliveira, "Affective-Driven Music Production: Selection and Transformation of Music," International Conference on Digital Arts, 2008
[19] B. He, C. Macdonald, J. He, "An effective statistical approach to blog post opinion retrieval," Conference on Information and Knowledge Management, 2008
[20] B. Pang, L. Lee, "Opinion Mining and Sentiment Analysis," Foundations and Trends in Information Retrieval, 2008
[21] T. Danisman, "Feeler: emotion classification of text using vector space model," The Society for the Study of Artificial Intelligence and the Simulation of Behaviour, 2008
[22] Y. H. Yang, Y. C. Lin, Y. F. Su, "A Regression Approach to Music Emotion Recognition," IEEE Transactions on Audio, Speech, and Language Processing, 2008
[23] Y. Hu, X. Chen, "Lyric-based Song Emotion Detection with Affective Lexicon and Fuzzy Clustering Method," International Society for Music Information Retrieval Conference, 2009
[24] C. T. Li, H. C. Lai, C. T. Ho, C. L. Tseng, "Pusic: Musicalize Microblog Messages for Summarization and Exploration," International World Wide Web Conference, 2010
[25] D. W. Hansen, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010
[26] T. Nagamatsu, R. Sugano, Y. Iwamoto, "User-calibration-free Gaze Tracking with Estimation of the Horizontal Angles between the Visual and the Optical Axes of Both Eyes," Eye-Tracking Research & Applications, 2010
[27] S. M. Kim, "Evaluation of unsupervised emotion models to textual affect recognition," Computational Approaches to Analysis and Generation of Emotion in Text, 2010
