
Graduate Student: 林子虹 (Zih-Hong Lin)
Thesis Title: 使用者經驗於行動介面音效設計之研究
User Experience Study on the Sound Design of Mobile User Interface
Advisor: 陳建雄 (Chien-Hsiung Chen)
Committee Members: 鄭金典 (Jin-Dean Cheng), 曹永慶 (Yung-Chin Tsao)
Degree: Master
Department: Department of Design, College of Design
Year of Publication: 2017
Graduation Academic Year: 105
Language: Chinese
Number of Pages: 124
Keywords: Sound Design, Interaction Design, User Experience, Usability Evaluation

    In an era in which anyone can become a mobile application developer, delivering a pleasant user experience has become an essential element of digital product design. Compared with the extensive exploration of user interfaces and information architecture at the visual level, sound design has remained a rarely explored area of interaction design. This study explores users' subjective experience of interactive sound effects applied in apps, analyzing the construction and design methods of sound effects that affect the user experience, and further evaluates whether sound effects influence the usability and overall operating experience of the app itself. The aim is to offer product designers and users a new way of thinking and interacting, and thereby to create a fresh user experience.

    This study comprises two experiments: (1) Pilot experiment: Based on the theoretical foundations established in the literature review, a preliminary investigation and test of sound effects were conducted to clarify their functional types and classification methods, and the pilot samples were used to probe whether adding sound effects changes the usability of an app. (2) Verification experiment: Building on the pilot results, the constituent elements of the sound-effect samples were decomposed in order to examine in depth whether subtle differences within the sounds affect users' usability ratings and subjective feelings toward the app.

    The verification experiment employed a 2 (with vs. without tone changes) x 3 (timbre types) two-factor mixed design. The tone-change factor has two levels: (1) With tone changes: the pitch rises or falls in steps as operation events occur; (2) Without tone changes: no stepwise pitch changes accompany operation events. The timbre factor has three levels: (1) Auditory icons: real-world sound samples that correspond to actual objects; (2) Compound earcons: digitally synthesized sounds that correspond to actual objects; (3) Earcons: digitally synthesized sounds with no correspondence to actual objects.
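The six cells of this two-factor mixed design (two tone-change levels crossed with three timbre types) can be enumerated mechanically; the following minimal sketch uses illustrative label names that are not from the thesis itself:

```python
from itertools import product

# Illustrative labels for the two factors of the verification experiment:
# 2 tone-change levels x 3 timbre types = 6 experimental conditions.
TONE_CHANGES = ["with_tone_change", "without_tone_change"]
TIMBRE_TYPES = ["auditory_icon", "compound_earcon", "earcon"]

# Cross the factors to obtain every cell of the design.
conditions = [
    {"tone_change": tone, "timbre": timbre}
    for tone, timbre in product(TONE_CHANGES, TIMBRE_TYPES)
]

for c in conditions:
    print(c["tone_change"], "/", c["timbre"])
```

Enumerating the cells this way makes it easy to check that every participant group is assigned a complete, balanced set of conditions.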

    The results show that: (1) There is no significant difference in the recognition of, or feelings toward, the interactive sound effects between users with and without experience learning a musical instrument; both groups rated the samples with sound effects as more usable than those without; (2) Compound earcons and earcons received better evaluations for basic functionality and feedback cues; (3) Auditory icons and compound earcons performed better in usability and learnability; (4) Adding tone changes to auditory icons lowered their usability ratings, whereas adding tone changes to compound earcons improved them; (5) Adding sound effects to specific operation tasks brings users multi-layered enjoyment and a sense of accomplishment, which makes them feel pleased and willing to keep using the product.
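According to the table of contents, the usability ratings above were collected in part with the System Usability Scale (SUS). As a hedged sketch of the standard SUS scoring formula (not the author's actual analysis code), each odd-numbered item is positively worded and each even-numbered item negatively worded, and the ten 1-5 responses map to a 0-100 score:

```python
def sus_score(responses):
    """Standard SUS scoring: odd items contribute (response - 1),
    even items contribute (5 - response); the sum of the ten
    contributions is scaled by 2.5 to a 0-100 range."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = sum(
        (r - 1) if i % 2 == 1 else (5 - r)
        for i, r in enumerate(responses, start=1)
    )
    return total * 2.5

# All-neutral responses (all 3s) land exactly at the midpoint.
print(sus_score([3] * 10))  # 50.0
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative one reaches the maximum score of 100.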

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1: Introduction
        1.1 Research Background and Motivation
        1.2 Research Objectives
        1.3 Research Framework and Process
        1.4 Research Scope and Limitations
    Chapter 2: Literature Review
        2.1 Sound Design
            2.1.1 Functional Classification of Sound Effects
            2.1.2 Sound Types of Sound Effects
            2.1.3 Classification of Non-speech Sounds
            2.1.4 Meaning Mapping of Sound Effects
            2.1.5 Importance of Sound Effects in Mobile Interfaces
            2.1.6 Key Points of Sound-Effect Design
            2.1.7 Advantages of Sound Effects in Mobile Interfaces
            2.1.8 Summary
        2.2 Human Perception and Cognition
            2.2.1 Sensory Systems
            2.2.2 Visual Perception
            2.2.3 Auditory Perception
            2.2.4 Comparison of Vision and Hearing
            2.2.5 Cognitive Processes
            2.2.6 Summary
        2.3 User-Centered Design
            2.3.1 Principles of User-Centered Design
            2.3.2 User Experience
            2.3.3 Summary
        2.4 Usability Engineering
            2.4.1 Principles of Usability Engineering
            2.4.2 Usability Evaluation
    Chapter 3: Research Methods and Experimental Procedures
        3.1 Research Procedure
        3.2 Experiment Flow and Construction
        3.3 Experimental Methods
    Chapter 4: Pilot Experiment
        4.1 Pilot Experiment Framework
        4.2 Pilot Experiment Samples
            4.2.1 Analysis of Existing Sound-Effect Samples
            4.2.2 Sound-Effect Usability Experiment Design
            4.2.3 Pilot Questionnaire Design
            4.2.4 Procedure and Equipment
        4.3 Pilot Participants
        4.4 Pilot Results
        4.5 Interviews on Participants' Subjective Feelings
        4.6 Overall Discussion and Analysis
            4.6.1 Pilot Experiment Results
            4.6.2 Follow-up Suggestions and Planning
    Chapter 5: Verification Experiment
        5.1 Verification Experiment Methods
            5.1.1 Research Variables
            5.1.2 App Prototype Construction
            5.1.3 Experiment Design
            5.1.4 Participants
        5.2 Verification Experiment Results
            5.2.1 Analysis of the System Usability Scale (SUS)
            5.2.2 Analysis of the Perceived Usefulness and Ease of Use (PUEU) Scale
                5.2.2.1 "Accomplish tasks more quickly"
                5.2.2.2 "Improve operating performance"
                5.2.2.3 "Increase work efficiency"
                5.2.2.4 "Accomplish tasks more easily"
                5.2.2.5 "Ease of learning and understanding"
                5.2.2.6 "Ease of perceiving operation status"
                5.2.2.7 "Comprehension of the interaction method"
                5.2.2.8 "Flexibility of interaction"
                5.2.2.9 "Familiarity and memorability"
    Chapter 6: Conclusions and Suggestions
        6.1 Research Results
            6.1.1 Summary of Key Findings on Tone Changes and Timbre Types
        6.2 Conclusions and Suggestions
        6.3 Suggestions for Future Research
    References
        English References
        Chinese References
        Web References
    Appendix 1: Pilot Experiment Questionnaire
    Appendix 2: Verification Experiment Questionnaire

