
Author: Xiao-Yu Jia (賈曉毓)
Thesis title: The Effects of Social Robot’s Facial Feature and Social Cue Design on the Facial Impression (社交機器人面部特徵及社交線索設計對面部意象之影響)
Advisor: Chien-Hsiung Chen (陳建雄)
Committee members: Tung-Jung Sung (宋同正), Chih-Hsiang Ko (柯志祥), Chih-Fu Wu (吳志富), Yen Hsu (許言)
Degree: Doctoral
Department: Department of Design, College of Design
Year of publication: 2023
Graduation academic year: 111 (ROC calendar)
Language: Chinese
Pages: 140
Keywords (Chinese): 人-機器人互動、社交機器人、社交線索、意象評價、嬰兒圖式效應、使用者體驗
Keywords (English): human-robot interaction, social robots, social cues, image evaluation, baby schema effect, user experience


    Social robots have gradually permeated many aspects of human life, playing diverse roles in assisting human work, improving quality of life, and fostering communication between people. They are now widely used in domains such as the home, healthcare, education, catering, and entertainment. However, human-robot interaction still faces problems that directly affect interaction efficiency and user experience. Previous studies indicate that a robot that exhibits rich social behaviors and tries to build a human-robot relationship is accepted by most people. On one hand, a social robot’s visual appearance, especially its facial features, can systematically influence people’s perceptions and behavior and provide cues about the robot’s abilities and tendencies. On the other hand, the social cues a robot displays can substantially improve the efficiency of human-robot interaction, fostering trust in and friendliness toward the robot.
    The purpose of this study is to use quantitative methods to investigate how the facial features and social cues of social robots affect the robot’s facial impression in human-robot interaction. The study also attempts to extend its findings to existing consumer robots and to explore guidelines for human-robot interaction from the perspective of robot applications.
    Research topic 1 examines the evaluation of robot facial impressions in virtual and real interaction scenarios through two experiments (Experiment 1 and Experiment 2). Experiment 1 explores the image evaluation of rendered robot faces; its variables are robot head shape, facial features, robot camera lens, and participant gender. Experiment 2 further investigates, in a real human-robot interaction setting, how robot head shape and camera lens affect the perceived image of the robot head; its variables are robot head shape and robot camera lens. The combined results of Experiments 1 and 2 indicate that: (1) the shape of the robot’s head significantly affects how the robot is perceived; (2) a clearly visible camera lens affects participants’ image evaluations to some extent; (3) female participants’ ratings were more sensitive than male participants’.
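The factorial design described above (head shape crossed with camera-lens visibility) is typically analyzed with a two-way ANOVA. The following is a minimal sketch of that kind of analysis for a balanced design; the factor levels and all ratings are invented for illustration and are not the thesis data.

```python
# Hedged sketch: balanced two-way ANOVA (head shape x camera lens) of the
# kind used to analyze impression ratings. All ratings are hypothetical.

def cell_mean(cells, a=None, b=None):
    """Mean of all ratings, optionally filtered by factor level."""
    pool = [x for (aa, bb), xs in cells.items()
            if (a is None or aa == a) and (b is None or bb == b)
            for x in xs]
    return sum(pool) / len(pool)

def two_way_anova(cells):
    """cells: {(levelA, levelB): [ratings]} with equal replicates per cell."""
    A = sorted({a for a, _ in cells})
    B = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))          # replicates per cell
    N = n * len(A) * len(B)
    g = cell_mean(cells)                         # grand mean
    ss_a = n * len(B) * sum((cell_mean(cells, a=a) - g) ** 2 for a in A)
    ss_b = n * len(A) * sum((cell_mean(cells, b=b) - g) ** 2 for b in B)
    ss_ab = n * sum((cell_mean(cells, a, b) - cell_mean(cells, a=a)
                     - cell_mean(cells, b=b) + g) ** 2
                    for a in A for b in B)
    ss_err = sum((x - cell_mean(cells, a, b)) ** 2
                 for (a, b), xs in cells.items() for x in xs)
    df_a, df_b = len(A) - 1, len(B) - 1
    df_ab, df_err = df_a * df_b, N - len(A) * len(B)
    mse = ss_err / df_err
    return {"F_head": (ss_a / df_a) / mse,
            "F_lens": (ss_b / df_b) / mse,
            "F_interaction": (ss_ab / df_ab) / mse}

# Hypothetical 7-point impression ratings, four participants per cell.
ratings = {
    ("round", "visible"):  [5, 4, 5, 4],
    ("round", "hidden"):   [6, 6, 5, 6],
    ("square", "visible"): [3, 4, 3, 3],
    ("square", "hidden"):  [4, 5, 4, 4],
}
result = two_way_anova(ratings)
```

With (1, 12) degrees of freedom, an F value above roughly 4.75 is significant at p < .05; in this toy data both main effects would clear that threshold while the interaction would not.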
    Research topic 2 investigates the influence of the baby schema effect and robot type on the cuteness and trustworthiness of robot faces through two experiments (Experiment 3 and Experiment 4). Experiment 3 tests whether statistically significant correlations exist among the degree of baby schema, cuteness, and trustworthiness; Experiment 4 examines the effects of robot type and of high versus low baby schema on the facial impression of social robots. The combined results of Experiments 3 and 4 show that: (1) the degree of baby schema and the perceived emotion of a social robot’s face positively affect trustworthiness: faces with a high baby schema (characterized by features such as large eyes and particular eye and mouth positions) are perceived as cuter and more trustworthy than faces with a low or uncontrolled baby schema; (2) robot type and baby schema interact significantly with respect to cuteness and trustworthiness; (3) for certain robot types, however, the baby schema effect can backfire, with its impact depending on the robot’s specific characteristics and context.
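The correlation test described for Experiment 3 is a standard Pearson analysis. As a minimal sketch, the following computes Pearson’s r between rated baby-schema degree, cuteness, and trustworthiness; the eight stimuli and all ratings below are invented for illustration.

```python
# Hedged sketch: Pearson correlation among baby-schema degree, cuteness,
# and trustworthiness ratings, as in an Experiment 3-style analysis.
# All numbers are hypothetical, not the thesis data.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Mean 7-point ratings for eight robot-face stimuli (illustrative only).
baby_schema = [2.1, 3.4, 4.0, 4.8, 5.2, 5.9, 6.3, 6.8]
cuteness    = [2.5, 3.0, 4.2, 4.5, 5.0, 5.8, 6.1, 6.5]
trust       = [3.0, 3.2, 4.1, 4.4, 4.9, 5.5, 5.7, 6.2]

r_schema_cute  = pearson_r(baby_schema, cuteness)
r_schema_trust = pearson_r(baby_schema, trust)
```

In this toy data both coefficients come out strongly positive, which is the pattern of result the abstract reports (higher baby schema accompanying higher cuteness and trustworthiness); significance would additionally require a t-test on r with n − 2 degrees of freedom.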
    Research topic 3 explores the influence of selected social cues and related factors on participants’ impressions through two experiments (Experiment 5 and Experiment 6). In Experiment 5, the variables were the robot’s eye blinking and head turning, and their influence on participants’ image evaluations was examined in a specific scenario. The results indicate that: (1) social cues displayed by the robot in that scenario can improve participants’ ratings of the robot’s Anthropomorphism, Animacy, Likeability, and Perceived Intelligence; (2) the robot’s social cues attract visitors’ attention to a certain extent and guide them to follow a tour, although this attention-grabbing ability is limited. Experiment 6 explored the effects of specific social cues, head shape, robot body proportion, and participant gender on image evaluation; its variables were robot facial social cues, robot body proportion, robot head shape, and participant gender. The results show that: (1) robots with round heads are perceived as more trustworthy than robots with rectangular heads; (2) robots with a high body proportion are considered more trustworthy than those with a low body proportion; (3) even subtle social cues have a positive effect on the robot’s perceived trustworthiness.
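The four dimensions rated in Experiment 5 are those of the Godspeed questionnaire (Bartneck et al., 2009), in which each dimension is the mean of several 5-point semantic-differential items. The sketch below shows that scoring step; the item anchors follow the published scale (the Animacy list is a subset), and the response values in the test are invented for illustration.

```python
# Hedged sketch: scoring Godspeed-style semantic-differential responses
# into dimension means, as used for the Experiment 5 ratings.
# Item anchors follow Bartneck et al. (2009); Animacy shown abridged.

DIMENSIONS = {
    "Anthropomorphism": ["fake/natural", "machinelike/humanlike",
                         "unconscious/conscious", "artificial/lifelike",
                         "moves rigidly/moves elegantly"],
    "Animacy": ["dead/alive", "stagnant/lively", "mechanical/organic",
                "inert/interactive", "apathetic/responsive"],
    "Likeability": ["dislike/like", "unfriendly/friendly", "unkind/kind",
                    "unpleasant/pleasant", "awful/nice"],
    "Perceived Intelligence": ["incompetent/competent",
                               "ignorant/knowledgeable",
                               "irresponsible/responsible",
                               "unintelligent/intelligent",
                               "foolish/sensible"],
}

def score(responses):
    """responses: {item_label: rating on a 1-5 scale}.
    Returns the mean rating per Godspeed dimension."""
    out = {}
    for dim, items in DIMENSIONS.items():
        vals = [responses[item] for item in items]
        out[dim] = sum(vals) / len(vals)
    return out
```

A per-condition comparison (robot with vs. without blinking and head turning) would then run a significance test on these dimension means across participants.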

    Chinese Abstract i
    Abstract iii
    Acknowledgments vi
    Table of Contents vii
    List of Figures xi
    List of Tables xiii
    Chapter 1 Introduction 1
    1.1 Research Background 1
    1.2 Research Objectives 2
    1.3 Research Scope and Limitations 3
    1.4 Research Process 3
    Chapter 2 Literature Review 6
    2.1 Social Robots and Human-Robot Interaction (HRI) 6
    2.1.1 Definition and Research Status of Social Robots 9
    2.1.2 Scope and Research Status of Human-Robot Interaction 12
    2.1.3 Key Issues in Human-Robot Interaction 15
    2.1.4 Ethical Issues of Social Robots 17
    2.1.5 Summary 18
    2.2 Social Robot Facial Features and Related Factors 19
    2.2.1 Facial Actuation Forms of Robots 19
    2.2.2 Human Perception of Robot Head Proportions and Related Dimensions 21
    2.2.3 Facial Features of Rendered Robots 21
    2.2.4 Summary 23
    2.3 Robot Facial Impression Evaluation and Related Theories 24
    2.3.1 Robot Facial Impression Evaluation 24
    2.3.2 The Uncanny Valley Theory 24
    2.3.3 The Baby Schema Effect (BSE) 26
    2.3.4 The Baby Schema Effect and Robot Trustworthiness 29
    2.3.5 Summary 30
    2.4 Social Cues in Human-Robot Interaction 30
    2.4.1 Effects of Social Cues and Robot Perception on Human-Robot Interaction 30
    2.4.2 Attributing Theory of Mind to Robots 33
    2.4.3 The Computer As Social Actor (CASA) Theory 34
    2.4.4 Effects of Social Cues on Robot Trustworthiness 35
    2.4.5 Summary 36
    2.5 Methods for Evaluating Robot Facial Impressions 36
    2.5.1 Effects of Appearance on Robot Trustworthiness 37
    2.5.2 Effects of Robot Facial Features on Impression Evaluation 38
    2.5.3 Emotional-Cognitive Evaluation and Subjective Rating Scales 39
    2.5.4 Behavioral Tests, Eye Tracking, and EMG/EEG Signal Detection Techniques 40
    2.5.5 Summary 41
    Chapter 3 Research Topic 1: Effects of Robot Facial Features and Related Factors on Facial Impression 43
    3.1 Objectives and Framework of Research Topic 1 43
    3.2 Experiment 1: Objectives and Variables 44
    3.3 Experiment 1: Stimuli 44
    3.4 Experiment 1: Participants and Procedure 47
    3.5 Experiment 1: Results and Analysis 48
    3.6 Discussion of This Stage 52
    3.7 Experiment 2: Objectives and Variables 53
    3.8 Experiment 2: Participants and Procedure 53
    3.9 Experiment 2: Results and Analysis 57
    3.10 Discussion of This Stage 59
    3.10.1 Robot Head Shape 59
    3.10.2 Robot Facial Features 59
    3.10.3 Robot Camera Lens 60
    3.10.4 Interaction Effects of Variables on Robot Perception 60
    3.10.5 Effects of Interaction Context on Participants’ Judgments 61
    Chapter 4 Research Topic 2: Effects of the Baby Schema Effect and Robot Facial Features on Facial Impression 62
    4.1 Objectives and Framework of Research Topic 2 62
    4.2 Experiment 3: Objectives and Variables 63
    4.3 Experiment 3: Stimuli 63
    4.4 Experiment 3: Participants and Procedure 64
    4.5 Experiment 3: Results and Analysis 66
    4.6 Experiment 4: Objectives and Variables 68
    4.7 Experiment 4: Participants and Procedure 72
    4.8 Experiment 4: Results and Analysis 72
    4.9 Discussion of This Stage 78
    4.9.1 Effects of Baby Schema on Cuteness and Trustworthiness 78
    4.9.2 Cuteness and Trustworthiness Across Robot Types 78
    4.9.3 Perceived Emotion and Trustworthiness 79
    Chapter 5 Research Topic 3: Effects of Robot Facial Features and Social Cues on Facial Impression 81
    5.1 Objectives and Framework of Research Topic 3 81
    5.2 Experiment 5: Objectives and Variables 82
    5.3 Experiment 5: Stimuli 82
    5.4 Experiment 5: Participants and Procedure 83
    5.4.1 Participant Information 83
    5.4.2 Procedure 83
    5.5 Experiment 5: Results and Analysis 85
    5.5.1 Perception on the Anthropomorphism Dimension 85
    5.5.2 Perception on the Animacy Dimension 85
    5.5.3 Perception on the Likeability Dimension 86
    5.5.4 Perception on the Perceived Intelligence Dimension 86
    5.5.5 Analysis of Eye-Tracking Metrics 86
    5.6 Discussion of This Stage 92
    5.7 Experiment 6: Objectives and Variables 93
    5.8 Experiment 6: Stimuli and Questionnaire 94
    5.8.1 Stimuli 94
    5.8.2 Questionnaire Design and Procedure 96
    5.9 Experiment 6: Results and Analysis 97
    5.10 Discussion of This Stage 100
    5.10.1 Effects of Social Cues on Robot Trustworthiness 100
    5.10.2 Effects of Robot Head Shape on Robot Trustworthiness 101
    5.10.3 Effects of Body Proportion on Robot Trustworthiness 101
    Chapter 6 Conclusions and Future Research 102
    6.1 Conclusions and Suggestions 102
    6.2 Future Research Plans 105
    English References 106
    Chinese References 128
    Appendix A 130
    Appendix B 132
    Appendix C 135
    Appendix D 136
    Appendix E 137
    Appendix F 138
    Research Achievements 139

    英文參考文獻

    1. Aaltonen, I., Arvola, A., Heikkilä, P., & Lammi, H. (2017, March). Hello Pepper, may I tickle you? Children’s and adults’ responses to an entertainment robot at a shopping mall. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (pp. 53-54).
    2. Admoni, H., & Scassellati, B. (2017). Social eye gaze in human-robot interaction: a review. Journal of Human-Robot Interaction, 6(1), 25-63.
    3. Aggarwal, P., & McGill, A. L. (2007). Is that car smiling at me? Schema congruity as a basis for evaluating anthropomorphized products. Journal of Consumer Research, 34(4), 468-479. https://doi.org/10.1086/518544.
    4. Alarcon, G. M., Gibson, A. M., Jessup, S. A., & Capiola, A. (2021). Exploring the differential effects of trust violations in human-human and human-robot interactions. Applied Ergonomics, 93, 103350. https://doi.org/10.1016/j.apergo.2020.103350.
    5. Aldebaran (2022). NAO6. Retrieved from Aldebaran & United Robotics Group Web site: https://www.aldebaran.com/en/nao
    6. Andriella, A., Torras, C., & Alenyà, G. (2019, October). Learning robot policies using a high-level abstraction persona-behaviour simulator. In 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)(pp. 1-8). IEEE.
    7. Andriella, A., Torras, C., & Alenya, G. (2020). Short-term human–robot interaction adaptability in real-world environments. International Journal of Social Robotics, 12(3), 639-657.
    8. Anonymous (2022). Pepper. Retrieved from United Robotics Group Web site: https://www.softbankrobotics.com/emea/en/pepper
    9. Anonymous. (n.d.). MEET AIDO. Retrieved from InGen Dynamics Inc. Web site: https://aidorobot.com/
    10. Apperly, I. A., & Butterfill, S. A. (2009). Do humans have two systems to track beliefs and belief-like states? Psychological Review, 116(4), 953.
    11. Arnow, B., Kenardy, J., & Agras, W. S. (1995). The Emotional Eating Scale: The development of a measure to assess coping with negative affect by eating. International Journal of Eating Disorders, 18(1), 79-90.
    12. Atkinson, R. K., Mayer, R. E., and Merrill, M. M. (2005). Fostering social agency in multimedia learning: examining the impact of an animated agent’s voice. Contemporary Educational Psychology, 30(1), 117–139. doi:10.1016/j.cedpsych.2004.07.001.
    13. Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics, 1(1), 71-81.
    14. Belkaid, M., Kompatsiari, K., De Tommaso, D., Zablith, I., & Wykowska, A. (2021). Mutual gaze with a robot affects human neural activity and delays decision-making processes. Science Robotics, 6(58), eabc5044.
    15. Benninghoff, B., Kulms, P., Hoffmann, L., & Krämer, N. C. (2013). Theory of Mind in Human-Robot-Communication: Appreciated or not? Kognitive Systeme, 2013(1).
    16. Bernotat, J., & Eyssel, F. (2018, August). Can(‘t) Wait to Have a Robot at Home?-Japanese and German Users’ Attitudes Toward Service Robots in Smart Homes. In 2018 27th IEEE International Symposium on Robot and Human Interactive Communication(RO-MAN)(pp. 15-22). IEEE.
    17. Berry, D. S., & McArthur, L. Z. (1985). Some components and consequences of a babyface. Journal of Personality and Social Psychology, 48(2), 312.
    18. Bianco, F., & Ognibene, D. (2019). Functional advantages of an adaptive theory of mind for robotics: a review of current architectures. 2019 11th Computer Science and Electronic Engineering(CEEC), 139-143.
    19. Björling, E. A., & Rose, E. (2019). Participatory research principles in human-centered design: engaging teens in the co-design of a social robot. Multimodal Technologies and Interaction, 3(1), 8.
    20. Blow, M., Dautenhahn, K., Appleby, A., Nehaniv, C. L., & Lee, D. C. (2006, September). Perception of robot smiles and dimensions for human-robot interaction design. In ROMAN 2006-The 15th IEEE International Symposium on Robot and Human Interactive Communication (pp. 469-474). IEEE.
    21. Borgi, M., Cogliati-Dezza, I., Brelsford, V., Meints, K., & Cirulli, F. (2014). Baby schema in human and animal faces induces cuteness perception and gaze allocation in children. Frontiers in Psychology, 5, 411.
    22. PHOENIX (2023). CARE-O-BOT Fraunhofer. Retrieved from PHOENIX Web site: https://www.phoenixdesign.com/project/care-o-bot
    23. Breazeal, C. (2004). Social interactions in HRI: the robot view. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 34(2), 181-186.
    24. Breazeal, C., & Scassellati, B. (1999). A context-dependent attention system for a social robot. In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (pp. 1146–1153).
    25. Breazeal, C., Kidd, C. D., Thomaz, A. L., Hoffman, G., & Berlin, M. (2005, August). Effects of nonverbal communication on efficiency and robustness in human-robot teamwork. In 2005 IEEE/RSJ international conference on intelligent robots and systems (pp. 708-713). IEEE.
    26. Broadbent, E., Kumar, V., Li, X., Sollers 3rd, J., Stafford, R. Q., MacDonald, B. A., & Wegner, D. M. (2013). Robots with display screens: a robot with a more humanlike face display is perceived to have more mind and a better personality. PlOS ONE, 8(8), e72589.
    27. Bruce, A., Nourbakhsh, I., & Simmons, R. (2002, May). The role of expressiveness and attention in human-robot interaction. In Proceedings 2002 IEEE international conference on robotics and automation (Cat. No. 02CH37292)(Vol. 4, pp. 4138-4142). IEEE.
    28. Caudwell, C., & Lacey, C., (2020). What do home robots want? The ambivalent power of cuteness in robotic relationships. Convergence, 26(4), 956-968.
    29. Chen, C., Garrod, O. G., Zhan, J., Beskow, J., Schyns, P. G., & Jack, R. E. (2018, May). Reverse engineering psychologically valid facial expressions of emotion into social robots. In 2018 13th IEEE international conference on automatic face & gesture recognition (FG 2018) (pp. 448-452). IEEE.
    30. Chou, Y. H., Wang, S. Y. B., & Lin, Y. T. (2019). Long-term care and technological innovation: the application and policy development of care robots in Taiwan. Journal of Asian Public Policy, 12(1), 104-123.
    31. Chu, M. T., Khosla, R., Khaksar, S. M. S., & Nguyen, K. (2017). Service innovation through social robot engagement to improve dementia care quality. Assistive Technology, 29, 8–18.
    32. Colquitt, J. A., Scott, B. A., & LePine, J. A. (2007). Trust, trustworthiness, and trust propensity: a meta-analytic test of their unique relationships with risk taking and job performance. Journal of Applied Psychology, 92(4), 909.
    33. Cowell, A. J., & Stanney, K. M. (2005). Manipulation of non-verbal interaction style and demographic embodiment to increase anthropomorphic computer character credibility. International Journal of Human-Computer Studies, 62(2), 281-306.
    34. Dahlbäck, N., Jönsson, A., & Ahrenberg, L. (1993). Wizard of Oz studies—why and how. Knowledge-based Systems, 6(4), 258-266.
    35. Danev, L., Hamann, M., Fricke, N., Hollarek, T., & Paillacho, D. (2017, October). Development of animated facial expressions to express emotions in a robot: RobotIcon. In 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM) (pp. 1-6). IEEE.
    36. Dautenhahn, K., & Billard, A. (1999). Bringing up robots or—the psychology of socially intelligent robots: From theory to implementation. In Proceedings of the third annual conference on Autonomous Agents (pp. 366-367).
    37. Dereshev, D., & Kirk, D. (2017). Form, function and etiquette–potential users’ perspectives on social domestic robots. Multimodal Technologies and Interaction, 1 (2), 12.
    38. DiSalvo, C. F., Gemperle, F., Forlizzi, J., & Kiesler, S. (2002, June). All robots are not created equal: the design and perception of humanoid robot heads. In Proceedings of the 4th conference on Designing interactive systems: Processes, practices, methods, and techniques (pp. 321-326).
    39. Doherty, M. J. (2008). Theory of mind: how children under-stand others’ thoughts and feelings. psychology press.
    40. Edward T. Hall. (1966). Proxemic Theory. CSISS Classics.
    41. Edwards, C., Edwards, A., Albrehi, F., & Spence, P. (2021). Interpersonal impressions of a social robot versus human in the context of performance evaluations. Communication Education, 70(2), 165-182.
    42. Edwards, C., Edwards, A., Stoll, B., Lin, X., & Massey, N. (2019). Evaluations of an artificial intelligence instructor’s voice: Social Identity Theory in human-robot interactions. Computers in Human Behavior, 90, 357-362.
    43. Embodied, Inc. (2023). Moxie ® Robot. Retrieved from Embodied, Inc. Web site: https://moxierobot.com/products/ai-robot
    44. Emery, N. J. (2000). The eyes have it: the neuroethology, function and evolution of social gaze. Neuroscience & Biobehavioral Reviews, 24(6), 581-604.
    45. Engineered Arts (2019). RoboThespian. Retrieved from Engineered Arts Web site:https://robot-rental.com/robothespian-hire/
    46. Fiore, S. M., Wiltshire, T. J., Lobato, E. J., Jentsch, F. G., Huang, W. H., & Axelrod, B. (2013). Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior. Frontiers in Psychology, 4, 859.
    47. Fitter, N. T., & Kuchenbecker, K. J. (2016, November). Designing and assessing expressive open-source faces for the Baxter robot. In International Conference on Social Robotics (pp. 340-350). Springer, Cham.
    48. Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3-4), 143-166.
    49. Frischen, A., Bayliss, A. P., & Tipper, S. P. (2007). Gaze cueing of attention: visual attention, social cognition, and individual differences. Psychological Bulletin, 133(4), 694.
    50. Gambino, A., Fox, J., & Ratan, R. A. (2020). Building a stronger CASA: Extending the computers are social actors paradigm. Human-Machine Communication, 1, 71-85.
    51. Ghazali, A. S., Ham, J., Barakova, E., & Markopoulos, P. (2019). Assessing the effect of persuasive robots interactive social cues on users’ psychological reactance, liking, trusting beliefs and compliance. Advanced Robotics, 33(7-8), 325-337.
    52. Gilboa-Schechtman, E., & Shachar-Lavie, I. (2013). More than a face: a unified theoretical perspective on nonverbal social cue processing in social anxiety. Frontiers in Human Neuroscience, 7, 904.
    53. Glas, D. F., Satake, S., Ferreri, F., Kanda, T., Ishiguro, H., & Hagita, N. (2013). The Network Robot System: Enabling Social Human-Robot Interaction in Public Spaces. Journal of Human-Robot Interaction, 1, 5–32.
    54. Glocker, M. L., Langleben, D. D., Ruparel, K., Loughead, J. W., Gur, R. C., & Sachser, N. (2009). Baby schema in infant faces induces cuteness perception and motivation for caretaking in adults. Ethology, 115(3), 257-263.
    55. Gockley, R., Forlizzi, J., & Simmons, R. (2006). Interactions with a moody robot. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction, 186-193.
    56. Goetz, J., Kiesler, S., & Powers, A. (2003, November). Matching robot appearance and behavior to tasks to improve human-robot cooperation. In The 12th IEEE International Workshop on Robot and Human Interactive Communication, 2003. Proceedings. ROMAN 2003. (pp. 55-60). Ieee.
    57. Goodrich, M. A., & Schultz, A. C. (2008). Human–robot interaction: a survey. Foundations and Trends® in Human–Computer Interaction, 1(3), 203-275.
    58. Gorn, G. J., Jiang, Y., & Johar, G. V., (2008). Babyfaces, trait inferences, and company evaluations in a public relations crisis. Journal of Consumer Research, 35(1), 36-49.
    59. Green, R. D., MacDorman, K. F., Ho, C. C., & Vasudevan, S. (2008). Sensitivity to the proportions of faces that vary in human likeness. Computers in Human Behavior, 24(5), 2456-2474.
    60. Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J. Y., De Visser, E. J., & Parasuraman, R. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517-527.
    61. Harun, A., Abd Razak, M. R., Abd Rahim, R., & Radzuan, L. E. M., (2016). Anthropomorphic Stimuli in Brand Design: The Effect of Human Face Schema. In Conference Proceeding: 2nd International Conference on Creative Media, Design & Technology(REKA2016).
    62. Hayles, N. K. (2000). How we became posthuman: Virtual bodies in cybernetics, literature, and informatics. Public Understanding of Science, 9,464.
    63. Heerink, M. (2011, March). Exploring the influence of age, gender, education and computer experience on robot acceptance by older adults. In 2011 6th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 147-148). IEEE.
    64. Hegel, F., Eyssel, F., & Wrede, B. (2010, September). The social robot ‘flobi’: Key concepts of industrial design. In 19th International Symposium in Robot and Human Interactive Communication (pp. 107-112). IEEE.
    65. Hegel, F., Lohse, M., & Wrede, B. (2009, September). Effects of visual appearance on the attribution of applications in social robotics. In RO-MAN 2009-The 18th IEEE International symposium on robot and human interactive communication (pp. 64-71). IEEE.
    66. Heuer, T. (2019). Who do you want to talk to? User-centered Design for human-like Robot Faces. In Proceedings of Mensch und Computer 2019(pp. 617-620).
    67. Hinde, R. A., & Barden, L. A. (1985). The evolution of the teddy bear. Animal Behaviour, 33(4), 1371-1373.
    68. Hoff, K. A., & Bashir, M. (2015). Trust in automation integrating empirical evidence on factors that influence trust. Human Factors: The Journal of the Human Factors and Ergonomics Society, 57(3), 407–434.
    69. Hoffman, G., Forlizzi, J., Ayal, S., Steinfeld, A., Antanitis, J., Hochman, G., ... & Finkenaur, J. (2015, March). Robot presence and human honesty: Experimental evidence. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 181-188). IEEE.
    70. Hwang, J., Park, T., & Hwang, W. (2013). The effects of overall robot shape on the emotions invoked in users and the perceived personalities of robot. Applied Ergonomics, 44(3), 459-471.
    71. Hyun, E., Yoon, H., & Son, S. (2010, March). Relationships between user experiences and children’s perceptions of the education robot. In 2010 5th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 199-200). IEEE.
    72. Iwamura, Y., Shiomi, M., Kanda, T., Ishiguro, H., & Hagita, N. (2011, March). Do elderly people prefer a conversational humanoid as a shopping assistant partner in supermarkets? In Proceedings of the 6th international conference on Human-robot interaction (pp. 449-456).
    73. Jian, J. Y., Bisantz, A. M., & Drury, C. G. (2000). Foundations for an empirically determined scale of trust in automated systems. International Journal of Cognitive Ergonomics, 4(1), 53-71.
    74. Kaiser, F. G., Glatte, K., & Lauckner, M. (2019). How to make nonhumanoid mobile robots more likable: Employing kinesic courtesy cues to promote appreciation. Applied Ergonomics, 78, 70-75. https://doi.org/10.1016/j.apergo.2019.02.004.
    75. Kalegina, A., Schroeder, G., Allchin, A., Berlin, K., & Cakmak, M. (2018, February). Characterizing the design space of rendered robot faces. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (pp. 96-104).
    76. Kanda, T., Shiomi, M., Miyashita, Z., Ishiguro, H., & Hagita, N. (2010). A communication robot in a shopping mall. IEEE Transactions on Robotics, 26, 897–913.
    77. Kelley, J. F. (1984). An iterative design methodology for user-friendly natural language office information applications. ACM Transactions on Information Systems, 2(1), 26–41. http://dx.doi.org/10.1145/357417.357420.
    78. Kidd, C. D., & Breazeal, C. (2004, September). Effect of a robot on user perceptions. In 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS)(IEEE Cat. No. 04CH37566)(Vol. 4, pp. 3559-3564). IEEE.
    79. Kim, S. Schmitt, B., & Thalmann, N. (2019). Eliza in the Uncanny Valley: Anthropomorphizing Consumer Robots Increases their Perceived Warmth but Decrease Liking. Marketing Letters, 30(1), pp.1-12.
    80. Kim, W., Kim, N., Lyons, J. B., & Nam, C. S. (2020). Factors affecting trust in high-vulnerability human-robot interaction contexts: A structural equation modelling approach. Applied Ergonomics, 85, 103056.
    81. Kirandziska, V., & Ackovska, N. (2014, July). A concept for building more humanlike social robots and their ethical consequence. In Proceedings of the International Conferences, ICT, Society and Human Beings(MCCSIS).
    82. Kishi, T., Otani, T., Endo, N., Kryczka, P., Hashimoto, K., Nakata, K., & Takanishi, A. (2012, October). Development of expressive robotic head for bipedal humanoid robot. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4584-4589). IEEE.
    83. Kok, B. C., & Soh, H. (2020). Trust in robots: Challenges and opportunities. Current Robotics Reports, 1, 297-309.
    84. Komatsu, T., & Kamide, M. (2017, December). Designing robot faces suited to specific tasks that these robots are good at. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication(RO-MAN)(pp. 1-5). IEEE.
    85. Krämer, N. C., & Winter, S. (2008). Impression management 2.0: The relationship of self-esteem, extraversion, self-efficacy, and self-presentation within social networking sites. Journal of Media Psychology, 20(3), 106-116.
    86. Kuraguchi, K., Taniguchi, K., & Ashida, H. (2015). The impact of baby schema on perceived attractiveness, beauty, and cuteness in female adults. Springerplus, 4(1), 1-8.
    87. Lacey, C., & Caudwell, C. (2019, March). Cuteness as a ‘dark pattern’in home robots. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 374-381). IEEE.
    88. Lange, B. P., & Holtfrerich, S. K. (2017, September). Cuteness rules–Selective attention to baby schema traits in media figures. In 10th Conference of the Media Psychology Division of the German Psychological Society,10.
    89. Lange, B. P., & Schwab, F. (2016). Ein Igel wird erwachsen: Die Evolution von Sonic the Hedgehog. Vortrag auf der, 16.
    90. Laohakangvalvit, T., Iida, I., Charoenpit, S., & Ohkura, M. (2017). A study of Kawaii feeling using eye tracking. International Journal of Affective Engineering, 16(3), 183-189. https://doi.org/10.5057/ijae.IJAE-D-16-00016.
    91. Law, E., Cai, V., Liu, Q. F., Sasy, S., Goh, J., Blidaru, A., & Kulić, D. (2017, August). A wizard-of-oz study of curiosity in human-robot interaction. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication(RO-MAN)(pp. 607-614). IEEE.
    92. Lee, J. J., Breazeal, C., and DeSteno, D. (2017). Role of speaker cues in attention inference. Frontiers in Robotics and AI, 4, 47. doi: 10.3389/frobt.2017.00047.
    93. Lehmann, H., Sureshbabu, A. V., Parmiggiani, A., & Metta, G. (2016, November). Head and face design for a new humanoid service robot. In International Conference on Social Robotics (pp. 382-391). Springer, Cham.
    94. Lehmann, V., Huis, E. M., & Vingerhoets, A. J. (2013). The human and animal baby schema effect: Correlates of individual differences. Behavioural Processes, 94, 99-108.
    95. Li, J. (2015). The benefit of being physically present: A survey of experimental works comparing copresent robots, telepresent robots and virtual agents. International Journal of Human-Computer Studies, 77, 23-37.
    96. Li, J., Kizilcec, R., Bailenson, J., & Ju, W. (2016). Social robots and virtual agents as lecturers for video instruction. Computers in Human Behavior, 55, 1222-1230.
    97. Liu, C., Ishi, C. T., Ishiguro, H., & Hagita, N. (2012, March). Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction. In 2012 7th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 285-292). IEEE.
    98. Liu, Y., Li, F., Tang, L. H., Lan, Z., Cui, J., Sourina, O., & Chen, C. H. (2019, October). Detection of humanoid robot design preferences using EEG and eye tracker. In 2019 International Conference on Cyberworlds(CW)(pp. 219-224). IEEE.
    99. Lombard, M., & Xu, K. (2021). Social responses to media technologies in the 21st century: The media are social actors paradigm. Human-Machine Communication, 2, 29-55.
    100. Lorenz, K. (1943). Die angeborenen formen möglicher erfahrung. Ethology, 5(2), 235-409.
    101. Lorenz, K. (1971). Part and parcel in animal and human societies. Studies in Animal and Human Behavior, 115-195.
    102. Lukin, N. (2020). Do Cute Nursing Robots Get a Free Pass? Exploring How a Robot’s Appearance Influences Human Judgments on Forced Medication Decision (Master’s thesis, University of Helsinki, Helsinki, Finland). Retrived from https://ethesis.helsinki.fi/repository/handle/123456789/29991.
    103. Luo, L., Ma, X., Zheng, X., Zhao, W., Xu, L., Becker, B., & Kendrick, K. M. F. (2015). Neural systems and hormones mediating attraction to infant and child faces. Frontiers in Psychology, 6, 970.
    104. Luria, M., Forlizzi, J., & Hodgins, J. (2018, August). The effects of eye design on the perception of social robots. In 2018 27th IEEE International Symposium on Robot and Human Interactive Communication(RO-MAN)(pp. 1032-1037). IEEE.
    105. MacDorman, K. F. (2005). Androids as an experimental apparatus: Why is there an uncanny valley and can we exploit it. In CogSci-2005 workshop: Toward social mechanisms of android science (Vol. 106118).
    106. Maestripieri, D., & Pelka, S. (2002). Sex differences in interest in infants across the lifespan. Human Nature, 13(3), 327.
    107. Hello Robotics (2023). MAKI edu. Retrieved from Hello Robotics Web site: https://www.hello-robo.com/
    108. Malmir, M., Forster, D., Youngstrom, K., Morrison, L., & Movellan, J. (2013). Home alone: Social robots for digital ethnography of toddler behavior. In Proceedings of the IEEE international conference on computer vision workshops (pp. 762-768).
    109. Mara, M., & Appel, M. (2015). Effects of lateral head tilt on user perceptions of humanoid and android robots. Computers in Human Behavior, 44, 326-334.
    110. Mather, G. (2010). Head-body ratio as a visual cue for stature in people and sculptural art. Perception, 39(10), 1390-1395.
    111. Mathur, M. B., & Reichling, D. B. (2009, March). An uncanny game of trust: social trustworthiness of robots inferred from subtle anthropomorphic facial cues. In 2009 4th ACM/IEEE International Conference on Human-Robot Interaction(HRI)(pp. 313-314). IEEE.
    112. Mathur, M. B., & Reichling, D. B. (2016). Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley. Cognition, 146, 22-32.
    113. McGinn, C. (2019). Why do robots need a head? The role of social interfaces on service robots. International Journal of Social Robotics, 12(1), 281-295.
    114. Mead, R., & Matarić, M. J. (2015, September). Proxemics and performance: Subjective human evaluations of autonomous sociable robot distance and social signal understanding. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 5984-5991). IEEE.
    115. Miesler, L., Leder, H., & Herrmann, A. (2011). Isn’t it cute: An evolutionary perspective of baby-schema effects in visual product designs. International Journal of Design, 5(3).
    116. Mordor Intelligence (2021). Global social robots market - growth, trends, COVID-19 impact, and forecasts (2022-2027). Retrieved from Mordor Intelligence Web site: https://www.mordorintelligence.com/industry-reports/social-robots-market#
    117. Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98-100. https://doi.org/10.1109/MRA.2012.2192811.
    118. Mou, W., Ruocco, M., Zanatto, D., & Cangelosi, A. (2020). When would you trust a robot? A study on trust and theory of mind in human-robot interactions. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (pp. 956-962). IEEE.
    119. Murphy, J., Gretzel, U., & Pesonen, J. (2019). Marketing robot services in hospitality and tourism: the role of anthropomorphism. Journal of Travel & Tourism Marketing, 36(7), 784-795.
    120. Mutlu, B., Roy, N., & Šabanović, S. (2016). Cognitive human–robot interaction. Springer Handbook of Robotics, 1907-1934.
    121. Mutlu, B., Yamaoka, F., Kanda, T., Ishiguro, H., & Hagita, N. (2009, March). Nonverbal leakage in robots: communication of intentions through seemingly unintentional behavior. In Proceedings of the 4th ACM/IEEE international conference on Human robot interaction (pp. 69-76).
    122. Nass, C., Steuer, J., Henriksen, L., & Dryer, D. C. (1994). Machines, social attributions, and ethopoeia: Performance assessments of computers subsequent to “self-” or “other-” evaluations. International Journal of Human-Computer Studies, 40, 543-559.
    123. Natarajan, M., & Gombolay, M. (2020, March). Effects of anthropomorphism and accountability on trust in human robot interaction. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (pp. 33-42).
    124. Nurimbetov, B., Saudabayev, A., Temiraliuly, D., Sakryukin, A., Serekov, A., & Varol, H. A. (2015, December). ChibiFace: A sensor-rich Android tablet-based interface for industrial robotics. In 2015 IEEE/SICE International Symposium on System Integration (SII) (pp. 587-592). IEEE.
    125. Onnasch, L., & Roesler, E. (2021). A taxonomy to structure and analyze human–robot interaction. International Journal of Social Robotics, 13(4), 833-849.
    126. Onuki, T., Ishinoda, T., Kobayashi, Y., & Kuno, Y. (2013, March). Design of robot eyes suitable for gaze communication. In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 203-204). IEEE.
    127. Onyeulo, E. B., & Gandhi, V. (2020). What makes a social robot good at interacting with humans? Information, 11(1), 43.
    128. Phillips, E., Zhao, X., Ullman, D., & Malle, B. F. (2018, March). What is human-like?: Decomposing robots’ human-like appearance using the anthropomorphic robot (ABOT) database. In 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 105-113). IEEE.
    129. Anonymous (2019). Pillo: Your personal home health robot. Retrieved from health house Web site: https://www.health-house.be/en/health-innovations/pillo-your-personal-home-health-robot/
    130. Pittenger, J. B. (1990). Body proportions as information for age and cuteness: Animals in illustrated children’s books. Perception & Psychophysics, 48(2), 124-130.
    131. Powers, A., & Kiesler, S. (2006, March). The advisor robot: tracing people’s mental model from a robot’s physical attributes. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction (pp. 218-225).
    132. Premack, D., & Woodruff, G. (1978). Does the chimpanzee have a theory of mind? Behavioral and Brain Sciences, 1(4), 515-526.
    133. Purucker, C., Sprott, D. E., & Herrmann, A. (2014). Consumer response to car fronts: eliciting biological preparedness with product design. Review of Managerial Science, 8(4), 523-540. https://doi.org/10.1007/s11846-013-0116-2.
    134. Rane, P., Mhatre, V., & Kurup, L. (2014). Study of a home robot: Jibo. International Journal of Engineering Research and Technology, 3(10), 490-493.
    135. Reips, U. D., & Funke, F. (2008). Interval-level measurement with visual analogue scales in Internet-based research: VAS Generator. Behavior Research Methods, 40(3), 699-704. https://doi.org/10.3758/BRM.40.3.699.
    136. Riek, L. D. (2012). Wizard of oz studies in hri: a systematic review and new reporting guidelines. Journal of Human-Robot Interaction, 1(1), 119-136.
    137. Rigdon, M., Ishii, K., Watabe, M., & Kitayama, S. (2009). Minimal social cues in the dictator game. Journal of Economic Psychology, 30(3), 358-367.
    138. Robert, L. (2018). Personality in the human robot interaction literature: A review and brief critique. In Proceedings of the 24th Americas Conference on Information Systems, August 16-18.
    139. Rodrigues, P. B., Singh, R., Oytun, M., Adami, P., Woods, P. J., Becerik-Gerber, B., ... & Lucas, G. M. (2023). A multidimensional taxonomy for human-robot interaction in construction. Automation in Construction, 150, 104845.
    140. Rosenthal-von der Pütten, A. M., Krämer, N. C., Maderwald, S., Brand, M., & Grabenhorst, F. (2019). Neural mechanisms for accepting and rejecting artificial social partners in the uncanny valley. Journal of Neuroscience, 39(33), 6555-6570.
    141. Rossi, S., Staffa, M., Bove, L., Capasso, R., & Ercolano, G. (2017). User’s Personality and Activity Influence on HRI Comfortable Distances. In Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings 9 (pp. 167-177). Springer International Publishing.
    142. Ruhland, K., Peters, C. E., Andrist, S., Badler, J. B., Badler, N. I., Gleicher, M., ... & McDonnell, R. (2015). A review of eye gaze in virtual agents, social robotics and hci: Behaviour generation, user interaction and perception. In Computer Graphics Forum (Vol. 34, No. 6, pp. 299-326).
    143. Šabanović, S., & Chang, W. L. (2016). Socializing robots: constructing robotic sociality in the design and use of the assistive robot PARO. AI & Society, 31(4), 537-551.
    144. Šabanović, S., Michalowski, M. P., & Simmons, R. (2006). Robots in the wild: Observing human-robot social interaction outside the lab. In 9th IEEE International Workshop on Advanced Motion Control, 2006. (pp. 596-601). IEEE.
    145. Salem, M., Lakatos, G., Amirabdollahian, F., & Dautenhahn, K. (2015, March). Would you trust a (faulty) robot? Effects of error, task type and personality on human-robot cooperation and trust. In 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 1-8). IEEE.
    146. Saunderson, S., & Nejat, G. (2019). How robots influence humans: A survey of nonverbal communication in social human-robot interaction. International Journal of Social Robotics, 11(4), 575-608.
    147. Saygin, A. P., Chaminade, T., Ishiguro, H., Driver, J., & Frith, C. (2012). The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Social Cognitive and Affective Neuroscience, 7(4), 413-422.
    148. Schaefer, K. (2013). The perception and measurement of human-robot trust. Electronic Theses and Dissertations, 2004-2019. 2688.
    149. Schaefer, K. E. (2016). Measuring trust in human robot interactions: Development of the “trust perception scale-HRI”. In Robust Intelligence and Trust in Autonomous Systems (pp. 191-218). Springer, Boston, MA.
    150. Schermerhorn, P., Scheutz, M., & Crowell, C. R. (2008, March). Robot social presence and gender: Do females view robots differently than males? In Proceedings of the 3rd ACM/IEEE international conference on Human robot interaction (pp. 263-270).
    151. Seyama, J. I., & Nagayama, R. S. (2007). The uncanny valley: Effect of realism on the impression of artificial human faces. Presence, 16(4), 337-351. https://doi.org/10.1162/pres.16.4.337.
    152. Shayganfar, M., Rich, C., & Sidner, C. L. (2012, October). A design methodology for expressing emotion on robot faces. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4577-4583). IEEE.
    153. Sheridan, T. B. (2016). Human–robot interaction: status and challenges. Human Factors, 58(4), 525-532.
    154. Sipitakiat, A., & Blikstein, P. (2013, June). Interaction design and physical computing in the era of miniature embedded computers. In Proceedings of the 12th International Conference on Interaction Design and Children (pp. 515-518).
    155. Smith, T. V. (1930). Book Review: Robots or Men? H. Dubreuil. International Journal of Ethics, 41(1).
    156. Song, Y., & Luximon, Y. (2020). Trust in AI agent: A systematic review of facial anthropomorphic trustworthiness for social robot design. Sensors, 20(18), 5087.
    157. Song, Y., & Luximon, Y. (2021). The face of trust: The effect of robot face ratio on consumer preference. Computers in Human Behavior, 116, 106620.
    158. Song, Y., Luximon, A., & Luximon, Y. (2021). The effect of facial features on facial anthropomorphic trustworthiness in social robots. Applied Ergonomics, 94, 103420.
    159. Steinfeld, A., Fong, T., Kaber, D., Lewis, M., Scholtz, J., Schultz, A., & Goodrich, M. (2006, March). Common metrics for human-robot interaction. In Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction (pp. 33-40).
    160. Steinfeld, A., Jenkins, O. C., & Scassellati, B. (2009, March). The oz of wizard: simulating the human for interaction research. In Proceedings of the 4th ACM/IEEE international conference on Human robot interaction (pp. 101-108).
    161. Strait, M., Briggs, P., & Scheutz, M. (2015). Gender, more so than age, modulates positive perceptions of language-based human-robot interactions. In 4th international symposium on new frontiers in human robot interaction, 21-22.
    162. Strait, M., Urry, H. L., & Muentener, P. (2019, March). Children’s responding to humanlike agents reflects an uncanny valley. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 506-515). IEEE.
    163. Tao, V., Moy, K., & Amirfar, V. A. (2016). A little robot with big promise may be future of personalized health care. Pharmacy Today, 22(9), 38.
    164. Tasaki, R., Kitazaki, M., Miura, J., & Terashima, K. (2015, May). Prototype design of medical round supporting robot “Terapio”. In 2015 IEEE International Conference on Robotics and Automation (ICRA) (pp. 829-834). IEEE.
    165. Todorov, A., Dotsch, R., Porter, J. M., Oosterhof, N. N., & Falvello, V. B. (2013). Validation of data-driven computational models of social perception of faces. Emotion, 13(4), 724.
    166. Torre, I., & White, L. (2021). Trust in Vocal Human-Robot Interaction: Implications for Robot Voice Design. In Voice Attractiveness (pp. 299-316).
    167. Trovato, G., Lucho, C., & Paredes, R. (2018). She’s electric-the influence of body proportions on perceived gender of robots across cultures. Robotics, 7(3), 50.
    168. van Breemen, A., Yan, X., & Meerbeek, B. (2005, July). iCat: an animated user-interface robot with personality. In Proceedings of the fourth international joint conference on Autonomous agents and multiagent systems (pp. 143-144).
    169. Van Pinxteren, M. M., Wetzels, R. W., Rüger, J., Pluymaekers, M., & Wetzels, M. (2019). Trust in humanoid robots: implications for services marketing. Journal of Services Marketing, 33(4), 507-518.
    170. Venturoso, L., Gabrieli, G., Truzzi, A., Azhari, A., Setoh, P., Bornstein, M. H., & Esposito, G. (2019). Effects of baby schema and mere exposure on explicit and implicit face processing. Frontiers in Psychology, 10, 2649.
    171. Verner, I. M., Polishuk, A., & Krayner, N. (2016). Science class with RoboThespian: using a robot teacher to make science fun and engage students. IEEE Robotics & Automation Magazine, 23(2), 74-80.
    172. Vollmer, A. L., Read, R., Trippas, D., & Belpaeme, T. (2018). Children conform, adults resist: A robot group induced peer pressure on normative social conformity. Science Robotics, 3(21), eaat7111.
    173. Walters, M. L., Dautenhahn, K., Te Boekhorst, R., Koay, K. L., Kaouri, C., Woods, S., ... & Werry, I. (2005, August). The influence of subjects’ personality traits on personal spatial zones in a human-robot interaction experiment. In ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, 2005. (pp. 347-352). IEEE.
    174. Walters, M. L., Dautenhahn, K., Woods, S. N., Koay, K. L., Te Boekhorst, R., & Lee, D. (2006). Exploratory studies on social spaces between humans and a mechanical-looking robot. Connection Science, 18(4), 429-439.
    175. Walters, M. L., Koay, K. L., Syrdal, D. S., Dautenhahn, K., & Te Boekhorst, R. (2009). Preferences and perceptions of robot appearance and embodiment in human-robot interaction trials. Procs of New Frontiers in Human-Robot Interaction, 136-143.
    176. Walters, M. L., Syrdal, D. S., Dautenhahn, K., Te Boekhorst, R., & Koay, K. L. (2008). Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Autonomous Robots, 24(2), 159-178.
    177. Winfield, A. F., & Jirotka, M. (2018). Ethical governance is essential to building trust in robotics and artificial intelligence systems. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376(2133), 20180085.
    178. Wittig, S., Rätsch, M., & Kloos, U. (2015). Parameterized facial animation for socially interactive robots. In Mensch und Computer 2015 Tagungsband (pp. 355-358).
    179. Xu, K., Chen, X., & Huang, L. (2022). Deep mind in social responses to technologies: A new approach to explaining the Computers are Social Actors phenomena. Computers in Human Behavior, 134, 107321.
    180. Yanco, H. A., & Drury, J. L. (2002, November). A taxonomy for human-robot interaction. In Proceedings of the AAAI fall symposium on human-robot interaction (pp. 111-119).
    181. Yim, J. D., & Shaw, C. D. (2011). Design considerations of expressive bidirectional telepresence robots. In CHI’11 Extended Abstracts on Human Factors in Computing Systems (pp. 781-790).
    182. Yoshikawa, Y., Shinozawa, K., Ishiguro, H., Hagita, N., & Miyamoto, T. (2006, October). The effects of responsive eye movement and blinking behavior in a communication robot. In 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4564-4569). IEEE.
    183. Yott, J., & Poulin-Dubois, D. (2016). Are infants’ theory-of-mind abilities well integrated? Implicit understanding of intentions, desires, and beliefs. Journal of Cognition and Development, 17(5). https://doi.org/10.1080/15248372.2015.1086771.
    184. Zaga, C., De Vries, R. A., Li, J., Truong, K. P., & Evers, V. (2017, May). A simple nod of the head: The effect of minimal robot movements on children’s perception of a low-anthropomorphic robot. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 336-341).
    185. Zebrowitz, L. A., Voinescu, L., & Collins, M. A. (1996). “Wide-eyed” and “crooked-faced”: Determinants of perceived and real honesty across the life span. Personality and Social Psychology Bulletin, 22(12), 1258-1269.
    186. Zhang, T., Kaber, D. B., Zhu, B., Swangnetr, M., Mosaly, P., & Hodge, L. (2010). Service robot feature design effects on user perceptions and emotional responses. Intelligent Service Robotics, 3(2), 73-88.
    187. Zheng, W., Luo, T., Hu, C. P., & Peng, K. (2018). Glued to which face? Attentional priority effect of female babyface and male mature face. Frontiers in Psychology, 9, 286.
    188. Złotowski, J., Sumioka, H., Nishio, S., Glas, D. F., Bartneck, C., & Ishiguro, H. (2016). Appearance of a robot affects the impact of its behaviour on perceived trustworthiness and empathy. Paladyn, Journal of Behavioral Robotics, 7(1), 55-66.

    中文參考文獻（Chinese References）

    1. 毛誌賢、朱曉龍、韋建軍、王春寶、劉銓權、段麗紅、王同、羅承開、張廣帥、王玉龍、龍建軍、林焯華(2021)。家庭服務機器人現狀與展望 [Current status and prospects of home service robots]。機電工程技術,50(02):8-14。
    2. 王亮(2020)。社交機器人“單向度情感”倫理風險問題芻議 [A preliminary discussion of the ethical risks of "one-dimensional emotion" in social robots]。自然辯證法研究,36(01):56-61。
    3. 王亮(2021)。基於情境體驗的社交機器人倫理:從“欺騙”到“向善” [Situated-experience-based ethics of social robots: From "deception" to "doing good"]。自然辯證法研究,37(10):55-60。
    4. 王婷、王丹、張積家(2019)。雙語與雙言影響心理理論內隱系統與外顯系統的分離 [Bilingualism and bidialectalism affect the dissociation between the implicit and explicit systems of theory of mind]。心理與行為研究,17(04):442-451。
    5. 何燦群、肖維禎(2020)。服務機器人中的擬人化設計研究 [Research on anthropomorphic design in service robots]。裝飾,(04):27-31。
    6. 周天策(2017)。共處的“中道”——兒童陪護機器人的倫理風險分析 [The "middle way" of coexistence: An analysis of the ethical risks of child companion robots]。自然辯證法研究,33(04):57-62。
    7. 林愛珺、劉運紅(2021)。“算計情感”:社交機器人的倫理風險審視 ["Calculated affection": Examining the ethical risks of social robots]。新媒體與社會,(01):47-58。
    8. 胡明豔、譚潤民(2022)。陪護機器人倫理問題及對策建議 [Ethical issues of companion care robots and policy recommendations]。科技智囊,316(09):70-76。
    9. 張晨光(2017)。內隱心理理論研究進展 [Research progress on implicit theory of mind]。湖北函授大學學報,30(06):93-94。
    10. 喻豐、許麗穎(2020)。人工智慧之擬人化 [The anthropomorphization of artificial intelligence]。西北師大學報(社會科學版),57(05):52-60。
    11. 程村(2020)。微表情識別綜述 [A survey of micro-expression recognition]。計算機時代,(09):17-19+23。
    12. 賈曉毓、陳建雄(2020)。公共場所中人與社交機器人之互動行為觀察 [Observing human-social robot interaction behaviors in public places]。銘傳大學2020國際學術研討會-「設計 X Reset」設計論文集,銘傳大學,2020年5月22日,1-8。
    13. 錢渺、傅根躍(2014)。心理理論的自發反應範式:方法、結果與解釋 [Spontaneous-response paradigms in theory of mind: Methods, results, and interpretations]。心理科學進展,22(01):27-37。
    14. 龐亮、易茜(2022)。人機傳播中的“背叛”:社交機器人的倫理困境 ["Betrayal" in human-machine communication: The ethical dilemma of social robots]。中國新聞傳播研究,(04):16-28。
    15. 竇笑(2022)。社交機器人倫理問題與政策建議研究 [Research on the ethical issues of social robots and policy recommendations]。智庫理論與實踐,7(05):103-110。

    Full text release date: 2025/07/22 (campus network)
    Full text release date: 2025/07/22 (off-campus network)
    Full text release date: 2025/07/22 (National Central Library: Taiwan NDLTD system)