
Student: Muhammad Yeza Baihaqi
Thesis Title: Human Behavioral, Subjective, and Physiological Assessments Under an Oral Test by a Humanoid Robot Examiner
Advisor: Sheng-Dong Xu (徐勝均)
Committee Members: Cheng-Hao Ko (柯正浩), Jin-Shyan Lee (李俊賢)
Degree: Master
Department: Graduate Institute of Automation and Control, College of Engineering
Year of Publication: 2023
Graduation Academic Year: 111 (2022–2023)
Language: English
Number of Pages: 87
Keywords: behavioral assessment, humanoid robot examiner, physiological assessment, students’ test anxiety, subjective assessment
Views: 203; Downloads: 0


Table of Contents:
Acknowledgements I
摘要 (Chinese Abstract) II
Abstract III
Table of Contents IV
List of Figures VI
List of Tables VIII
Chapter 1 Introduction 1
1.1 Background and Motivation 1
1.2 Thesis Objectives 3
1.3 Thesis Outline 4
Chapter 2 Theoretical Background 5
2.1 Social Robotics 5
2.1.1 Social Robots Platform 6
2.1.2 Social Robots Experimental Designs 8
2.2 Student Anxiety 11
2.2.1 The Role of Small Talk, Positive Reinforcement, and Positive Affirmation in Reducing Student Anxiety 12
Chapter 3 Methods 14
3.1 Participants 14
3.2 Robot Architecture 15
3.2.1 Asynchronous Active Speech Recognition (AASR) 16
3.2.2 Dialogue Manager and Expression Manager 17
3.2.3 Text-to-Speech 21
3.2.4 Robot Gesture 22
3.2.5 Answer Storage 23
3.3 Oral Test Protocol 23
3.4 Assessments 30
3.4.1 Behavioral Assessment 31
3.4.2 Subjective Assessment 32
3.4.3 Physiological Assessment 33
3.5 Experimental Procedure 35
3.6 Video Acceptance Survey 36
3.7 Data Analysis 37
Chapter 4 Experimental Results and Discussion 39
4.1 BERT Model Testing Results 39
4.2 Behavioral Assessment 40
4.3 Subjective Assessment 43
4.3.1 Robot Evaluation 43
4.3.2 CTAS 46
4.4 Physiological Assessment 47
4.4.1 Heart Rate 47
4.4.2 Oxygen Saturation 50
4.5 Student Score 51
4.6 Video Survey 51
4.7 Discussion 52
Chapter 5 Conclusion and Future Works 58
5.1 Conclusion 58
5.2 Future Works 58
References 60

Full-Text Release Date: 2025/07/31 (campus network)
Full-Text Release Date: 2025/07/31 (off-campus network)
Full-Text Release Date: 2025/07/31 (National Central Library: Taiwan NDLTD system)