
Graduate Student: Hui-Ru Chou (周慧如)
Thesis Title: Applying Machine Learning to Construct a Prediction Model for Emotion Recognition Based on Physiological Data (應用機器學習建構生理數據之情緒辨識預測模型)
Advisor: Chiuh-Siang Lin (林久翔)
Committee Members: Yu-Chung Tsao (曹譽鐘), Kung-Jeng Wang (王孔政)
Degree: Master
Department: College of Management - Department of Industrial Management (管理學院 - 工業管理系)
Publication Year: 2023
Graduation Academic Year: 111 (ROC calendar)
Language: Chinese
Pages: 114
Chinese Keywords: 情緒辨識 (emotion recognition), 機器學習 (machine learning), 生理訊號 (physiological signals), 統計分析 (statistical analysis)
English Keywords: Emotion recognition, Machine learning, Physiological signals, Statistical analysis
Views: 374; Downloads: 0

In recent years, emotion-induction experiments have become prevalent. The main approaches to emotion recognition are behavioral and physiological: behavioral methods include facial recognition and speech recognition, while physiological methods include the electroencephalogram (EEG), electrocardiogram (ECG), galvanic skin response (GSR), heart rate (HR), and heart rate variability (HRV). In this study, wearable devices collected participants' EEG, GSR, HR, and HRV data during emotion-induction experiments. Six emotion-inducing videos, targeting fear, sadness, disgust, anger, joy, and surprise, together with 54 images from the International Affective Picture System (IAPS), were used to elicit emotional responses in 50 participants. Subjective Self-Assessment Manikin (SAM) questionnaire data and objective physiological data were collected and analyzed with one-way repeated-measures ANOVA, pairwise comparisons, and independent component analysis (ICA) for dimensionality reduction, to examine how different combinations of the subjective SAM scale and objective physiological signals, and different algorithms, affect the emotion recognition model. The goal was to identify the best model for classifying, from physiological signals, the emotions participants experienced while watching the videos and images. For videos, the eXtreme Gradient Boosting (XGBoost) algorithm achieved 62% accuracy on six emotion classes; on two classes (positive vs. negative), both XGBoost and a Voting (Random Forest + XGBoost) ensemble reached 97%. For images, Random Forest and XGBoost reached 67% accuracy on three classes (positive, negative, neutral); on two classes (positive vs. negative), Random Forest (RF), XGBoost, and the Voting (Random Forest + XGBoost) ensemble all reached a top accuracy of 81%. As discussed in Chapter 5, after comparing overall model performance, XGBoost proved the best algorithm. The statistical analyses and prediction models of physiological data in this study can serve as a reference for future researchers.


In recent years, emotion induction experiments have become prevalent, and the main methods for emotion recognition are behavioral and physiological. Behavioral methods include facial recognition and speech recognition, while physiological methods include the electroencephalogram (EEG), electrocardiogram (ECG), galvanic skin response (GSR), heart rate (HR), and heart rate variability (HRV). In this study, physiological data, including EEG, GSR, HR, and HRV, were collected from participants using wearable devices during emotion induction experiments. Six emotion induction videos were used, targeting fear, sadness, disgust, anger, joy, and surprise, together with 54 images from the International Affective Picture System (IAPS). Fifty participants were recruited, and subjective Self-Assessment Manikin (SAM) questionnaires and objective physiological data were collected to analyze the effects of different physiological data combinations and algorithms on emotion recognition models, using one-way repeated-measures ANOVA, pairwise comparisons, and Independent Component Analysis (ICA) for dimensionality reduction.

The aim was to find the optimal emotion recognition model for classifying, from physiological signals, the emotions participants experienced while watching the emotion-inducing videos and IAPS images. For videos with six emotion categories, the eXtreme Gradient Boosting (XGBoost) algorithm achieved an accuracy of 62%; for distinguishing positive from negative emotional videos, both XGBoost and a Voting (Random Forest + XGBoost) ensemble reached 97%. For the three categories of IAPS images (positive, negative, and neutral), Random Forest and XGBoost achieved 67% accuracy, and for positive versus negative IAPS images, Random Forest, XGBoost, and the Voting (Random Forest + XGBoost) ensemble all achieved a top accuracy of 81%. As discussed in Chapter 5, after comparing overall model performance, XGBoost proved the best algorithm. The statistical analyses and predictive models of physiological data presented here can serve as a reference for future researchers.
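The classification pipeline the abstract describes (ICA dimensionality reduction on physiological features, then Random Forest, gradient boosting, and a soft-voting ensemble for binary positive/negative classification) can be sketched roughly as follows. This is an illustrative reconstruction, not the thesis code: the synthetic data, feature dimensions, and hyperparameters are assumptions, and scikit-learn's GradientBoostingClassifier stands in for XGBoost.

```python
# Illustrative sketch: ICA reduces physiological feature vectors, then
# Random Forest, gradient boosting (stand-in for XGBoost), and a
# soft-voting ensemble classify positive vs. negative emotion.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for EEG/GSR/HR/HRV feature vectors (one row per trial).
X = rng.normal(size=(300, 40))
y = rng.integers(0, 2, size=300)      # 0 = negative, 1 = positive emotion
X[y == 1] += 0.8                      # make the synthetic classes separable

# ICA-based dimensionality reduction before classification.
X_ica = FastICA(n_components=10, random_state=0).fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(
    X_ica, y, test_size=0.3, random_state=0, stratify=y)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
gb = GradientBoostingClassifier(random_state=0)
voting = VotingClassifier([("rf", rf), ("gb", gb)], voting="soft")

accuracies = {}
for name, clf in [("RF", rf), ("Boosting", gb), ("Voting", voting)]:
    clf.fit(X_tr, y_tr)
    accuracies[name] = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: {accuracies[name]:.2f}")
```

To align the sketch with the thesis's reported models, GradientBoostingClassifier would be replaced by xgboost's XGBClassifier and the number of ICA components tuned to the real feature set.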

Abstract (Chinese) I
Abstract (English) II
Acknowledgments III
Table of Contents IV
List of Figures VII
List of Tables IX
Chapter 1 Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 2
1.3 Research Limitations 3
Chapter 2 Literature Review 4
2.1 Emotion Recognition 4
2.2 Emotion Induction Assessment 10
2.2.1 Subjective Assessment 11
2.2.2 Objective Assessment 12
2.3 Machine Learning Emotion Recognition Workflow 14
2.3.1 Feature Extraction 14
2.3.2 Feature Dimensionality Reduction 14
2.4 Classification Algorithms 15
2.4.1 Random Forest 15
2.4.2 eXtreme Gradient Boosting (XGBoost) 16
2.4.3 Ensemble Voting: Random Forest (RF) + XGBoost 16
Chapter 3 Research Methods 17
3.1 Experimental Design 17
3.1.1 Independent and Dependent Variables 17
3.1.2 Participants 17
3.1.3 Experimental Equipment 17
3.2 Experimental Content and Procedure 19
3.2.1 Emotion Induction with Videos 21
3.2.2 Emotion Induction with IAPS Images 21
3.3 Data Processing and Analysis Methods 22
3.3.1 Data Preprocessing 22
3.3.2 Classification Model Training 23
3.3.3 Model Performance Evaluation 24
Chapter 4 Results 25
4.1 Statistical Analysis 25
4.1.1 One-Way Repeated-Measures ANOVA for Emotional Videos 25
4.1.2 Emotion Dimension Analysis of IAPS Images 30
4.2 Machine Learning Model Performance 31
4.2.1 Model Performance for Six-Class and Two-Class Emotional Videos 31
4.2.2 Model Performance for Three-Class and Two-Class IAPS Images 36
Chapter 5 Discussion 41
5.1 Post-Experiment Interviews and Subjective Questionnaire Results 41
5.2 Statistical Comparison of Emotional Videos and IAPS Images 42
5.3 Machine Learning Model Performance Results 44
5.3.1 Model Performance Comparison for Emotional Videos and IAPS Images 44
5.3.2 Performance Comparison of Physiological Signal Combinations and Subjective Assessments 48
5.3.3 Overall Emotion Classification Model Comparison 50
Chapter 6 Conclusions and Future Work 53
6.1 Conclusions 53
6.2 Future Work 53
References 55
Appendix 1: Informed Consent Form 64
Appendix 2: Means and Standard Deviations of the Three Emotion Dimensions for Videos and IAPS Images 68
Appendix 3: Subjective and Objective LSD Pairwise Comparisons for Emotional Videos 73


Full-text release date: 2028/06/15 (campus network)
Full text not authorized for public release (off-campus network)
Full text not authorized for public release (National Central Library: Taiwan NDLTD system)