
Author: Ning Tang (唐寧)
Thesis title: Enhancing Turnover Prediction by Employee Attendance Behavior (利用員工出缺勤行為模式強化離職預測模型)
Advisor: Bi-Ru Dai (戴碧如)
Committee members: Jiun-Long Huang (黃俊龍), Bi-Ru Dai (戴碧如), Chih-Ya Shen (沈之涯), Yi-Ling Chen (陳怡伶)
Degree: Master
Department: College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering
Year of publication: 2019
Graduation academic year: 107
Language: English
Pages: 52
Keywords: Attendance, Employee turnover prediction, Classification, Deep learning, 1-D CNN, Bi-LSTM, 1-D CNN Bi-LSTM
Hits: 432; Downloads: 0

Nowadays, reducing the employee turnover rate is a major challenge for companies. If we can predict whether an employee will leave the company in the future, the company can react in advance and effectively retain talent. Unlike previous studies that analyze employees' basic personal information, this study focuses on attendance data, observing attendance behavior to identify the likelihood of turnover. We propose a neural network model, the SAB Model, which extracts each employee's monthly attendance trend as features for turnover prediction, and we further enhance it into the LAB Model, which also considers changes in an employee's attendance behavior over the past several months. Extensive experiments on a real-world dataset verify that, in most cases, the LAB Model significantly outperforms previously proposed methods.


Nowadays, reducing the employee turnover rate is a significant task for companies. If we can predict which employees are going to leave the company in the near future, the company can react in advance and effectively retain talent. Unlike most related works, which analyze the personal information of employees, this study focuses on attendance data as the key feature, observing changes in attendance behavior to identify the potential for turnover. We propose a neural network model named the Short Term Attendance Behavior Model (SAB Model) that extracts the monthly attendance trend for predicting employee turnover, and further extend it into the Long Term Attendance Behavior Model (LAB Model), which takes into account changes in attendance behavior over recent months. Extensive experiments conducted on a real-world dataset verify that the LAB Model considerably outperforms state-of-the-art employee turnover prediction methods in most cases.
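The SAB Model's core idea, extracting a short-term trend signal from a month of daily attendance records with a 1-D convolution, can be illustrated with a minimal NumPy sketch. Everything below (the hours-worked feature, the hand-picked difference kernel, the sample data) is an illustrative assumption, not the thesis's learned parameters or exact architecture.

```python
# A minimal sketch of 1-D convolution as a trend extractor over
# attendance data, assuming a single "daily hours worked" feature.
import numpy as np

def conv1d(seq, kernel):
    """Valid-mode 1-D convolution (cross-correlation) over a daily sequence."""
    k = len(kernel)
    return np.array([np.dot(seq[i:i + k], kernel)
                     for i in range(len(seq) - k + 1)])

# Hypothetical month of daily hours worked: steady early on,
# then a decline near month end that could hint at disengagement.
hours = np.array([8.0] * 20 + [7.0, 6.5, 6.0, 5.0, 4.0, 8.0, 4.0, 3.5, 3.0, 2.0])

# A simple difference kernel responds to downward trends in attendance.
trend_kernel = np.array([1.0, 0.0, -1.0])
trend = conv1d(hours, trend_kernel)

print(trend[:3])   # flat early-month region: zero response
print(trend[-1])   # late-month decline: positive response
```

In the SAB Model such kernels would be learned from data rather than hand-picked, and the LAB Model would additionally feed the monthly features of several consecutive months into a recurrent (Bi-LSTM) layer to track how the trend itself changes.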

Advisor's Recommendation Letter
Oral Examination Committee Approval
Abstract
Chinese Abstract
Acknowledgements
Table of Contents
List of Tables
List of Figures
1 Introduction
  1.1 Background
  1.2 Motivation and Contribution
  1.3 Thesis Organization
2 Related Work
3 Problem Formulation
4 Proposed Method
  4.1 Framework
  4.2 Data Preprocessing
    4.2.1 Time Alignment
    4.2.2 Class Imbalance Problem
  4.3 Feature Extraction
  4.4 Classification Model
    4.4.1 SAB Model
    4.4.2 LAB Model
5 Experiments
  5.1 Dataset
  5.2 Experiment Design
  5.3 Oversampling
  5.4 Extract Time Series Data
  5.5 Dynamically Adjust Data Length
  5.6 Attendance Feature Group
  5.7 Profile and Attendance Feature Importance
  5.8 Monthly Employee Turnover Discussion
  5.9 Employee Turnover Model Discussion
6 Conclusions and Future Works
References


Full-text release date: 2024/08/15 (campus network)
Full-text release date: not authorized for public access (off-campus network)
Full-text release date: not authorized for public access (National Central Library: Taiwan NDLTD system)