
Author: Zamroji Hariyanto (呼延建)
Thesis Title: Construction of Optimized Forecasting Model Using Deep Neural Networks Technology Based on PM2.5 Sensed Dataset (運用深度神經網絡技術建構最佳化預測模式 - 以PM2.5感測資料集為例)
Advisor: Nai-Wei Lo (羅乃維)
Committee: Yuan-Cheng Lai (賴源正), Bor-Shen Lin (林伯慎)
Degree: Master (碩士)
Department: Department of Information Management, College of Management (管理學院 - 資訊管理系)
Thesis Publication Year: 2019
Graduation Academic Year: 107
Language: English
Pages: 73
Keywords (in Chinese): PM2.5, Forecasting Optimization, Data Cleaning, Deep Neural Network, Deep Learning
Keywords (in other languages): PM2.5, Forecasting Optimization, Data Cleaning, Deep Neural Network, Deep Learning
Reference times: Clicks: 210; Downloads: 1
  • Technology in human life has advanced tremendously, bringing great convenience to people in many aspects of their lives. However, it also harms the environment, especially air quality. Owing to industrial production, pollutant concentrations have risen rapidly. Fine particulate matter (PM2.5), one of the most dangerous pollutants, is regarded as a main factor in the deterioration of public health. Many efforts have been made to monitor PM2.5 concentrations, and PM2.5 forecasts provide early warnings to the public. In forecasting, accuracy is the most challenging task, and a proper model must be constructed to achieve precise predictions. The Deep Neural Network (DNN) is an artificial intelligence technique that has proven able to solve several prediction problems. Therefore, this thesis proposes a forecasting optimization mechanism that employs the Golden Section Search and the Fruit Fly Optimization Algorithm, combined with a data cleansing mechanism, on DNN models. The proposed mechanism effectively optimizes three DNN models, the Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU), to achieve better forecasting accuracy for PM2.5 concentrations.
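The Golden Section Search named in the abstract can tune a single numeric hyper-parameter (for instance, a hidden-layer size). A minimal sketch follows, under the assumption that the validation loss is roughly unimodal in that parameter; `loss_fn` is a hypothetical stand-in for training a model and returning its validation error, and is not the thesis's exact procedure:

```python
import math

def golden_section_search(loss_fn, lo, hi, tol=1):
    """Shrink [lo, hi] around the minimizer of a unimodal loss.

    Each iteration evaluates the loss at two interior points placed
    by the golden ratio and discards the sub-interval that cannot
    contain the minimum; points are rounded because hyper-parameters
    such as layer sizes are integers.
    """
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    a, b = lo, hi
    while abs(b - a) > tol:
        c = b - (b - a) * inv_phi
        d = a + (b - a) * inv_phi
        if loss_fn(round(c)) < loss_fn(round(d)):
            b = d  # minimum lies in [a, d]
        else:
            a = c  # minimum lies in [c, b]
    return round((a + b) / 2)
```

The interval shrinks by a constant factor of about 0.618 per iteration, so narrowing a range of 128 candidate values to a single one takes roughly ten loss evaluations per side, which matters when each evaluation means training a DNN.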


    Cover i
    Recommendation Letter ii
    Qualification Letter iii
    Abstract iv
    Acknowledgments v
    Contents vi
    List of Figures viii
    List of Tables x
    Chapter 1 Introduction 1
    Chapter 2 Literature Review 5
    2.1 Related Work 5
    2.2 Multilayer Perceptron 6
    2.3 Recurrent Neural Network 7
    2.3.1 Long Short-Term Memory 8
    2.3.2 Gated Recurrent Unit 9
    2.4 Golden Section Search 10
    2.5 Fruit Fly Optimization Algorithm 11
    2.6 Anomaly Detection 14
    Chapter 3 Proposed Forecasting Model Optimization Mechanism 15
    3.1 Basic Forecasting Model Optimization Mechanism 15
    3.1.1 Data Observation 15
    3.1.2 Dataset Pre-processing 17
    3.1.3 Forecasting Model Architecture 18
    3.1.4 Model Optimization Process 20
    3.2 Forecasting Model Optimization Mechanism with Data Cleaning 21
    Chapter 4 Experiment and Results 25
    4.1 Environment and Setup 25
    4.1.1 Dataset Parameters 25
    4.1.2 Model Learning Parameters 26
    4.1.3 Optimization Parameters 27
    4.2 Experimental Scenarios and Results 28
    4.2.1 Scenario 1: Forecasting Model with standard hyper-parameters 29
    4.2.2 Scenario 2: Forecasting Model with optimized hyper-parameters 30
    4.2.3 Scenario 3: Forecasting Model with optimized hyper-parameters and removing each anomaly data 36
    4.2.4 Scenario 4: Forecasting Model with optimized hyper-parameters and removing each anomaly data and the data which in its radius 39
    4.2.5 Scenario 5: Forecasting Model with optimized hyper-parameters and replacing anomaly data with its radius mean 41
    4.3 Comparison of Five Scenarios 43
    4.4 Comparison with previous researches 47
    4.5 Evaluation 48
    4.5.1 Meteorological factors affect the forecasting result 51
    4.5.2 The distance to trained dataset location affect the forecasting result 53
    4.5.3 The optimized DNN models have equivalent forecasting results 54
    4.6 Discussion 55
    Chapter 5 Conclusion 58
    References 60
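Scenarios 3 to 5 in the outline above differ in how anomalous sensor readings are handled: removed alone, removed with their neighborhood, or replaced by the mean of the readings within a radius. A minimal sketch of the replacement variant, assuming a simple z-score criterion for flagging anomalies (the threshold and radius here are illustrative choices, not the thesis's exact settings):

```python
import statistics

def clean_series(values, z_thresh=2.0, radius=2):
    """Replace flagged outliers with the mean of nearby readings.

    A reading is flagged when its z-score against the whole series
    exceeds `z_thresh`; it is then replaced by the mean of the
    readings within `radius` positions on either side of it.
    """
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    cleaned = list(values)
    for i, v in enumerate(values):
        if stdev > 0 and abs(v - mean) / stdev > z_thresh:
            lo = max(0, i - radius)
            hi = min(len(values), i + radius + 1)
            neighbors = [values[j] for j in range(lo, hi) if j != i]
            cleaned[i] = statistics.fmean(neighbors)
    return cleaned
```

For a PM2.5 series such as `[10, 11, 9, 10, 100, 10, 11, 9]`, the spike of 100 is flagged and replaced by the mean of its four neighbors, while normal readings pass through unchanged.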

