
Student: Anindhita Dewabharata
Thesis Title: Development of Data-driven Approaches for the Prediction of Building Energy Consumption
Advisor: Shuo-Yan Chou
Committee Members: Jen-Ming Chen, Tiffany Hui-Kuang Yu, Vincent F. Yu, Shih-Che Lo
Degree: Doctoral
Department: College of Management - Department of Industrial Management
Year of Publication: 2022
Graduation Academic Year: 110 (2021-2022, ROC calendar)
Language: English
Number of Pages: 80
Keywords: building, electricity consumption, feature engineering and selection, decomposition, ensemble empirical mode decomposition, long short-term memory, ensemble model, XGBoost
Views: 287; Downloads: 0


This study presents a forecasting framework for the electricity consumption of buildings. The framework combines feature engineering, a decomposition method, and forecasting models for univariate single-step (one hour ahead), univariate multi-step (24 hours ahead), and multivariate multi-step predictions. The proposed framework is applied to a benchmark dataset and an actual dataset consisting of 12 buildings in total. The buildings serve multiple functions in the educational sector and are located in three different countries.
The experimental results reveal that the mean absolute percentage error (MAPE) can be reduced by 23% for univariate single-step forecasting and by 16% for multi-step forecasting when ensemble empirical mode decomposition (EEMD) is combined with long short-term memory (LSTM). Meanwhile, applying the Ensemble-XGBoost and Ensemble-XGBoost with Encoder-Decoder models to multivariate multi-step prediction reduces the MAPE by 15% compared with using LSTM with decomposition or multivariate LSTM.
Furthermore, this research found that feature selection improves prediction accuracy compared with using all available features, because it removes features that might act as noise for the forecasting model. In addition to the proposed framework, this research recommends a forecasting model for each building, thereby enriching the literature on building energy forecasting approaches. Finally, the proposed framework can also be applied to real cases of electricity consumption prediction.
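The single-step (one hour ahead) and multi-step (24 hours ahead) settings described above differ only in how the series is framed as supervised learning samples. A minimal sketch of that framing for a univariate hourly series (the lag and horizon values here are illustrative, not the thesis's actual configuration):

```python
def make_windows(series, n_lags, horizon):
    """Frame a univariate series as supervised (X, y) pairs.

    Each sample uses the previous `n_lags` observations to predict
    the next `horizon` values: horizon=1 for single-step forecasting,
    horizon=24 for a full day ahead of hourly data.
    """
    X, y = [], []
    for i in range(len(series) - n_lags - horizon + 1):
        X.append(series[i:i + n_lags])
        y.append(series[i + n_lags:i + n_lags + horizon])
    return X, y

# Toy hourly consumption series: 48 readings (two days).
hourly = list(range(48))
X1, y1 = make_windows(hourly, n_lags=24, horizon=1)    # single-step
X24, y24 = make_windows(hourly, n_lags=24, horizon=24)  # multi-step
```

The same pairs can then be fed to any of the models in the framework; for a multivariate setting, each window row would simply carry extra feature columns alongside the lagged consumption values.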
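The MAPE figures quoted above use the standard mean absolute percentage error, and because EEMD is an additive decomposition, a decomposition-based pipeline forecasts each component separately and sums the component forecasts back into the final prediction. A self-contained sketch with made-up toy numbers (not the thesis's data):

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(
        abs((a - f) / a) for a, f in zip(actual, forecast)
    ) / len(actual)

# In an EEMD-based pipeline, each intrinsic mode function (IMF) is
# forecast separately; the per-IMF forecasts are summed back into the
# final prediction, since EEMD decomposes the signal additively.
imf_forecasts = [[1.0, 2.0], [0.5, 0.5], [10.0, 10.0]]  # toy components
combined = [sum(vals) for vals in zip(*imf_forecasts)]

actual = [12.0, 12.0]
error = mape(actual, combined)
```

Producing the IMFs themselves requires an EEMD implementation (e.g. the decomposition described by Wu and Huang); the sketch only shows the recombination and evaluation steps.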
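The feature-selection step described above can be sketched as thresholding per-feature importance scores, keeping only features that contribute meaningfully. In the thesis the scores come from XGBoost; the feature names, scores, and threshold below are hypothetical, for illustration only:

```python
def select_features(importances, threshold):
    """Keep features whose importance score exceeds the threshold.

    `importances` maps feature name -> importance score (e.g. XGBoost
    gain); low-scoring features are dropped as likely noise.
    """
    return [name for name, score in importances.items() if score > threshold]

# Hypothetical importance scores for a building's weather/calendar features.
scores = {
    "temperature": 0.41,
    "hour_of_day": 0.33,
    "day_of_week": 0.18,
    "wind_gust": 0.02,  # low importance: likely noise for this building
}
selected = select_features(scores, threshold=0.05)
```

Only the surviving features are then passed to the forecasting model, which is the mechanism by which removing noisy features can improve accuracy relative to using all features.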

TABLE OF CONTENTS
ABSTRACT (Chinese)
ABSTRACT (English)
ACKNOWLEDGEMENTS
LIST OF TABLES
LIST OF FIGURES
Chapter 1 INTRODUCTION
    1.1 Research Background
    1.2 Research Objectives
    1.3 Research Scope and Limitations
    1.4 Thesis Organization
Chapter 2 LITERATURE STUDY
    2.1 Recurrent Neural Network
    2.2 Ensemble Empirical Mode Decomposition
    2.3 Data Clustering using K-Shape Clustering Algorithm
    2.4 XGBoost for Feature Selection and Ensemble Methods
Chapter 3 METHODOLOGY
    3.1 Data Collection
    3.2 Data Pre-processing
    3.3 Data Decomposition
    3.4 Forecasting Using LSTM Algorithm
    3.5 Forecasting Model
        3.5.1 Univariate Single-step Forecasting
        3.5.2 Univariate and Multivariate Multi-step Forecasting
        3.5.3 Multivariate Multi-step Forecasting with Encoder-Decoder
        3.5.4 Ensemble Method
    3.6 Evaluation
Chapter 4 RESULT AND DISCUSSION
    4.1 Dataset Description
    4.2 Statistical Description Analysis
    4.3 Data Visualization
    4.4 Feature Engineering and Selection Results
    4.5 Parameter Setting
    4.6 Forecasting Results and Discussion
        4.6.1 Univariate Single-step and Multi-step Forecasting Results
        4.6.2 Forecasting Results for Multivariate Data
        4.6.3 The Ensemble-XGBoost
        4.6.4 The Ensemble-XGBoost with Encoder-Decoder
Chapter 5 CONCLUSION AND FUTURE RESEARCH
    5.1 Conclusion
    5.2 Contributions
    5.3 Future Research
REFERENCES
APPENDIX
AUTHOR INTRODUCTION


Full text release date: 2024/08/15 (campus network)
Full text release date: 2024/08/15 (off-campus network)
Full text release date: 2024/08/15 (National Central Library: Taiwan NDLTD system)