Graduate Student: 鄧利勝 (VICTOR ANDREAN)
Thesis Title: A Parallel Bidirectional Long Short-Term Memory Model for Energy Disaggregation
Advisor: 連國龍 (Kuo-Lung Lian)
Committee Members: 鮑興國 (Hsing-Kuo Pao), 李育杰 (Yuh-Jye Lee), 花凱龍 (Kai-Lung Hua), 蔡孟伸 (Men-Shen Tsai)
Degree: Master
Department: Department of Electrical Engineering, College of Electrical Engineering and Computer Science
Year of Publication: 2019
Academic Year of Graduation: 107
Language: English
Number of Pages: 58
Chinese Keywords: Non-Intrusive Load Monitoring, Bidirectional Long Short-Term Memory, Energy Disaggregation
Foreign Keywords: Non-Intrusive Load Monitoring, Bidirectional Long Short-Term Memory, Energy Disaggregation
Abstract: Non-intrusive load monitoring (NILM) is an elegant solution for monitoring energy consumption. NILM has gained popularity with the advances in machine learning and deep learning techniques. In recent years, several deep learning techniques have been proposed for NILM, and the results show that deep learning models can outperform prior state-of-the-art NILM models such as the Factorial Hidden Markov Model. A NILM model needs to identify the distinctive power patterns of individual appliances in order to monitor their power consumption. Statistical features (SFs), such as the power difference and the difference of variant power, can be utilized to help the network learn better. Since no single model fits every case, our empirical research finds that a particular SF can be useful for a certain type of load. This paper proposes a parallel bidirectional long short-term memory model with SFs to improve the learning capability of the network. The proposed method is tested against some of the most recent deep learning models for NILM, such as DCNN, GLU-Res, BLSTM, and AE. The proposed method outperforms these methods and shows consistent results.
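The abstract pairs statistical features (SFs) with a parallel BLSTM. As a rough illustration only, the sketch below (Python/Keras) shows one plausible way to feed per-channel BLSTM branches with SF inputs; the window length, layer sizes, and the exact SF definitions (a first difference and a rolling-variance difference) are assumptions for illustration, not the configuration used in the thesis.

```python
# Minimal sketch (not the thesis's exact architecture): a parallel BLSTM for
# energy disaggregation, where each branch processes one input channel
# (raw aggregate power plus assumed statistical features).
import numpy as np
from tensorflow.keras import layers, Model

WINDOW = 100  # assumed sliding-window length (samples)

def statistical_features(power: np.ndarray) -> np.ndarray:
    """Stack the raw window with two assumed SFs: the first difference
    and the difference of a rolling variance."""
    diff = np.diff(power, prepend=power[:1])                  # power difference
    var = np.array([power[max(0, i - 10):i + 1].var() for i in range(len(power))])
    var_diff = np.diff(var, prepend=var[:1])                  # difference of variance
    return np.stack([power, diff, var_diff], axis=-1)         # shape: (WINDOW, 3)

def build_parallel_blstm(n_channels: int = 3) -> Model:
    inp = layers.Input(shape=(WINDOW, n_channels))
    branches = []
    for i in range(n_channels):
        # One BLSTM branch per input channel, run in parallel.
        chan = layers.Lambda(lambda x, i=i: x[..., i:i + 1])(inp)
        branches.append(layers.Bidirectional(layers.LSTM(64))(chan))
    merged = layers.concatenate(branches)
    out = layers.Dense(WINDOW, activation="relu")(merged)     # per-sample appliance power
    return Model(inp, out)

model = build_parallel_blstm()
model.compile(optimizer="adam", loss="mse")
```

In this sketch each branch sees only one channel so that an SF which helps with a particular load type, as the abstract suggests, can be learned by its own branch before the branches are merged.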

Table of Contents:
Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
CHAPTER 1 INTRODUCTION
  1.1 Background
  1.2 Problem Statement
  1.3 Methodology
  1.4 Outline
CHAPTER 2 RELATED WORKS
  2.1 Optimization-based Approach
  2.2 Learning-based Approach
CHAPTER 3 NILM SYSTEM AND DATA PREPROCESSING
  3.1 Data Scaling
  3.2 Window Length Selection
  3.3 Input to Output Relation (IOR)
CHAPTER 4 STATE OF THE ART OF NILM MODELS
  4.1 Deep Convolutional Neural Network (DCNN)
  4.2 GLU-Res
  4.3 Bidirectional Long Short-Term Memory (BLSTM)
  4.4 Autoencoder (AE)
CHAPTER 5 PROPOSED METHOD
  5.1 Feature Extractor
  5.2 Deep Neural Network (DNN) Model
CHAPTER 6 EXPERIMENT AND RESULT
CHAPTER 7 CONCLUSION & FUTURE WORK
  7.1 Conclusion
  7.2 Future Work
REFERENCES

Full-Text Availability: 2024/01/31 (campus network)
Full-Text Availability: not authorized for public release (off-campus network)
Full-Text Availability: not authorized for public release (National Central Library: Taiwan Thesis and Dissertation System)