
Graduate Student: Dinar Nugroho Pratomo (南宮濤)
Thesis Title: Optimized Extreme Learning Machine using Fruit Fly Optimization for Classification
Advisor: Yungho Leu (呂永和)
Committee Members: Wei-Ning Yang (楊維寧), Yun-Shiow Chen (陳雲岫), Agus Harjoko, Suprapto, Agus Sihabuddin
Degree: Master
Department: Department of Information Management, School of Management
Year of Publication: 2019
Academic Year of Graduation: 107
Language: English
Number of Pages: 71
Keywords: Extreme learning machine, fruit fly optimization algorithm, classification
Abstract: Extreme Learning Machine (ELM) is an artificial-neural-network learning method with several advantages, including fast learning speed, good generalization performance, and high accuracy. One of its weaknesses, however, is that the number of hidden nodes cannot be determined without manual trial and error, so there is no systematic way to choose the number of hidden nodes that yields good results. In this study we propose using a new swarm-intelligence algorithm, the Fruit Fly Optimization Algorithm (FOA), to optimize ELM by finding the optimal number of hidden nodes and the values of the input weights and biases. We tested the proposed method on several classification datasets from the UCI repository and compared it with other optimization-based approaches. The experimental results show that optimizing ELM with FOA yields higher accuracy than PSO-ELM, E-ELM, OP-ELM, VPSO-ELM, IPSO-ELM, and the original ELM.
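
For readers unfamiliar with the base model, the following is a minimal sketch of ELM training as described in the abstract: the input weights and biases of the single hidden layer are drawn at random, and the output weights are computed analytically with the Moore-Penrose pseudoinverse. The function names, the sigmoid activation, and the one-hot target encoding are illustrative assumptions, not the thesis implementation.

```python
# Minimal ELM sketch (assumed details: sigmoid activation, one-hot targets,
# NumPy pseudoinverse). Illustrative only, not the thesis code.
import numpy as np

def elm_fit(X, Y, n_hidden, seed=0):
    """Train an ELM: random hidden-layer weights/biases, analytic output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ Y                      # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Class scores for new samples; argmax over columns gives the predicted class."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because the only trained parameters are the output weights beta, training reduces to a single pseudoinverse computation, which is what gives ELM its fast learning speed.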


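The abstract also states that FOA is used to search for the number of hidden nodes (and, in later variants, the weights and biases). Below is a simplified FOA-style sketch of that search under assumed settings: the swarm location encodes a candidate node count, each fruit fly takes a random flight around it, and the swarm moves toward the fly with the best "smell" (here, a user-supplied fitness such as validation accuracy). The parameter values and the fitness callable are hypothetical, not those used in the thesis.

```python
# Simplified FOA-style search over the number of hidden nodes.
# Assumptions: integer search range, random-flight radius of 20,
# greedy move to the best candidate found so far.
import numpy as np

def foa_search_hidden_nodes(fitness, lo=10, hi=500, n_flies=20, n_iter=30, seed=0):
    """Return (best hidden-node count, its fitness) found by the swarm."""
    rng = np.random.default_rng(seed)
    swarm = rng.uniform(lo, hi)                        # swarm location encodes the node count
    best_pos, best_fit = int(round(swarm)), -np.inf
    for _ in range(n_iter):
        # each fly takes a random flight around the current swarm location
        flights = swarm + rng.uniform(-20.0, 20.0, size=n_flies)
        candidates = np.clip(np.rint(flights), lo, hi).astype(int)
        fits = np.array([fitness(int(c)) for c in candidates])
        i = int(np.argmax(fits))                       # fly with the strongest "smell"
        if fits[i] > best_fit:
            best_fit, best_pos = float(fits[i]), int(candidates[i])
        swarm = float(best_pos)                        # swarm flies toward the best fly
    return best_pos, best_fit

# Usage (assumed): fitness(n) trains an ELM with n hidden nodes, e.g. via the
# elm_fit/elm_predict sketch above, and returns validation accuracy.
```

For the variants that also optimize the weights and biases (Sections 3.2.2 and 3.2.3 of the thesis), a vector-valued swarm location would play the same role in this loop.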

    Table of Contents:
    ABSTRACT
    ACKNOWLEDGEMENT
    TABLE OF CONTENTS
    LIST OF FIGURES
    LIST OF TABLES
    Chapter 1. Introduction
        1.1 Motivation
        1.2 Research Problem
        1.3 Research Objective
            1.3.1 General Objective
            1.3.2 Specific Objectives
        1.4 Thesis Overview
            1.4.1 Chapter 2 Problem Definitions
            1.4.2 Chapter 3 Proposed Method
            1.4.3 Chapter 4 Experimental Result
            1.4.4 Chapter 5 Conclusion and Future Work
    Chapter 2. Problem Definition
        2.1 Neural Network
        2.2 Single-Hidden Layer Feedforward Neural Networks (SLFNs)
        2.3 Extreme Learning Machine (ELM)
        2.4 Fruit Fly Optimization Algorithm
        2.5 Classification
        2.6 Classification Performance Measure
    Chapter 3. Optimizing Extreme Learning Machine using Fruit Fly Optimization Algorithm
        3.1 Datasets
        3.2 FOA-ELM
            3.2.1 FOA-ELM optimizing the number of hidden nodes
            3.2.2 FOA-ELM optimizing the values of the weights and the biases
            3.2.3 FOA-ELM optimizing the number of hidden nodes, the values of the weights and the biases
    Chapter 4. Experimental Result
        4.1 Benchmark Datasets
            4.1.1 Binary Class Datasets
            4.1.2 Multiclass Datasets
        4.2 Simulation Environment Settings
        4.3 Experimental Setup
        4.4 Performance Comparison of Real-World Benchmark Datasets
    Chapter 5. Conclusions and Future Research
        5.1 Result Summary
        5.2 Limitation
        5.3 Future Research
    REFERENCES

