
Author: FERANI EVA ZULVIA
Title: Cluster analysis using a gradient evolution algorithm
Advisor: Ren-Jieh Kuo
Committee Members: Jiuh-Biing Sheu, Mu-Chen Chen, Chieh-Yuan Tsai, Kai-Ying Chen, Kung-Jeng Wang, Chao Ou-Yang
Degree: Doctoral
Department: College of Management, Department of Industrial Management
Year of Publication: 2015
Graduation Academic Year: 103
Language: English
Number of Pages: 138
Keywords: Gradient evolution algorithm, Gradient evolution-based K-means algorithm, Metaheuristics, Cluster analysis, Optimization
Views: 389; Downloads: 0


    This study presents a new metaheuristic derived from gradient-based search. In exact optimization methods, the gradient is used to locate extreme points as well as the optimum. This study modifies the gradient method to create a new metaheuristic whose basic updating rule is built on a gradient theorem. The new method, named the gradient evolution (GE) algorithm, explores the search space with a set of vectors and comprises three major operators: vector updating, vector jumping, and vector refreshing. Vector updating is the main updating rule of the GE algorithm; its search direction is determined by the Newton-Raphson method. Vector jumping and vector refreshing allow the method to escape local optima. To evaluate the performance of the GE algorithm, three experiments are conducted using fifteen test functions. The first experiment measures the influence of the parameter settings on the results and identifies the best setting. A comparison with basic and improved metaheuristic methods follows. The experimental results show that, on most of the benchmark problems tested, the GE algorithm performs better than, or as well as, other metaheuristics such as the particle swarm optimization algorithm, the differential evolution algorithm, the artificial bee colony algorithm, and the continuous genetic algorithm. Finally, the proposed GE algorithm is applied to the clustering problem by embedding it in the K-means algorithm. Two clustering approaches are presented: a single-objective and a multi-objective clustering method. Experimental results on benchmark datasets indicate that the GE-based K-means algorithm outperforms other metaheuristic-based K-means algorithms.
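    The Newton-Raphson-style vector updating described in the abstract can be illustrated with a small sketch. This is not the thesis's exact GE update rule (which, per the abstract, derives its search direction from a population of vectors rather than from numerical derivatives); it is a minimal illustration of the underlying principle: a per-coordinate Newton-Raphson move estimated by finite differences, applied here to a simple sphere objective chosen for the example.

    ```python
    import numpy as np

    def sphere(x):
        # Simple quadratic benchmark objective, minimized at the origin.
        return float(np.sum(x ** 2))

    def newton_step(f, x, h=1e-3):
        """Per-coordinate Newton-Raphson step via central finite differences.

        Illustrative only: the GE algorithm in the thesis obtains its
        derivative information from neighbouring vectors in the population;
        this sketch uses plain numerical derivatives instead.
        """
        step = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            g = (f(x + e) - f(x - e)) / (2 * h)            # ~ first derivative
            c = (f(x + e) - 2 * f(x) + f(x - e)) / h ** 2  # ~ second derivative
            if abs(c) > 1e-12:
                step[i] = -g / c                           # Newton-Raphson move
        return step

    x = np.array([2.0, -1.5])
    for _ in range(5):
        x = x + newton_step(sphere, x)
    # x converges numerically to the minimizer at the origin
    ```

    On a quadratic objective a single Newton step already lands at the optimum; GE keeps the directional idea but adds vector jumping and refreshing so that the population can also escape local optima on multimodal functions.
    
    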

    ABSTRACT (Chinese) i
    ABSTRACT ii
    ACKNOWLEDGEMENTS iii
    TABLE OF CONTENTS iv
    LIST OF TABLES vi
    LIST OF FIGURES ix
    Chapter 1 INTRODUCTION 1
    1.1 Research Background 1
    1.2 Research Objectives 3
    1.3 Research Scope and Limitations 3
    1.4 Thesis Organization 3
    Chapter 2 LITERATURE STUDY 5
    2.1 The Gradient 5
    2.2 Gradient-Based Algorithms 7
    2.3 Clustering 10
    2.3.1 K-means algorithm 10
    2.3.2 Metaheuristic-based clustering methods 11
    2.4 Multi-Objective Optimization Using Metaheuristic Methods 12
    Chapter 3 METHODOLOGY 14
    3.1 Solution Representation and Initialization 15
    3.2 Vector Updating 16
    3.3 Vector Jumping 19
    3.4 Vector Refreshing 20
    3.5 The GE Algorithm 20
    Chapter 4 EXPERIMENTAL RESULTS 23
    4.1 Test Functions 23
    4.2 Experimental Setup 23
    4.3 Analysis of Parameter Setting 25
    4.4 A Comparison with Basic Metaheuristic Methods 26
    Chapter 5 APPLICATION OF GRADIENT EVOLUTION ALGORITHM TO SINGLE-OBJECTIVE CLUSTERING 39
    5.1 Gradient Evolution Algorithm for Clustering 39
    5.2 Datasets and Experimental Setup 41
    5.3 Analysis of Parameter Setting towards Single-objective Clustering Result 42
    5.4 A Comparison with Other Metaheuristic-based Clustering Algorithms 43
    Chapter 6 APPLICATION OF GRADIENT EVOLUTION ALGORITHM TO MULTI-OBJECTIVE CLUSTERING 49
    6.1 Gradient Evolution Algorithm for Multi-objective Clustering 49
    6.2 Datasets and Experimental Setup 51
    6.3 Analysis of Parameter Setting in GE-based Multi-objective Clustering 51
    6.4 A Comparison with Other Metaheuristic-based Clustering Algorithms 53
    Chapter 7 CONCLUSIONS 59
    7.1 Conclusions 59
    7.2 Contributions 60
    7.3 Future Research 60
    REFERENCES 62
    APPENDIX I COMPUTATIONAL RESULTS FOR OPTIMIZATION PROBLEM 66
    APPENDIX II COMPUTATIONAL RESULTS FOR SINGLE-OBJECTIVE CLUSTERING 105
    APPENDIX III COMPUTATIONAL RESULTS FOR MULTI-OBJECTIVE CLUSTERING 115
    AUTHOR INTRODUCTION 125


    Full-text release date: 2020/07/21 (campus network)
    Full-text release date: not authorized for public release (off-campus network)
    Full-text release date: not authorized for public release (National Central Library: Taiwan NDLTD system)