
SCIENCE CHINA Information Sciences, Volume 62, Issue 10: 202101 (2019). https://doi.org/10.1007/s11432-018-9852-8

Wave models and dynamical analysis of evolutionary algorithms

  • Received: Oct 18, 2018
  • Accepted: Mar 29, 2019
  • Published: Sep 3, 2019

Abstract


Acknowledgment

This work was supported by the National Natural Science Foundation of China (Grant No. 61672391).


References

[1] Back T. Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms. Oxford: Oxford University Press, 1996.

[2] Wang F, Zhang H, Li K. A hybrid particle swarm optimization algorithm using adaptive learning strategy. Inf Sci, 2018, 436-437: 162-177.

[3] Guo S M, Yang C C. Enhancing differential evolution utilizing eigenvector-based crossover operator. IEEE Trans Evol Comput, 2015, 19: 31-49.

[4] Wegener I. Methods for the analysis of evolutionary algorithms on pseudo-boolean functions. In: Evolutionary Optimization. Boston: Springer, 2003. 349-369. DOI: 10.17877/DE290R-15272.

[5] Beyer H G. Convergence analysis of evolutionary algorithms that are based on the paradigm of information geometry. Evol Comput, 2014, 22: 679-709.

[6] Derrac J, García S, Hui S. Analyzing convergence performance of evolutionary algorithms: a statistical approach. Inf Sci, 2014, 289: 41-58.

[7] Tan C J, Neoh S C, Lim C P. Application of an evolutionary algorithm-based ensemble model to job-shop scheduling. J Intell Manuf, 2019, 30: 879-890.

[8] Wu H, Kuang L, Wang F. A multiobjective box-covering algorithm for fractal modularity on complex networks. Appl Soft Comput, 2017, 61: 294-313.

[9] Goldberg D E, Segrest P. Finite Markov chain analysis of genetic algorithms. In: Proceedings of the 2nd International Conference on Genetic Algorithms, Cambridge, 1987. 1: 1.

[10] Rudolph G. Finite Markov chain results in evolutionary computation: a tour d'horizon. Fund Inform, 1998, 35: 67-89.

[11] He J, Yao X. Drift analysis and average time complexity of evolutionary algorithms. Artif Intell, 2001, 127: 57-85.

[12] Sudholt D. A new method for lower bounds on the running time of evolutionary algorithms. IEEE Trans Evol Comput, 2013, 17: 418-435.

[13] Yu Y, Qian C, Zhou Z H. Switch analysis for running time analysis of evolutionary algorithms. IEEE Trans Evol Comput, 2015, 19: 777-792.

[14] Bian C, Qian C, Tang K. A general approach to running time analysis of multi-objective evolutionary algorithms. In: Proceedings of the 27th International Joint Conference on Artificial Intelligence (IJCAI), Stockholm, 2018. 1405-1411.

[15] Mori N, Yoshida J, Tamaki H, et al. A thermodynamical selection rule for the genetic algorithm. In: Proceedings of the IEEE International Conference on Evolutionary Computation, Perth, 1995. 1: 188.

[16] Cornforth T W, Lipson H. A hybrid evolutionary algorithm for the symbolic modeling of multiple-time-scale dynamical systems. Evol Intell, 2015, 8: 149-164.

[17] Li Y X, Zou X F, Kang L S. A new dynamical evolutionary algorithm based on statistical mechanics. J Comput Sci Technol, 2003, 18: 361-368.

[18] Li Y X, Xiang Z L, Xia J N. Dynamical system models and convergence analysis for simulated annealing algorithm. Chinese Journal of Computers, 2018, 1-13. http://kns.cnki.net/kcms/detail/11.1826.TP.20181114.1046.002.html.

[19] Li Y X, Xiang Z L, Zhang W Y. A relaxation model and time complexity analysis for simulated annealing algorithm. Chinese Journal of Computers, 2018, to be published.

[20] Zhou Y L. One-Dimensional Unsteady Hydrodynamics. Beijing: Science China Press, 1998.

[21] Lamb H. Hydrodynamics. Cambridge: Cambridge University Press, 1993.

[22] Gu C H, Li D Q. Mathematical Physics Equations. Beijing: People's Education Press, 1982.

[23] Zhang Y. Expansion Waves and Shock Waves. Beijing: Peking University Press, 1983.

[24] Shi Y, Eberhart R C. Empirical study of particle swarm optimization. In: Proceedings of the IEEE Congress on Evolutionary Computation, Washington, 1999. 3: 1945-1950.

[25] Liang J J, Qu B Y, Suganthan P N. Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization. Zhengzhou University and Nanyang Technological University, Technical Report, 2013.

[26] Črepinšek M, Liu S H, Mernik M. Exploration and exploitation in evolutionary algorithms: a survey. ACM Comput Surv, 2013, 45: 1-33.

[27] Liu S H, Mernik M, Bryant B R. To explore or to exploit: an entropy-driven approach for evolutionary algorithms. Int J Knowledge-Based Intell Eng Syst, 2009, 13: 185-206.

[28] Tang K, Yang P, Yao X. Negatively correlated search. IEEE J Sel Areas Commun, 2016, 34: 542-550.

[29] Ursem R K. Diversity-guided evolutionary algorithms. In: Proceedings of the International Conference on Parallel Problem Solving from Nature, Berlin, 2002. 462-471.

  • Figure 1

    (Color online) Illustration showing (a) the rightward linear wave and (b) clusters of characteristic lines.

  • Figure 2

    (Color online) Wave propagation for $f(x)$, showing the (a) particle density distribution, (b) volume elasticity coefficient $\kappa$, and (c) wave propagation speed $a$ over time.

  • Figure 3

    (Color online) (a) ${\rm TSP}(n=50)$; (b) ${\rm TSP}(n=100)$.

  • Figure 4

    (Color online) PSO applied to (a) $f_1$ and (b) $f_2$.

  • Figure 5

    (Color online) DE applied to (a) $f_1$ and (b) $f_2$.

  • Figure 6

    (Color online) PSO vs. DE, applied to (a) $f_1$ and (b) $f_2$.

  • Figure 7

    (Color online) $a$ vs. $a_{sw}$, for (a) $f_1$ and (b) $f_2$.

  • Table 1   Test functions used for PSO and DE

    $f_1$ (Shifted and Rotated Rosenbrock [25]): $f_1(y)=\sum_{i=1}^{D-1}\big(100(y_{i}^2-y_{i+1})^2+(y_i-1)^2\big)+F_{4}^*$, where $y=M\big(\frac{2.048z}{100}\big)+1$, $M$ is a rotation matrix, $z=x-o$, and $o=[o_1,o_2,\ldots,o_n]$ is the shifted global optimum. Dimension $D=30$; search range $[-100,100]^{D}$; $f_{\min}=400$.

    $f_2$ (Shifted and Rotated Griewank [25]): $f_2(y)=\sum_{i=1}^{D}\frac{y_{i}^2}{4000}-\prod_{i=1}^{D}\cos\big(\frac{y_i}{\sqrt{i}}\big)+1+F_{7}^*$, where $y=M\big(\frac{600z}{100}\big)$, $M$ is a rotation matrix, $z=x-o$, and $o=[o_1,o_2,\ldots,o_n]$ is the shifted global optimum. Dimension $D=30$; search range $[-100,100]^{D}$; $f_{\min}=700$.
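
    For concreteness, the sketch below shows one way to evaluate the two benchmarks of Table 1 in Python. It is a minimal sketch only: the rotation matrix M and the shift vector o here are placeholder assumptions (a random orthogonal matrix and a random shift); the official M, o, and the bias values $F_4^*=400$, $F_7^*=700$ are defined by the CEC 2014 data files accompanying [25].

    import numpy as np

    # Minimal sketch of the Table 1 benchmarks; M and o are placeholders,
    # not the official CEC 2014 rotation/shift data from [25].
    D = 30
    rng = np.random.default_rng(0)
    M, _ = np.linalg.qr(rng.standard_normal((D, D)))  # placeholder rotation matrix
    o = rng.uniform(-80.0, 80.0, D)                   # placeholder shifted optimum
    F4_STAR, F7_STAR = 400.0, 700.0                   # bias values from Table 1

    def f1(x):
        """Shifted and Rotated Rosenbrock (Table 1, f_min = 400)."""
        y = M @ (2.048 * (x - o) / 100.0) + 1.0
        return np.sum(100.0 * (y[:-1] ** 2 - y[1:]) ** 2 + (y[:-1] - 1.0) ** 2) + F4_STAR

    def f2(x):
        """Shifted and Rotated Griewank (Table 1, f_min = 700)."""
        y = M @ (600.0 * (x - o) / 100.0)
        i = np.arange(1, D + 1)
        return np.sum(y ** 2 / 4000.0) - np.prod(np.cos(y / np.sqrt(i))) + 1.0 + F7_STAR

    # At x = o, y reduces to the all-ones vector for f1 and to zero for f2,
    # so both functions attain their minima: f1(o) = 400, f2(o) = 700.
    print(f1(o), f2(o))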