SCIENCE CHINA Information Sciences, Volume 64 , Issue 5 : 150103(2021) https://doi.org/10.1007/s11432-020-3114-y

## On the robustness of median sampling in noisy evolutionary optimization

• Accepted Oct 16, 2020
• Published Apr 8, 2021

### Acknowledgment

This work was supported by the National Key Research and Development Program of China (Grant No. 2017YFB1003102), the National Natural Science Foundation of China (Grant Nos. 62022039, 61672478, 61876077), and the MOE University Scientific-Technological Innovation Plan Program. The authors would like to thank the anonymous reviewers for their helpful comments and suggestions on this work.



Algorithm 1 (1+1)-EA

Let $x$ be a uniformly chosen solution in $\{0,1\}^n$;

while the stopping criterion is not satisfied do

$z \leftarrow$ a copy of $x$;

Flip each bit of $z$ independently with probability $1/n$;

if $f^{\mathrm{n}}(z) \geq f^{\mathrm{n}}(x)$ then

$x \leftarrow z$;

end if

end while
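As a concrete illustration, the pseudocode above can be sketched in Python. This is a minimal sketch, not the authors' implementation: `f_noisy` stands for the noisy fitness $f^{\mathrm{n}}$, and the optional parameter `k` (an assumption added here to reflect the paper's theme) takes the median of $k$ independent noisy evaluations per comparison, with `k = 1` recovering the plain Algorithm 1. Since both sides of the comparison are noisy, the parent is re-evaluated in every iteration.

```python
import random
import statistics

def one_plus_one_ea(f_noisy, n, k=1, max_evals=10_000, seed=None):
    """(1+1)-EA maximizing a noisy fitness f_noisy over {0,1}^n.

    Each comparison uses the median of k independent noisy evaluations
    of each solution (k = 1 disables median sampling).  Stops after the
    evaluation budget max_evals is exhausted; returns the final solution.
    """
    rng = random.Random(seed)

    def fitness(x):
        # Median of k independent noisy evaluations of x.
        return statistics.median(f_noisy(x) for _ in range(k))

    # Uniformly chosen initial solution in {0,1}^n.
    x = [rng.randrange(2) for _ in range(n)]
    evals = 0
    while evals + 2 * k <= max_evals:
        # Offspring: flip each bit of a copy of x independently w.p. 1/n.
        z = [1 - b if rng.random() < 1.0 / n else b for b in x]
        # Re-evaluate both parent and offspring (both values are noisy).
        fz, fx = fitness(z), fitness(x)
        evals += 2 * k
        if fz >= fx:
            x = z
    return x
```

For example, on a noisy OneMax with additive Gaussian noise one could call `one_plus_one_ea(lambda s: sum(s) + random.gauss(0, 1), n=20, k=5)`; with `k > 1` the median damps the effect of the noise on each comparison.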
