CMA-ES with Adaptive Reevaluation for Multiplicative Noise (2405.11471v1)

Published 19 May 2024 in cs.NE

Abstract: The covariance matrix adaptation evolution strategy (CMA-ES) is a powerful method for continuous black-box optimization. Several noise-handling methods have been proposed to preserve the optimization performance of the CMA-ES on noisy objective functions. Adaptation of the population size and adaptation of the learning rate are two major approaches that perform well under additive Gaussian noise; reevaluation, which evaluates each solution multiple times, is another. In this paper, we examine the difference between these methods from the perspective of stochastic relaxation, which considers the maximization of the expected utility function. We show that the set of maximizers of the noise-independent utility used in the reevaluation technique always contains the optimal solution, whereas the noise-dependent utility used in the population size and learning rate adaptations does not satisfy this property under multiplicative noise. Based on this analysis, we develop the reevaluation adaptation CMA-ES (RA-CMA-ES), which computes two update directions, each using half of the evaluations, and adapts the number of reevaluations based on the estimated correlation between those two directions. Numerical simulations show that the RA-CMA-ES outperforms the comparative method under multiplicative noise while maintaining competitive performance under additive noise.
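The adaptation mechanism described in the abstract can be sketched in code: evaluate each candidate twice with k-fold reevaluation, derive one rank-based update direction from each half of the evaluations, and increase the number of reevaluations when the two directions disagree. This is an illustrative toy under stated assumptions, not the authors' exact algorithm: `noisy_sphere`, the equal-weight selection scheme, and the adaptation thresholds (cosine 0.5, factors 1.5 and 1.1) are all choices made for the sketch.

```python
import numpy as np

def noisy_sphere(x, rng, noise_strength=0.5):
    # Sphere objective corrupted by multiplicative Gaussian noise (illustrative).
    return float(np.sum(x**2)) * (1.0 + noise_strength * rng.standard_normal())

def reevaluation_step(mean, sigma, k, rng, lam=8):
    """One toy iteration: two half-sample update directions and k-adaptation."""
    steps = rng.standard_normal((lam, mean.size))
    xs = mean + sigma * steps
    # Split the 2k evaluations per solution into two independent halves,
    # each averaged over k reevaluations.
    f_a = np.array([np.mean([noisy_sphere(x, rng) for _ in range(k)]) for x in xs])
    f_b = np.array([np.mean([noisy_sphere(x, rng) for _ in range(k)]) for x in xs])
    # Rank-based weights: the better half of the population shares equal weight.
    w = np.zeros(lam)
    w[: lam // 2] = 1.0 / (lam // 2)
    d_a = w @ steps[np.argsort(f_a)]  # update direction from half A
    d_b = w @ steps[np.argsort(f_b)]  # update direction from half B
    # Cosine between the two directions estimates how noise-robust the ranking is.
    corr = float(d_a @ d_b / (np.linalg.norm(d_a) * np.linalg.norm(d_b) + 1e-12))
    # Adapt the number of reevaluations: weakly aligned directions suggest the
    # ranking is noise-dominated, so evaluate each solution more often.
    k_new = max(1, round(k * 1.5)) if corr < 0.5 else max(1, round(k / 1.1))
    new_mean = mean + sigma * (d_a + d_b) / 2.0
    return new_mean, k_new, corr
```

The two half-sample directions are computed from independent evaluations of the same candidates, so their alignment reflects signal rather than shared noise; a full implementation would embed this inside the complete CMA-ES update (step-size and covariance adaptation), which is omitted here.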

Authors (3)
  1. Kento Uchida (19 papers)
  2. Kenta Nishihara (1 paper)
  3. Shinichi Shirakawa (25 papers)