
First Steps Towards a Runtime Analysis When Starting With a Good Solution (2006.12161v3)

Published 22 Jun 2020 in cs.NE

Abstract: The mathematical runtime analysis of evolutionary algorithms traditionally regards the time an algorithm needs to find a solution of a certain quality when initialized with a random population. In practical applications it may be possible to guess solutions that are better than random ones. We start a mathematical runtime analysis for such situations. We observe that different algorithms profit to very different degrees from a better initialization. We also show that the optimal parameterization of the algorithm can depend strongly on the quality of the initial solutions. To overcome this difficulty, self-adjusting and randomized heavy-tailed parameter choices can be profitable. Finally, we observe a larger gap between the performance of the best evolutionary algorithm we found and the corresponding black-box complexity. This could suggest that evolutionary algorithms better exploiting good initial solutions are still to be found. These first findings stem from analyzing the performance of the $(1+1)$ evolutionary algorithm and the static, self-adjusting, and heavy-tailed $(1 + (\lambda,\lambda))$ GA on the OneMax benchmark. We are optimistic that the question of how to profit from good initial solutions is interesting beyond these first examples.
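To make the setting concrete, the following is a minimal sketch (not the authors' code) of the $(1+1)$ evolutionary algorithm on the OneMax benchmark, where the starting point can be supplied explicitly rather than drawn uniformly at random. The function names and the choice of starting solution are illustrative assumptions.

```python
import random

def onemax(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)

def one_plus_one_ea(n, initial, max_iters=100_000, seed=0):
    """(1+1) EA with standard bit mutation (rate 1/n) and elitist selection.

    `initial` is the starting bit string, so one can experiment with
    better-than-random initializations as studied in the paper.
    Returns the number of iterations until the optimum is reached
    (or `max_iters` if it is not reached).
    """
    rng = random.Random(seed)
    parent = list(initial)
    fit = onemax(parent)
    for t in range(1, max_iters + 1):
        # Flip each bit independently with probability 1/n.
        child = [b ^ 1 if rng.random() < 1.0 / n else b for b in parent]
        cfit = onemax(child)
        if cfit >= fit:  # accept on ties (elitist selection)
            parent, fit = child, cfit
        if fit == n:
            return t
    return max_iters

# A "good" initial solution: only 5 of the n = 50 bits are wrong,
# instead of roughly n/2 for a uniformly random start.
n = 50
good_start = [1] * 45 + [0] * 5
random.Random(1).shuffle(good_start)
iters = one_plus_one_ea(n, good_start, seed=1)
print(iters)
```

Comparing the returned iteration count against a run started from a random bit string illustrates the paper's question of how much an algorithm profits from a better initialization.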

Authors (3)
  1. Denis Antipov
  2. Maxim Buzdalov
  3. Benjamin Doerr
Citations (21)
