
Expert-Driven Genetic Algorithms for Simulating Evaluation Functions (1711.06841v1)

Published 18 Nov 2017 in cs.NE, cs.LG, and stat.ML

Abstract: In this paper we demonstrate how genetic algorithms can be used to reverse engineer an evaluation function's parameters for computer chess. Our results show that using an appropriate expert (or mentor), we can evolve a program that is on par with top tournament-playing chess programs, outperforming a two-time World Computer Chess Champion. This performance gain is achieved by evolving a program that mimics the behavior of a superior expert. The resulting evaluation function of the evolved program consists of a much smaller number of parameters than the expert's. The extended experimental results provided in this paper include a report of our successful participation in the 2008 World Computer Chess Championship. In principle, our expert-driven approach could be used in a wide range of problems for which appropriate experts are available.
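
The core idea in the abstract is to treat the expert program's evaluations as a target and evolve a smaller parameter set whose evaluation function mimics them. The sketch below is not the authors' implementation; it assumes a simple linear evaluation over hand-crafted features, a synthetic expert_eval stand-in, and illustrative GA settings, just to show the expert-driven fitness (closeness to the expert's scores) driving a standard genetic algorithm.

```python
# Minimal sketch (not the paper's code) of an expert-driven GA:
# evolve a weight vector so a linear evaluation over features mimics
# an "expert" evaluator's scores on sample positions.
# expert_eval, the feature counts, and all GA settings are illustrative assumptions.

import random

random.seed(0)

NUM_FEATURES = 6          # e.g., material, mobility, king safety, ... (hypothetical)
NUM_POSITIONS = 200       # training positions scored by the expert
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 0.2

# Synthetic stand-in for positions: each position is a feature vector.
positions = [[random.uniform(-1, 1) for _ in range(NUM_FEATURES)]
             for _ in range(NUM_POSITIONS)]

# Hidden "expert" weights the GA never sees; it only sees the expert's scores.
_expert_weights = [1.0, 0.5, 0.3, 0.9, -0.4, 0.2]

def expert_eval(pos):
    """Stand-in for querying the superior expert program's evaluation."""
    return sum(w * f for w, f in zip(_expert_weights, pos))

def evolved_eval(weights, pos):
    """Evaluation of the evolved program: same linear form, evolved parameters."""
    return sum(w * f for w, f in zip(weights, pos))

def fitness(weights):
    """Negative mean absolute difference from the expert over all positions."""
    err = sum(abs(evolved_eval(weights, p) - expert_eval(p)) for p in positions)
    return -err / NUM_POSITIONS

def mutate(weights):
    return [w + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else w
            for w in weights]

def crossover(a, b):
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

# Standard generational GA with elitism and selection from the top candidates.
population = [[random.uniform(-2, 2) for _ in range(NUM_FEATURES)]
              for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    ranked = sorted(population, key=fitness, reverse=True)
    next_pop = ranked[:2]                       # elitism: keep the two best
    while len(next_pop) < POP_SIZE:
        a, b = random.sample(ranked[:10], 2)    # parents drawn from the top 10
        next_pop.append(mutate(crossover(a, b)))
    population = next_pop

best = max(population, key=fitness)
print("best evolved weights:", [round(w, 2) for w in best])
print("mean abs error vs expert:", round(-fitness(best), 4))
```

In the paper's actual setting the expert is a full chess program queried on real positions and the evolved evaluation has far more parameters, but the fitness structure, agreement with the expert's evaluations, is the same.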

Authors (3)
  1. Eli David (32 papers)
  2. Moshe Koppel (16 papers)
  3. Nathan S. Netanyahu (30 papers)
Citations (13)
