Generating Diverse and Competitive Play-Styles for Strategy Games (2104.08641v2)
Abstract: Designing agents that can achieve different play-styles while maintaining a competitive level of play is a difficult task, especially in games where the research community has not yet reached super-human performance, such as strategy games. These games require the AI to deal with large action spaces, long-term planning and partial observability, among other well-known factors that make decision-making hard. On top of this, achieving distinct play-styles with a general algorithm without reducing playing strength is not trivial. In this paper, we propose Portfolio Monte Carlo Tree Search with Progressive Unpruning for playing a turn-based strategy game (Tribes) and show how it can be parameterized so that a quality-diversity algorithm (MAP-Elites) can be used to achieve different play-styles while keeping a competitive level of play. Our results show that this algorithm achieves these goals even on an extensive collection of game levels beyond those used for training.
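To make the abstract's pairing of a parameterized agent with a quality-diversity search more concrete, below is a minimal sketch of the generic MAP-Elites loop. It is not the paper's implementation: the parameter vector size, the two behaviour descriptors and the `evaluate` stand-in are illustrative assumptions; in the paper, fitness would correspond to the agent's playing strength in Tribes and the descriptors to play-style statistics, with one elite kept per cell of the behaviour grid.

```python
import random

# Assumed placeholders: grid resolution, descriptor count and parameter
# vector size are illustrative, not taken from the paper.
GRID = 10          # cells per behaviour dimension
DIMS = 2           # number of behaviour descriptors
N_PARAMS = 4       # size of the agent's parameter vector (assumed)

def evaluate(params):
    """Stand-in for playing Tribes games with a parameterized Portfolio
    MCTS agent. Returns (fitness, behaviour descriptors), both in [0, 1].
    A real run would replace this with game simulations."""
    fitness = 1.0 - sum((p - 0.5) ** 2 for p in params) / N_PARAMS
    behaviour = (params[0], params[1])  # assumed play-style features
    return fitness, behaviour

def to_cell(behaviour):
    """Discretize continuous descriptors into an archive cell index."""
    return tuple(min(GRID - 1, int(b * GRID)) for b in behaviour)

def mutate(params, sigma=0.1):
    """Gaussian perturbation of the parameter vector, clipped to [0, 1]."""
    return [min(1.0, max(0.0, p + random.gauss(0.0, sigma))) for p in params]

def map_elites(iterations=5000):
    archive = {}  # cell -> (fitness, params): one elite per play-style cell
    for _ in range(iterations):
        if archive and random.random() > 0.1:
            # Select a random elite from the archive and mutate it.
            _, parent = random.choice(list(archive.values()))
            candidate = mutate(parent)
        else:
            # Occasionally sample a fresh random parameterization.
            candidate = [random.random() for _ in range(N_PARAMS)]
        fitness, behaviour = evaluate(candidate)
        cell = to_cell(behaviour)
        # Keep the candidate only if it beats the current elite of its cell.
        if cell not in archive or fitness > archive[cell][0]:
            archive[cell] = (fitness, candidate)
    return archive

if __name__ == "__main__":
    elites = map_elites()
    print(f"{len(elites)} distinct play-style cells filled")
```

The resulting archive is what gives diversity without sacrificing strength: each cell corresponds to a distinct play-style region, and within that region only the strongest parameterization found is retained.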
- Diego Perez-Liebana
- Cristina Guerrero-Romero
- Alexander Dockhorn
- Linjie Xu
- Jorge Hurtado
- Dominik Jeurissen