Parallel Exploration via Negatively Correlated Search (1910.07151v2)

Published 16 Oct 2019 in cs.NE, cs.AI, and cs.LG

Abstract: Effective exploration is key to successful search. The recently proposed Negatively Correlated Search (NCS) tries to achieve this through parallel exploration, in which a set of search processes are driven to be negatively correlated so that different promising areas of the search space can be visited simultaneously. Various applications have verified the advantages of such novel search behaviors. Nevertheless, a mathematical understanding is still lacking, as the previous NCS was mostly devised by intuition. In this paper, a more principled NCS is presented, showing that parallel exploration is equivalent to explicitly maximizing both the population diversity and the population solution quality, and can be optimally obtained by partially gradient-descending both models with respect to each search process. For empirical assessment, reinforcement learning tasks, which strongly demand exploration ability, are considered. The new NCS is applied to popular reinforcement learning problems, i.e., playing Atari games, to directly train a deep convolutional network with 1.7 million connection weights in environments with uncertain and delayed rewards. Empirical results show that the significant advantages of NCS over the compared state-of-the-art methods can be largely attributed to its effective parallel exploration ability.
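
The sketch below illustrates the general idea described in the abstract, not the authors' exact algorithm: several search processes are updated in parallel, each ascending an estimated quality gradient plus a diversity term that pushes its search distribution away from the others. The Gaussian search distributions, the antithetic evolution-strategies gradient estimator, the mean-distance diversity term, and all parameter names (sigma, lam, lr) are assumptions made for illustration.

```python
# Minimal sketch of parallel exploration in the spirit of NCS (an assumption,
# not the paper's exact method). Each of the n_proc search processes is an
# isotropic Gaussian over parameter space; f() is a black-box quality score
# (e.g., episodic return); diversity is approximated by the distance between
# a process's mean and the means of the other processes.
import numpy as np

def ncs_sketch(f, dim=10, n_proc=5, sigma=0.1, lam=0.5, lr=0.05, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    means = rng.normal(size=(n_proc, dim))  # one search distribution per process
    for _ in range(iters):
        for i in range(n_proc):
            # Estimate the quality gradient with antithetic Gaussian perturbations
            # (a standard evolution-strategies estimator, used here as a stand-in).
            eps = rng.normal(size=(8, dim))
            scores = np.array([f(means[i] + sigma * e) - f(means[i] - sigma * e)
                               for e in eps])
            grad_quality = (scores[:, None] * eps).mean(axis=0) / (2.0 * sigma)
            # Diversity gradient: push this process away from the other means,
            # so the processes remain negatively correlated in where they search.
            others = np.delete(means, i, axis=0)
            grad_diversity = (means[i] - others).mean(axis=0)
            # Partially ascend both the quality and diversity models per process.
            means[i] += lr * (grad_quality + lam * grad_diversity)
    best = max(range(n_proc), key=lambda i: float(f(means[i])))
    return means[best]

# Toy usage: a multimodal objective where parallel exploration helps.
if __name__ == "__main__":
    f = lambda x: -np.sum(x**2) + 2.0 * np.cos(3.0 * x).sum()
    print(ncs_sketch(f)[:3])
```

In this reading, the trade-off coefficient lam plays the role of balancing solution quality against population diversity, which is the core of the parallel-exploration argument in the abstract.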

Authors (4)
  1. Peng Yang (136 papers)
  2. Qi Yang (112 papers)
  3. Ke Tang (108 papers)
  4. Xin Yao (139 papers)
Citations (12)
