Neuroevolution in Deep Learning: The Role of Neutrality

(arXiv:2102.08475)
Published Feb 16, 2021 in cs.NE

Abstract

A variety of methods have been applied to the architectural configuration and learning or training of artificial deep neural networks (DNNs). These methods play a crucial role in the success or failure of a DNN for most problems and applications. Evolutionary Algorithms (EAs) are gaining momentum as a computationally feasible method for the automated optimisation of DNNs, and neuroevolution is the term describing this automated configuration and training of DNNs using EAs. However, the automatic design and/or training of these modern neural networks through evolutionary algorithms is computationally expensive. Kimura's neutral theory of molecular evolution states that the majority of evolutionary changes at the molecular level are the result of random fixation of selectively neutral mutations. A mutation from one gene to another is neutral if it does not affect the phenotype. This work discusses how neutrality, given certain conditions, can help to speed up the training/design of deep neural networks.
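The abstract's core notion, that a mutation is neutral when it leaves the phenotype unchanged, can be illustrated with a small sketch. This is a hypothetical toy model (the genotype encoding, the `phenotype` mapping, and the idea that only the first few genes are expressed are all assumptions for illustration, not the paper's method): mutating an unexpressed gene changes the genotype but not the phenotype, so the mutation is neutral.

```python
def phenotype(genotype):
    """Hypothetical genotype-to-phenotype mapping: only the first
    `expressed` genes shape the phenotype; the remaining genes are
    unexpressed, so mutations to them cannot alter the phenotype."""
    expressed = 3
    return tuple(genotype[:expressed])

def mutate(genotype, index, new_gene):
    """Return a copy of the genotype with one gene replaced."""
    child = list(genotype)
    child[index] = new_gene
    return tuple(child)

def is_neutral(parent, child):
    """A mutation is neutral if parent and child map to the same phenotype."""
    return phenotype(parent) == phenotype(child)

parent = (1, 0, 1, 0, 0)
neutral_child = mutate(parent, 4, 1)   # mutates an unexpressed gene
adaptive_child = mutate(parent, 0, 0)  # mutates an expressed gene

print(is_neutral(parent, neutral_child))   # True: phenotype unchanged
print(is_neutral(parent, adaptive_child))  # False: phenotype changed
```

In a neuroevolution setting, such neutral moves let a population drift across plateaus of equal fitness without paying the cost of re-evaluating a changed network, which is one of the conditions under which the paper argues neutrality can speed up search.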
