Hierarchical Learning Algorithm for the Beta Basis Function Neural Network

(1210.8124)
Published Oct 30, 2012 in cs.NE and cs.AI

Abstract

The paper presents a two-level learning method for the design of the Beta Basis Function Neural Network (BBFNN). A genetic algorithm is employed at the upper level to construct the BBFNN, while the key learning parameters, namely the widths, the centers, and the Beta form parameters, are optimised with a gradient algorithm at the lower level. To demonstrate the effectiveness of this hierarchical learning algorithm (HLABBFNN), the algorithm is validated on the approximation of non-linear functions.
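
The abstract does not give the Beta function itself, so the sketch below is only a rough illustration, not the paper's code. It uses one common parameterisation of a one-dimensional Beta basis function (the support endpoints x0, x1 and shape parameters p, q are assumed names) and a forward pass that sums weighted Beta responses; this is the kind of model the two-level algorithm would tune, with the genetic algorithm selecting the units at the upper level and the gradient step adjusting centres, widths, and Beta form parameters at the lower level.

```python
import numpy as np

# Minimal sketch (assumed parameterisation, not taken from the paper).

def beta_basis(x, x0, x1, p, q):
    """Beta basis function with support [x0, x1] and shape parameters p, q.

    The centre is c = (p * x1 + q * x0) / (p + q); the function equals 1
    at the centre and is 0 outside the support.
    """
    c = (p * x1 + q * x0) / (p + q)
    out = np.zeros_like(x, dtype=float)
    inside = (x > x0) & (x < x1)
    xi = x[inside]
    out[inside] = ((xi - x0) / (c - x0)) ** p * ((x1 - xi) / (x1 - c)) ** q
    return out


def bbfnn_forward(x, params, weights):
    """BBFNN output: a weighted sum of the Beta basis responses."""
    phi = np.stack([beta_basis(x, *p) for p in params], axis=-1)
    return phi @ weights


# Example: evaluate a small network of three fixed Beta units.
x = np.linspace(0.0, 1.0, 200)
params = [(0.0, 0.6, 2.0, 2.0), (0.2, 0.8, 3.0, 2.0), (0.4, 1.0, 2.0, 3.0)]
weights = np.array([0.5, -0.2, 0.8])
y = bbfnn_forward(x, params, weights)
```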
