Evolving Arbitrarily Connected Feedforward Neural Networks via Genetic Algorithms

Puma-Villanueva, W.J.; Von Zuben, F.J.

Several approaches have already been proposed in the literature to evolve neural network topologies for a wide range of machine learning tasks. This paper presents an alternative one, capable of evolving arbitrarily connected feedforward neural networks (ACFNNs) composed of linear and nonlinear neurons. A genetic algorithm is devised to adjust the topology and also to perform variable selection. The weights of the resulting networks, with arbitrary topologies, are adjusted by a simple gradient descent algorithm. The purpose is to obtain high-quality and parsimonious predictors for two real-world and one synthetic time series. The results are compared with those produced by traditional MLP models and Mixtures of Heterogeneous Experts (MHEs).
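The abstract only outlines the method at a high level. As a rough illustration of the general idea, and not the authors' actual algorithm, the sketch below evolves connection masks over a fixed node ordering with a genetic algorithm, trains the weights of each candidate topology by plain gradient descent, and penalizes the number of connections to favor parsimony. All names and parameter values here (N_IN, N_HID, population size, mutation rate, parsimony penalty, the toy sine series) are illustrative assumptions, not details taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical sizes: N_IN lagged inputs, N_HID candidate hidden nodes, one output.
    N_IN, N_HID = 3, 4
    N_NODES = N_IN + N_HID + 1          # node order: inputs, hidden nodes, output (last)

    def random_genome(p=0.5):
        # Genome = upper-triangular connection mask (any earlier node may feed any later
        # node, so skip connections are allowed) plus an activation gene per node.
        mask = np.triu(rng.random((N_NODES, N_NODES)) < p, k=1)
        mask[:, :N_IN] = False                       # nothing feeds into an input node
        acts = rng.integers(0, 2, N_NODES)           # 0 = linear neuron, 1 = tanh neuron
        return mask, acts

    def forward(x, mask, acts, W, b):
        # Propagate through the DAG in node order; returns every node activation.
        a = np.zeros(N_NODES)
        a[:N_IN] = x
        for j in range(N_IN, N_NODES):
            net = b[j] + np.dot(W[:, j] * mask[:, j], a)
            a[j] = np.tanh(net) if acts[j] else net
        return a

    def train_weights(mask, acts, X, y, lr=0.05, epochs=40):
        # Plain stochastic gradient descent on the weights of a fixed topology.
        W = rng.normal(0.0, 0.3, (N_NODES, N_NODES)) * mask
        b = np.zeros(N_NODES)
        for _ in range(epochs):
            for x, t in zip(X, y):
                a = forward(x, mask, acts, W, b)
                delta = np.zeros(N_NODES)
                delta[-1] = a[-1] - t                        # dE/d(output) for E = 0.5*err^2
                for j in range(N_NODES - 1, N_IN - 1, -1):   # back-propagate through the DAG
                    if acts[j]:
                        delta[j] *= 1.0 - a[j] ** 2          # tanh derivative
                    delta[:j] += delta[j] * W[:j, j] * mask[:j, j]
                    W[:, j] -= lr * delta[j] * a * mask[:, j]
                    b[j] -= lr * delta[j]
        return W, b

    def fitness(genome, X, y, penalty=0.01):
        # Training error plus a parsimony penalty on the connection count;
        # a real study would score on a separate validation set.
        mask, acts = genome
        W, b = train_weights(mask, acts, X, y)
        preds = np.array([forward(x, mask, acts, W, b)[-1] for x in X])
        return np.mean((preds - y) ** 2) + penalty * mask.sum()

    def crossover(g1, g2):
        # Uniform crossover; invalid positions are False in both parents, so validity is preserved.
        m = rng.random(g1[0].shape) < 0.5
        mask = np.where(m, g1[0], g2[0])
        acts = np.where(rng.random(N_NODES) < 0.5, g1[1], g2[1])
        return mask, acts

    def mutate(genome, p_flip=0.05):
        # Flip a few connections and occasionally toggle a neuron between linear and nonlinear.
        mask, acts = genome[0].copy(), genome[1].copy()
        flips = np.triu(rng.random(mask.shape) < p_flip, k=1)
        flips[:, :N_IN] = False
        mask ^= flips
        if rng.random() < 0.2:
            acts[rng.integers(N_IN, N_NODES)] ^= 1
        return mask, acts

    def evolve(X, y, pop_size=12, gens=8):
        pop = [random_genome() for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=lambda g: fitness(g, X, y))
            survivors = pop[: pop_size // 2]                 # truncation selection with elitism
            children = []
            while len(survivors) + len(children) < pop_size:
                i, j = rng.choice(len(survivors), 2, replace=False)
                children.append(mutate(crossover(survivors[i], survivors[j])))
            pop = survivors + children
        return min(pop, key=lambda g: fitness(g, X, y))

    if __name__ == "__main__":
        # Toy synthetic series: predict the next value of a noisy sine from N_IN lags.
        series = np.sin(np.linspace(0, 8 * np.pi, 120)) + 0.05 * rng.normal(size=120)
        X = np.array([series[i:i + N_IN] for i in range(len(series) - N_IN)])
        y = series[N_IN:]
        best_mask, best_acts = evolve(X, y)
        print("connections:", int(best_mask.sum()),
              "nonlinear neurons:", int(best_acts[N_IN:].sum()))

A validation split, explicit variable-selection genes for the inputs, and the comparison against MLP and MHE baselines reported in the paper are omitted from this sketch for brevity.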

http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5715225

