Can differential evolution be an efficient engine to optimize neural networks?
Marco Baioletti, Gabriele Di Bari, Valentina Poggioni, Mirco Tracolli
2018
Abstract
In this paper we present an algorithm that optimizes artificial neural networks using Differential Evolution. The evolutionary algorithm is applied according to the conventional neuroevolution approach, i.e., it evolves the network weights instead of relying on backpropagation or other backpropagation-based optimization methods. A batch system, similar to the one used in stochastic gradient descent, is adopted to reduce the computation time. Preliminary experimental results are very encouraging: we obtained good performance even on real classification datasets such as MNIST, which are usually considered prohibitive for this kind of approach.
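The approach described in the abstract — evolving a flattened weight vector with Differential Evolution, with fitness computed on mini-batches as in stochastic gradient descent — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the network size, the DE/rand/1/bin variant, and the parameter values (population size, F, CR, batch size) are all assumptions chosen for compactness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: one hidden layer, weights flattened into a vector.
N_IN, N_HID, N_OUT = 4, 8, 3
DIM = N_IN * N_HID + N_HID * N_OUT  # total number of evolved weights

def forward(weights, X):
    """Run the network on a batch X given a flat weight vector."""
    W1 = weights[:N_IN * N_HID].reshape(N_IN, N_HID)
    W2 = weights[N_IN * N_HID:].reshape(N_HID, N_OUT)
    logits = np.tanh(X @ W1) @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)  # softmax probabilities

def fitness(weights, X, y):
    """Cross-entropy loss on a mini-batch (lower is better)."""
    p = forward(weights, X)
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

def de_train(X, y, pop_size=20, F=0.5, CR=0.9, gens=100, batch_size=32):
    """DE/rand/1/bin over the weight vector, evaluated on fresh mini-batches."""
    pop = rng.normal(0.0, 0.5, size=(pop_size, DIM))
    for _ in range(gens):
        # Draw a fresh mini-batch each generation, as in SGD.
        idx = rng.choice(len(X), size=min(batch_size, len(X)), replace=False)
        Xb, yb = X[idx], y[idx]
        fit = np.array([fitness(ind, Xb, yb) for ind in pop])
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = a + F * (b - c)                    # differential mutation
            cross = rng.random(DIM) < CR                # binomial crossover
            cross[rng.integers(DIM)] = True             # force one crossed gene
            trial = np.where(cross, mutant, pop[i])
            if fitness(trial, Xb, yb) <= fit[i]:        # greedy selection
                pop[i] = trial
    final_fit = [fitness(ind, X, y) for ind in pop]
    return pop[int(np.argmin(final_fit))]
```

The mini-batch evaluation is the key cost-saving step: each generation scores the population on a small random subset of the data rather than the full training set, trading exact fitness values for much cheaper evaluations.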