Publications:Training neural networks by stochastic optimisation


Do not edit this section

Keep all hand-made modifications below

Title Training neural networks by stochastic optimisation
Author Antanas Verikas and Adas Gelzinis
Year 2000
PublicationType Journal Paper
Journal Neurocomputing
Diva url
Abstract We present a stochastic learning algorithm for neural networks. The algorithm does not make any assumptions about the transfer functions of individual neurons and does not depend on a functional form of the performance measure. The algorithm uses a random step of varying size to adapt the weights. The average size of the step decreases during learning. The large steps enable the algorithm to jump over local maxima/minima, while the small ones ensure convergence in a local area. We investigate the convergence properties of the proposed algorithm and test it on four supervised and unsupervised learning problems. We found the algorithm to be superior to several known algorithms when tested on both generated and real data.
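
The abstract's core idea (random weight perturbations whose average size decays during learning, so early large steps can escape local optima and late small steps refine the solution) can be illustrated with a minimal sketch. This is a hypothetical toy implementation of that general scheme, not the authors' published algorithm; the loss function, decay schedule, and accept-if-improved rule are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w, X, y):
    # Mean squared error of a one-layer network with tanh activation
    # (an assumed performance measure; the algorithm itself does not
    # depend on its functional form).
    return np.mean((np.tanh(X @ w) - y) ** 2)

def random_step_train(X, y, n_iters=2000, step0=1.0, decay=0.999):
    """Random-perturbation training with a decaying average step size.

    Propose a random weight perturbation and keep it if the loss
    improves; the step size shrinks over time, so early large steps
    can jump over local optima while late small steps ensure
    convergence in a local area.
    """
    w = rng.normal(size=X.shape[1])
    best = loss(w, X, y)
    step = step0
    for _ in range(n_iters):
        candidate = w + step * rng.normal(size=w.shape)  # random step
        c_loss = loss(candidate, X, y)
        if c_loss < best:  # accept only improving steps
            w, best = candidate, c_loss
        step *= decay      # average step size decreases during learning
    return w, best

# Toy data: targets generated by a known weight vector.
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = np.tanh(X @ w_true)

w_hat, final_loss = random_step_train(X, y)
```

Because only improving steps are accepted, the loss is non-increasing over iterations; the decay schedule plays the role of the shrinking average step size described in the abstract.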