BATCHgd.MLPnet {AMORE}    R Documentation

Batch gradient descent training

Description

Modifies the neural network weights and biases by applying batch gradient descent to the training set.

Usage

BATCHgd.MLPnet(net, P, T, n.epochs, n.threads = 0L)

Arguments

net

Neural Network to train.

P

Input data set.

T

Target output data set.

n.epochs

Number of training epochs.

n.threads

Number of threads to spawn. If the value is less than 1, NumberProcessors-1 threads are spawned, where NumberProcessors is the number of available processors. If OpenMP support is not available, this argument is ignored.

Value

This function returns a neural network object whose weights and biases have been updated according to the supplied training data.

Author(s)

Manuel Castejón Limas. manuel.castejon@gmail.com
Joaquin Ordieres Meré. j.ordieres@upm.es
Ana González Marcos. ana.gonzalez@unirioja.es
Alpha V. Pernía Espinoza. alpha.pernia@unirioja.es
Francisco Javier Martinez de Pisón. fjmartin@unirioja.es
Fernando Alba Elías. fernando.alba@unavarra.es

References

Simon Haykin. Neural Networks: A Comprehensive Foundation. Prentice Hall, New Jersey, 2nd edition, 1999. ISBN 0-13-273350-1.

See Also

newff, train, BATCHgdwm.MLPnet
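
Examples

A minimal sketch, not taken from the package sources: it builds a small network with newff configured for the "BATCHgd" method and fits a toy quadratic mapping. The topology, learning rate, number of epochs and the call to sim for prediction are illustrative assumptions; in ordinary use the train function listed above is the usual entry point.

library(AMORE)

## Toy training set: learn the quadratic mapping T = P^2 on [-1, 1]
P <- matrix(seq(-1, 1, length.out = 200), ncol = 1)  # input data set
T <- P^2                                             # target output data set

## A 1-3-1 multilayer perceptron configured for batch gradient descent
net <- newff(n.neurons = c(1, 3, 1),
             learning.rate.global = 1e-2,
             momentum.global = 0.5,
             error.criterium = "LMS",
             Stao = NA,
             hidden.layer = "tansig",
             output.layer = "purelin",
             method = "BATCHgd")

## Run 1000 epochs of batch gradient descent on a single thread
net <- BATCHgd.MLPnet(net, P, T, n.epochs = 1000, n.threads = 1L)

## Predictions of the trained network
fit <- sim(net, P)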


[Package AMORE version 0.2-15]