# newff

Posted 2011.11.07 17:51

## Purpose

Create feed-forward backpropagation network

## Syntax

• net = newff(P,T,[S1 S2...S(N-1)],{TF1 TF2...TFN},
BTF,BLF,PF,IPF,OPF,DDF)


## Description

newff(P,T,[S1 S2...S(N-1)],{TF1 TF2...TFN},BTF,BLF,PF,IPF,OPF,DDF) takes the following arguments:

| Argument | Description |
| --- | --- |
| P | R x Q1 matrix of Q1 sample R-element input vectors |
| T | SN x Q2 matrix of Q2 sample SN-element target vectors |
| Si | Size of ith layer, for N-1 layers, default = []. (Output layer size SN is determined from T.) |
| TFi | Transfer function of ith layer. (Default = 'tansig' for hidden layers and 'purelin' for the output layer.) |
| BTF | Backpropagation network training function (default = 'trainlm') |
| BLF | Backpropagation weight/bias learning function (default = 'learngdm') |
| PF | Performance function (default = 'mse') |
| IPF | Row cell array of input processing functions (default = {'fixunknowns','removeconstantrows','mapminmax'}) |
| OPF | Row cell array of output processing functions (default = {'removeconstantrows','mapminmax'}) |
| DDF | Data division function (default = 'dividerand') |

and returns an N-layer feed-forward backpropagation network.

The transfer functions TFi can be any differentiable transfer function such as tansig, logsig, or purelin.
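These transfer functions have simple closed forms. As an illustrative sketch in Python (not the toolbox implementation), the standard definitions are:

```python
import math

def tansig(n):
    # Hyperbolic tangent sigmoid: 2/(1 + exp(-2n)) - 1, equivalent to tanh(n);
    # squashes input into (-1, 1)
    return 2.0 / (1.0 + math.exp(-2.0 * n)) - 1.0

def logsig(n):
    # Log-sigmoid: 1/(1 + exp(-n)); squashes input into (0, 1)
    return 1.0 / (1.0 + math.exp(-n))

def purelin(n):
    # Linear transfer function: the identity, typically used for output layers
    return n
```

All three are differentiable everywhere, which is what backpropagation requires.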

The training function BTF can be any of the backpropagation training functions such as trainlm, trainbfg, trainrp, traingd, etc.

**Caution:** trainlm is the default training function because it is very fast, but it requires a lot of memory to run. If you get an out-of-memory error when training, try one of these:

• Slow trainlm training, but reduce its memory requirements, by setting net.trainParam.mem_reduc to 2 or more. (See trainlm.)
• Use trainbfg, which is slower but more memory efficient than trainlm.
• Use trainrp, which is slower but more memory efficient than trainbfg.

The learning function BLF can be either of the backpropagation learning functions learngd or learngdm.

The performance function can be any of the differentiable performance functions such as mse or msereg.
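To illustrate, mse is simply the mean of the squared errors between targets and network outputs; a minimal Python sketch (not the toolbox code):

```python
def mse(targets, outputs):
    # Mean squared error over all target/output pairs
    errors = [t - y for t, y in zip(targets, outputs)]
    return sum(e * e for e in errors) / len(errors)
```

Training drives this value toward zero by adjusting weights and biases along its gradient.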

## Examples

Here is a problem consisting of inputs P and targets T to be solved with a network.

• P = [0 1 2 3 4 5 6 7 8 9 10];
T = [0 1 2 3 4 3 2 1 2 3 4];


Here a network is created with one hidden layer of five neurons.

• net = newff(P,T,5);


The network is simulated and its output plotted against the targets.

• Y = sim(net,P);
plot(P,T,P,Y,'o')


The network is trained for 50 epochs. Again the network's output is plotted.

• net.trainParam.epochs = 50;
net = train(net,P,T);
Y = sim(net,P);
plot(P,T,P,Y,'o')


## Algorithm

Feed-forward networks consist of N layers using the dotprod weight function, the netsum net input function, and the specified transfer functions.

The first layer has weights coming from the input. Each subsequent layer has a weight coming from the previous layer. All layers have biases. The last layer is the network output.
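The layer-by-layer computation described above (dotprod of weights and inputs, bias added via netsum, then the transfer function) can be sketched in Python. The 1-2-1 network below, with a tansig hidden layer and purelin output, uses made-up weights and biases purely for illustration:

```python
import math

def forward(x, layers):
    """Propagate an input vector through a feed-forward network.

    layers is a list of (W, b, transfer) tuples: W is a list of weight
    rows (dotprod with the previous layer's output), b is the bias list
    added by netsum, and transfer is the layer's transfer function.
    """
    a = x
    for W, b, transfer in layers:
        # Net input: dot product of each weight row with the layer input, plus bias
        n = [sum(w_ij * a_j for w_ij, a_j in zip(row, a)) + b_i
             for row, b_i in zip(W, b)]
        # Layer output: transfer function applied elementwise
        a = [transfer(v) for v in n]
    return a

# Hypothetical 1-2-1 network: tansig hidden layer, purelin (identity) output
layers = [
    ([[0.5], [-0.3]], [0.1, 0.2], math.tanh),
    ([[1.0, 1.0]], [0.0], lambda v: v),
]
y = forward([2.0], layers)
```

The final layer's output `a` is the network output; stacking more (W, b, transfer) tuples gives deeper networks with the same loop.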

Each layer's weights and biases are initialized with initnw.

Adaptation is done with trains, which updates weights with the specified learning function. Training is done with the specified training function. Performance is measured according to the specified performance function.

## See Also

newcf, newelm, sim, init, adapt, train, trains