nnt2elm
Update NNT 2.0 Elman backpropagation network
Syntax
net = nnt2elm(PR,W1,B1,W2,B2,BTF,BLF,PF)
Description
nnt2elm(PR,W1,B1,W2,B2,BTF,BLF,PF) takes these arguments,

  PR  - R x 2 matrix of min and max values for R input elements.
  W1  - S1 x (R+S1) weight matrix.
  B1  - S1 x 1 bias vector.
  W2  - S2 x S1 weight matrix.
  B2  - S2 x 1 bias vector.
  BTF - Backpropagation network training function, default = 'traingdx'.
  BLF - Backpropagation weight/bias learning function, default = 'learngdm'.
  PF  - Performance function, default = 'mse'.
and returns an Elman network.
The training function BTF can be any of the backpropagation training functions such as traingd, traingdm, traingda, and traingdx. Large step-size algorithms, such as trainlm, are not recommended for Elman networks.
The learning function BLF can be either of the backpropagation learning functions, learngd or learngdm.
The performance function PF can be any of the differentiable performance functions such as mse or msereg.
Once a network has been updated, it can be simulated, initialized, adapted, or trained with sim, init, adapt, and train.
See Also
nnt2c, nnt2ff