initnw
Nguyen-Widrow layer initialization function
Syntax
net = initnw(net,i)
Description
initnw is a layer initialization function that initializes a layer's weights and biases according to the Nguyen-Widrow initialization algorithm. This algorithm chooses values in order to distribute the active region of each neuron in the layer approximately evenly across the layer's input space.
initnw(net,i) takes two arguments,
net - Neural network
i - Index of a layer
and returns the network with layer i's weights and biases updated.
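For example, a minimal sketch of a direct call (the input ranges, layer sizes, and transfer functions below are illustrative, not part of this reference page):
    net = newff([-1 1; -1 1], [4 1], {'tansig', 'purelin'});  % two inputs, 4 hidden neurons, 1 output
    net = initnw(net, 1);                                      % recompute layer 1's weights and biases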
Network Use
You can create a standard network that uses initnw by calling newff or newcf.
To prepare a custom network to be initialized with initnw:
1. Set net.initFcn to 'initlay'. (This will set net.initParam to the empty matrix [ ], because initlay has no initialization parameters.)
2. Set net.layers{i}.initFcn to 'initnw'.
To initialize the network, call init. See newff and newcf for training examples.
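A minimal sketch of these steps, assuming a custom two-layer network net has already been constructed:
    net.initFcn = 'initlay';            % step 1: initialize the network layer by layer
    net.layers{1}.initFcn = 'initnw';   % step 2: Nguyen-Widrow for layer 1
    net.layers{2}.initFcn = 'initnw';   %         ...and for layer 2
    net = init(net);                    % init now applies initnw to each layer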
Algorithm
The Nguyen-Widrow method generates initial weight and bias values for a layer, so that the active regions of the layer's neurons will be distributed approximately evenly over the input space.
Advantages over purely random weights and biases are:
- Few neurons are wasted (because all the neurons are in the input space).
- Training works faster (because each area of the input space has neurons).
The Nguyen-Widrow method can only be applied to layers
- with a bias
- with weights whose weightFcn is dotprod
- with netInputFcn set to netsum
If these conditions are not met, then initnw uses rands to initialize the layer's weights and biases.
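A simplified sketch of the computation for one layer, assuming S neurons, R inputs, and inputs normalized to the range [-1, 1]; the toolbox implementation also rescales for other input ranges and for the transfer function's active region, which is omitted here. The values of S and R are illustrative.
    S = 4;  R = 2;                                        % illustrative layer dimensions
    beta = 0.7 * S^(1/R);                                 % Nguyen-Widrow scale factor
    W = rands(S, R);                                      % random weights in [-1, 1]
    W = beta * W ./ (sqrt(sum(W.^2, 2)) * ones(1, R));    % rescale each row to norm beta
    b = beta * linspace(-1, 1, S)' .* sign(W(:, 1));      % spread biases evenly over [-beta, beta]
Spreading the bias values in this way places each neuron's active region at a different position in the input space, which is what distributes the regions approximately evenly.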
See Also
initlay, initwb