newsom
Syntax
net = newsom
net = newsom(PR,[D1,D2,...],TFCN,DFCN,OLR,OSTEPS,TLR,TND)
Description
Self-organizing maps are used to solve classification problems.
net = newsom
creates a new network with a dialog box.
net = newsom(PR,[D1,D2,...],TFCN,DFCN,OLR,OSTEPS,TLR,TND)
takes,
PR - R x 2 matrix of min and max values for R input elements.
Di - Size of ith layer dimension, defaults = [5 8].
TFCN - Topology function, default = 'hextop'.
DFCN - Distance function, default = 'linkdist'.
OLR - Ordering phase learning rate, default = 0.9.
OSTEPS - Ordering phase steps, default = 1000.
TLR - Tuning phase learning rate, default = 0.02.
TND - Tuning phase neighborhood distance, default = 1.
and returns a new self-organizing map.
The topology function TFCN can be hextop, gridtop, or randtop. The distance function DFCN can be linkdist, dist, or mandist.
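For example, here is a minimal sketch of a call that overrides the default topology and distance functions; the input ranges and map dimensions are illustrative values, not taken from this page's example.
% Two inputs ranging over [0 1] and [0 2], a 4-by-6 map,
% grid topology, and Euclidean distance (illustrative values)
net = newsom([0 1; 0 2],[4 6],'gridtop','dist');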
Properties
Self-organizing maps (SOMs) consist of a single layer with the negdist weight function, the netsum net input function, and the compet transfer function.
The layer has a weight from the input, but no bias. The weight is initialized with midpoint.
Adaptation and training are done with trains and trainr, which both update the weight with learnsom.
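As a rough check, and assuming the standard network object fields of this toolbox, these functions can be read back from a newly created map; the expected values are shown as comments.
net = newsom([0 2; 0 1],[3 5]);
net.inputWeights{1,1}.weightFcn   % 'negdist'
net.layers{1}.netInputFcn         % 'netsum'
net.layers{1}.transferFcn         % 'compet'
net.inputWeights{1,1}.initFcn     % 'midpoint'
net.inputWeights{1,1}.learnFcn    % 'learnsom'
net.adaptFcn                      % 'trains'
net.trainFcn                      % 'trainr'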
Examples
The input vectors defined below are distributed over a two-dimensional input space varying over [0 2] and [0 1]. This data will be used to train a SOM with dimensions [3 5].
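A minimal setup consistent with that description is sketched here; the choice of 400 random vectors is an assumption for illustration.
P = [rand(1,400)*2; rand(1,400)];   % random vectors over [0 2] x [0 1]
net = newsom([0 2; 0 1],[3 5]);     % 3-by-5 map covering the same ranges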
Here the SOM is trained and the input vectors are plotted with the map that the SOM's weights have formed.
net = train(net,P);
plot(P(1,:),P(2,:),'.g','markersize',20)
hold on
plotsom(net.iw{1,1},net.layers{1}.distances)
hold off
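Once trained, the map can be simulated like any other network. As a sketch (the test vector here is arbitrary), sim returns an output with a single 1 at the winning neuron, indicating how the vector is classified.
p = [1; 0.5];      % arbitrary test point inside the input ranges
a = sim(net,p)     % output has a 1 at the winning neuron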
See Also
newrbe, nncopy