learnsomb

Batch self-organizing map weight learning function

Syntax

[dW,LS] = learnsomb(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)
info = learnsomb('code')

Description

learnsomb is the batch self-organizing map weight learning function.

[dW,LS] = learnsomb(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several inputs:

W - S-by-R weight matrix (or S-by-1 bias vector)
P - R-by-Q input vectors (or ones(1,Q))
Z - S-by-Q weighted input vectors
N - S-by-Q net input vectors
A - S-by-Q output vectors
T - S-by-Q layer target vectors
E - S-by-Q layer error vectors
gW - S-by-R gradient with respect to performance
gA - S-by-Q output gradient with respect to performance
D - S-by-S neuron distances
LP - Learning parameters (see below)
LS - Learning state, initially should be []

and returns the following:

dW - S-by-R weight (or bias) change matrix
LS - New learning state

Learning occurs according to learnsomb's learning parameters, shown here with their default values:

LP.init_neighborhood (default 3) - Initial neighborhood size
LP.steps (default 100) - Ordering-phase steps
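As a sketch, these defaults can be overridden before training; this assumes a SOM created with selforgmap, whose input weight uses learnsomb and exposes its parameters through the standard learnParam property:

```matlab
% Sketch: override learnsomb's default parameters on a selforgmap network.
net = selforgmap([2 3]);                 % 2-by-3 hexagonal SOM layer
lp = net.inputWeights{1,1}.learnParam;   % struct with init_neighborhood, steps
lp.init_neighborhood = 4;                % widen the initial neighborhood
lp.steps = 200;                          % lengthen the ordering phase
net.inputWeights{1,1}.learnParam = lp;
```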

info = learnsomb('code') returns useful information for each code string:

'pnames' - Returns names of learning parameters
'pdefaults' - Returns default learning parameters
'needg' - Returns 1 if this function uses gW or gA
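For example, the metadata queries can be used as follows (a sketch; the values returned reflect the defaults documented above):

```matlab
pnames = learnsomb('pnames')     % cell array of learning parameter names
lp = learnsomb('pdefaults')      % lp.init_neighborhood = 3, lp.steps = 100
```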

Examples

This example defines a random input p, output a, and weight matrix w for a layer with a two-element input and six neurons. It also calculates the positions and distances for the neurons, which are arranged in a 2-by-3 hexagonal pattern.

p = rand(2,1);
a = rand(6,1);
w = rand(6,2);
pos = hextop(2,3);
d = linkdist(pos);
lp = learnsomb('pdefaults');

Because learnsomb needs only these values to calculate a weight change (see Algorithms), use them to do so:

ls = [];
[dW,ls] = learnsomb(w,p,[],[],a,[],[],[],[],d,lp,ls)

Network Use

You can create a standard network that uses learnsomb with selforgmap. To prepare the weights of layer i of a custom network to learn with learnsomb:

  1. Set NET.trainFcn to 'trainr'. (NET.trainParam automatically becomes trainr's default parameters.)

  2. Set NET.adaptFcn to 'trains'. (NET.adaptParam automatically becomes trains's default parameters.)

  3. Set each NET.inputWeights{i,j}.learnFcn to 'learnsomb'.

  4. Set each NET.layerWeights{i,j}.learnFcn to 'learnsomb'. (Each weight learning parameter property is automatically set to learnsomb's default parameters.)

To train the network (or enable it to adapt):

  1. Set NET.trainParam (or NET.adaptParam) properties as desired.

  2. Call train (or adapt).
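The steps above can be sketched as follows for a custom network; here i and j are placeholder indices for the weights you want to train, and P is an R-by-Q matrix of training inputs:

```matlab
% Sketch: configure weights {i,j} of a custom network net to learn
% with learnsomb, then train it in batch mode with trainr.
net.trainFcn = 'trainr';                      % batch training function
net.adaptFcn = 'trains';                      % sequential adaption function
net.inputWeights{i,j}.learnFcn = 'learnsomb';
net.layerWeights{i,j}.learnFcn = 'learnsomb';
net.trainParam.epochs = 200;                  % set training parameters as desired
net = train(net,P);                           % train on the inputs P
```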

More About


Algorithms

learnsomb calculates the weight changes so that each neuron's new weight vector is the weighted average of the input vectors that the neuron and neurons in its neighborhood responded to with an output of 1.

The ordering phase lasts as many steps as LP.steps.

During this phase, the neighborhood is gradually reduced from a maximum size of LP.init_neighborhood down to 1, where it remains from then on.
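A minimal sketch of this batch update (not the toolbox implementation): each neuron whose neighborhood responded to at least one input moves its weight vector to the average of those inputs. The variable names follow the earlier example; nbh is the current neighborhood size.

```matlab
% a: S-by-Q outputs (1 for responding neurons), d: S-by-S link distances,
% p: R-by-Q inputs, w: S-by-R weights, nbh: current neighborhood size
inNbh = double(d <= nbh);       % S-by-S mask: neurons within the neighborhood
resp = inNbh * a;               % S-by-Q: neuron or one of its neighbors responded
num = resp * p';                % S-by-R: summed inputs per neuron
den = sum(resp,2);              % S-by-1: response counts per neuron
idx = den > 0;                  % update only neurons that responded
dw = zeros(size(w));
dw(idx,:) = num(idx,:)./den(idx) - w(idx,:);  % step to the weighted average
```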

