
How can I do classification with a neural network?

Asked by seda on 17 Dec 2012

I want to do data classification using a learning rule. I tried to write some code, but I have to keep changing the learning rate to get an optimum solution.

load fisheriris
% 25 samples per class for training, the remaining 25 per class for testing
traindata = [meas(1:25,:); meas(51:75,:); meas(101:125,:)];
testdata  = [meas(26:50,:); meas(76:100,:); meas(126:150,:)];
traindata_class = [zeros(25,1); ones(25,1); -ones(25,1)];   % class codes 0 / 1 / -1
testdata_class  = [zeros(25,1); ones(25,1); -ones(25,1)];

% newff expects one column per sample
traindata = traindata';
testdata  = testdata';
traindata_class = traindata_class';
testdata_class  = testdata_class';

% 5 hidden tansig neurons, gradient descent with adaptive learning rate
net = newff(traindata, traindata_class, [5], {'tansig'}, 'traingdx');
net.trainParam.epochs   = 100;
net.trainParam.max_fail = 20;
[net, tr] = train(net, traindata, traindata_class);

cikis_ysa = sim(net, testdata);   % network output on the test set
cikis_ysa = round(cikis_ysa);     % round to the nearest class code
hata = 0;                         % misclassification count
for i = 1:75
    if cikis_ysa(i) - testdata_class(i) ~= 0
        hata = hata + 1;
    end
end
yuzdehata = 100*hata/75;          % percent error
dogruluk  = 100 - yuzdehata;      % percent accuracy
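
As a side note on the error count at the end: the counting loop could also be written in vectorized form. A small sketch, using the same variables as above:

hata      = sum(cikis_ysa ~= testdata_class);    % misclassified test samples
yuzdehata = 100 * hata / numel(testdata_class);  % percent error
dogruluk  = 100 - yuzdehata;                     % percent accuracy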


2 Answers

Answer by Greg Heath on 22 Dec 2012 (Accepted answer)

Your major mistakes were

1. Overriding the newff defaults maxepochs = 1000 and max_fail = 6

2. Not overriding the data division default trn/val/tst = 0.7/0.15/0.15

3. Not searching over 10 or more random weight initializations

When I make the change

net.divideFcn = '';

there is no validation set, so the max_fail specification is moot.

Then, initializing the random number generator and looping over Ntrials = 20 weight-initialization trials:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
rng(0)
Ntrials = 20;
for i = 1:Ntrials
    net = newff(traindata, traindata_class, [5], {'tansig'}, 'traingdx');
    net.trainParam.epochs = 100;   % the question's 100-epoch limit, kept for this first run
    net.divideFcn = '';            % no trn/val/tst division
    [net, tr, Ytrn] = train(net, traindata, traindata_class);
    Nepochs(i,1) = tr.epoch(end);
    ytrn = round(Ytrn);
    Nerrtrn(i,1) = sum(ytrn ~= traindata_class);
    ytst = round( sim(net, testdata) );
    Nerrtst(i,1) = sum(ytst ~= testdata_class);
end

disp( 'Nepochs Nerrtrn Nerrtst' )
disp( [ Nepochs , Nerrtrn , Nerrtst ] )

Nepochs Nerrtrn Nerrtst

   100     7     8
   100    18    15
   100    12    15
   100     9    13
   100    13    12
   100     7    10
   100     8    12
   100    30    29
   100    10    11
   100    48    49
   100     6    11
   100     8     9
   100    29    25
   100    15    15
   100     7    10
   100    42    43
   100     6    13
   100    37    37
   100     6    12
   100     9     8

Whereas the newff default maxepochs = 1000 yields

Nepochs Nerrtrn Nerrtst

        1000           0           7
        1000           0           7
        1000           1           7
        1000           1           8
        1000           1           7
        1000           1           7
        1000           1           8
        1000           3           7
        1000           0           7
        1000           1           7
        1000           1           7
        1000           1           7
        1000           0           7
        1000           2           7
        1000           0           7
        1000           3           6
        1000           0           7
        1000           0           7
        1000           1           7
        1000           3           6

However, you would probably do much better using the newff default training and learning functions.

net = newff(traindata,traindata_class,5);
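
For instance, a minimal sketch along those lines (assuming the same traindata/testdata variables and 0/1/-1 target coding as above; newff's default training function is trainlm):

rng(0)
net = newff(traindata, traindata_class, 5);    % defaults: trainlm training, dividerand division
net.divideFcn = '';                            % train on all 75 training samples
[net, tr] = train(net, traindata, traindata_class);
ytst = round( sim(net, testdata) );            % predicted class codes on the test set
Nerrtst = sum(ytst ~= testdata_class)          % number of test-set misclassifications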


Answer by Greg Heath on 21 Dec 2012 (edited by Greg Heath on 21 Dec 2012)

close all, clear all, clc
tic

[ x, t ] = iris_dataset;                  % inputs and one-of-three targets
whos
[ I, N ] = size(x)                        % [ 4 150 ]
[ O, N ] = size(t)                        % [ 3 150 ]

itrn = [ (1:25), (51:75), (101:125) ];    % training indices, 25 per class
itst = [ (26:50), (76:100), (126:150) ];  % test indices, 25 per class
xtrn = x( : , itrn);  xtst = x( : , itst);
ttrn = t( : , itrn);  ttst = t( : , itst);
trnclass0 = vec2ind(ttrn)                 % true training class indices (1/2/3)
tstclass0 = vec2ind(ttst)                 % true test class indices

H = 5                                     % number of hidden neurons
Ntrials = 10                              % number of weight-initialization trials
rng(0)

for i = 1:Ntrials

    net = newpr(xtrn,ttrn,H); % See help newpr
    net.divideFcn = '';           % Override default  0.7/0.15/0.15
    [net, tr, Ytrn, Etrn] = train(net,xtrn,ttrn);
    Nepochs(i,1) = tr.epoch(end);
    trnclass = vec2ind(Ytrn);
    Nerrtrn(i,1) = sum(trnclass ~= trnclass0);
    Ytst = sim(net,xtst);
    tstclass = vec2ind(Ytst);
    Nerrtst(i,1) = sum(tstclass ~= tstclass0);

end

disp( 'Nepochs Nerrtrn Nerrtst ');

disp( [ Nepochs, Nerrtrn, Nerrtst ] );

time = toc
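
If your targets start out as numeric class labels rather than the 3x150 one-of-three matrix that iris_dataset provides, a small sketch of the conversion (the variable name labels below is just for illustration):

labels = [ ones(1,50), 2*ones(1,50), 3*ones(1,50) ];  % hypothetical 1xN class labels
t = full( ind2vec(labels) );                          % 3xN one-of-three target matrix for newpr
% train as above, then recover predicted class labels with vec2ind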

Thank you for formally accepting my answer.

Greg
