Code covered by the BSD License

Neural Network add-in for PSORT

28 Nov 2010 (Updated )

This add-in allows a neural network to be trained using the Particle Swarm Optimization (PSO) technique.
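As a rough illustration of what PSO-based training does, each particle encodes a candidate set of NN weights and is moved through weight space by the standard PSO update rule. This is a minimal sketch, not the add-in's actual implementation; all variable names (`pos`, `vel`, `pbest`, `gbest`, `mse_of_weights`) are assumptions for illustration:

```matlab
% Minimal PSO update sketch (illustrative only; names are assumptions,
% not the add-in's actual code). Each row of pos is one particle's
% candidate NN weight vector.
w  = 0.7;             % inertia weight
c1 = 1.5; c2 = 1.5;   % cognitive and social acceleration coefficients
for iter = 1:maxIter
    for i = 1:numParticles
        r1 = rand(1, size(pos,2));
        r2 = rand(1, size(pos,2));
        vel(i,:) = w*vel(i,:) ...
                 + c1*r1.*(pbest(i,:) - pos(i,:)) ... % pull toward this particle's best
                 + c2*r2.*(gbest      - pos(i,:));    % pull toward the swarm's best
        pos(i,:) = pos(i,:) + vel(i,:);
        err = mse_of_weights(pos(i,:)); % hypothetical helper: NN error for these weights
        if err < pbestErr(i)
            pbest(i,:) = pos(i,:); pbestErr(i) = err;
        end
        if err < gbestErr
            gbest = pos(i,:); gbestErr = err;
        end
    end
end
```

In this view, `net.trainParam.goal` below corresponds to the target error for `gbestErr`, and `net.trainParam.epochs` to `maxIter`.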

NN_training_demo
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%   This function copyright 2010, 2011, 2012 Tricia Rambharose
%   Created on: 2010/09/18
%   info@tricia-rambharose.com
% This is a simple function used as an example of creating a 
% neural network (NN) and using a PSO algorithm as the training function.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function NN_training_demo

clc %Clear the command window.
close all %Close all open figures.
clear all %Clear workspace variables.

P = [5	8	2	8; 8	2	3	5]; %NN inputs
T = [10	4	7	2]; %NN targets

net = newff(P,T,2); %create a NN with 2 neurons in the hidden layer

net.trainFcn = 'trainpso'; %set the NN training function to use a PSO approach. The name 'trainpso' is used for consistency with previous research (B. Birge, 2005)             
net.layers{1}.transferFcn = 'tansig'; %sigmoidal transfer function for the hidden layer
net.layers{2}.transferFcn = 'purelin'; %linear transfer function for output layer 
net.trainParam.goal = 0.001; % performance goal, equivalent to the PSO true global minimum.
net.trainParam.epochs = 1000; % equivalent to the number of PSO iterations.
net.trainParam.max_fail = 10; % maximum number of consecutive PSO iterations with no improvement in performance.
net.trainParam.plotPSO = true; % option to plot PSO particles
                       
Y = sim(net,P); %Simulate the NN before training (output of the untrained network)

display_NN_settings;

[net,tr] = train(net,P,T); %train NN

display_NN_results;

plot_epochs; 

end
