Simple Neural Networks with K-fold Cross-Validation Manner

version 1.0.4 (6.77 KB) by Jingwei Too
This toolbox contains 6 types of neural networks (NN) using k-fold cross-validation, which are simple and easy to implement.

29 Downloads

Updated 22 Jul 2020


This toolbox contains six types of neural networks:
(1) Artificial neural network (ANN)
(2) Feed Forward Neural Network (FFNN)
(3) Cascade Forward Neural Network (CFNN)
(4) Recurrent Neural Network (RNN)
(5) Generalized Regression Neural Network (GRNN)
(6) Probabilistic Neural Network (PNN)

The "Main" script shows the examples of how to use these neural network programs with the benchmark data set.

The displayed results include:
(1) Accuracy for each fold in k-fold cross-validation
(2) Average accuracy over k-folds
(3) Confusion matrix
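
A minimal usage sketch in MATLAB, pieced together from the jNN call quoted in a comment further down this page (NN=jNN(feat,label,kfold,Hiddens,Maxepochs)); the data-loading lines use the fisheriris sample data from the Statistics and Machine Learning Toolbox, and the exact contents of the returned struct are an assumption, not toolbox documentation:

% Minimal sketch: train and evaluate one network with k-fold CV.
load fisheriris                        % any labelled benchmark data set
feat  = meas;                          % instances x features matrix
label = grp2idx(species);              % positive-integer class labels
kfold     = 10;                        % number of cross-validation folds
Hiddens   = [10 10 10];                % neurons in each hidden layer
Maxepochs = 100;                       % maximum training epochs
NN = jNN(feat, label, kfold, Hiddens, Maxepochs);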

Cite As

Jingwei Too (2020). Simple Neural Networks with K-fold Cross-Validation Manner (https://www.mathworks.com/matlabcentral/fileexchange/71468-simple-neural-networks-with-k-fold-cross-validation-manner), MATLAB Central File Exchange. Retrieved .

Comments and Ratings (29)

Vahid Aryai

arun anoop m

Authors, kindly mention the published paper for this code.

sui yongbo

good!

Joana

Hi
Great program.
But if I want to change the k-fold CV to pseudo-online training, like 60:20:20 for training, validation, and testing, how can I do that?
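
One possible approach, as a sketch: bypass the toolbox's k-fold loop and use MATLAB's built-in data-division parameters to get a fixed 60/20/20 split (feat and label are the variables from Main):

net = patternnet([10 10 10]);                % same network type jNN uses
net.divideFcn = 'dividerand';                % random sample division
net.divideParam.trainRatio = 0.60;
net.divideParam.valRatio   = 0.20;
net.divideParam.testRatio  = 0.20;
net = train(net, feat', dummyvar(label)');   % train with a 60/20/20 split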

Amirah Nabilah

Hello Jingwei, great toolbox. But how do I increase the iterations in this code to 30 instead of only 7? I only get 74% accuracy.
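
If "iterations" here means training epochs, one sketch is to raise the Maxepochs argument in the Main call (signature taken from a comment further down; whether this is what the question means is an assumption):

Maxepochs = 30;                                   % raise the epoch limit
NN = jNN(feat, label, kfold, Hiddens, Maxepochs);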

Arka Roy

Getting numerous errors...

shadi aosati

Hello. I want to adapt this code for ELM. Can you help me do it?

Joana

Jingwei Too

Dear Naina,

I think your data preparation was not right. Normally we do not use raw data for classification. You need to extract features from the EEG signals and feed only the feature vector into the algorithm for classification.

Hope this helps.

Joana

Secondly, I want the designed model 'net' to be saved, as I want to use it to classify another dataset. I tried the save net command but it didn't work.
Can you please help?
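
One way this can work, assuming the toolbox function is edited to also return the trained net object, is the functional form of save/load (the command form save net writes to net.mat, which may be why it seemed to fail); xnew is a hypothetical new dataset:

save('trainedNet.mat', 'net');         % store the trained network object
S = load('trainedNet.mat');            % reload it later
ypred = vec2ind(S.net(xnew'));         % classify the new data xnew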

Joana

Hi Jingwi,
Thanks for the reply. :)

I have one more question regarding data preparation for the input. I have EEG data for 2 classes, recorded at 1200 Hz with 32 EEG channels. I have extracted one second per trial, for 100 trials per class. So the data is in the format channels x samples x trials = 32 x 1200 x 200.
I tried your program by converting it to 2D as 38,400 x 200, so the input layer has 38,400 neurons, with 3 hidden layers of [10 10 10] neurons and one output neuron.
Is this the right way to do it, or should I try something else?

I also tried using cell arrays, but that gives the error 'Data distribution doesn't have equal number of time steps'.

I'd highly appreciate it if you can guide me on this one. :)
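
For reference, the flattening described above could look like this (X and the label layout are assumed variable names, not from the toolbox):

X = rand(32, 1200, 200);               % placeholder for the EEG array
feat  = reshape(X, 32*1200, 200)';     % 200 instances x 38,400 features
label = [ones(100,1); 2*ones(100,1)];  % two classes, 100 trials each

Note that 38,400 raw inputs is very large; extracting features first, as advised in the reply above, would shrink this considerably.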

Jingwei Too

Dear Naina,

This error "Requested 1x1045164955648 (7787.1GB) array exceeds maximum array size preference" means your pc RAM is not enough to run this program, your data is too big.

Second, you can go through "help" and go to "neural network toolbox" to get the detail explanation of my program since I am designing the program using MatLab built-in function.
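
The memory-hungry step in the trace below is the Jacobian used by trainlm; one workaround (a different optimizer, not just a memory fix) is to switch to a gradient-only training function before calling train, as a sketch:

net.trainFcn = 'trainscg';             % scaled conjugate gradient
net = train(net, xtrain', dummyvar(ytrain)');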

Joana

If I run the code for jPNN, it gives this error:

Error using network/subsasgn>network_subsasgn (line 555)
net.IW{1,1} must be a 204-by-226 matrix.

Error in network/subsasgn (line 14)
net = network_subsasgn(net,subscripts,v,netname);

Error in newpnn>create_network (line 125)

How do I solve this, please?

Joana

Hi
I am trying to use the code for EEG classification, and I get this error:

Error using zeros
Requested 1x1045164955648 (7787.1GB) array exceeds maximum array size preference. Creation of arrays
greater than this limit may take a long time and cause MATLAB to become unresponsive. See array size limit
or preference panel for more information.

Error in nnMex.perfsJEJJ>iJacobianFunctionForDirection (line 45)
TEMP = zeros(1,ceil(hints.tempSizeFJ/8)*8);

Error in nnMex.perfsJEJJ (line 8)
[jacobian, TEMP] = iJacobianFunctionForDirection( direction, hints );

Error in nnCalcLib/perfsJEJJ (line 388)
lib.calcMode.perfsJEJJ(calcNet,lib.calcData,lib.calcHints);

Error in trainlm>initializeTraining (line 169)
[worker.perf,worker.vperf,worker.tperf,worker.je,worker.jj,worker.gradient] = calcLib.perfsJEJJ(calcNet);

Error in nnet.train.trainNetwork>trainNetworkInMainThread (line 28)
worker = localFcns.initializeTraining(archNet,calcLib,calcNet,tr);

Error in nnet.train.trainNetwork (line 16)
[archNet,tr] = trainNetworkInMainThread(archNet,rawData,calcLib,calcNet,tr,feedback,localFcns);

Error in trainlm>train_network (line 160)
[archNet,tr] = nnet.train.trainNetwork(archNet,rawData,calcLib,calcNet,tr,localfunctions);

Error in trainlm (line 59)
[out1,out2] = train_network(varargin{2:end});

Error in network/train (line 373)
[net,tr] = feval(trainFcn,'apply',net,data,calcLib,calcNet,tr);

Error in jRNN (line 28)
net=train(net,xtrain',dummyvar(ytrain)');

How do I solve it, please?

Joana

Secondly, can you please share any document that clarifies the theory behind the ANN? Is it an MLP, deep learning (as it has more than 3 layers), or some sort of feedforward neural network? I'm confused about what exactly it is, this ANN with multiple layers in it.
If I have to cite your code, then I need to know the theory behind it. So please share something definitive describing what exactly it is. :)

DA Bong

It's very useful for simple classification and evaluation.

Jingwei Too

Dear ILIAS TOUGUI,

I have done some testing by changing label(1:50)=0. Based on my observation, my toolbox cannot read a 'label' with value 0. Since the 'label' is just a labelling for your dataset, you can change your labels of value 0 to another value, and it will not affect your findings.
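
A one-line remap along those lines (2 here is an arbitrary value not already used as a class label):

label(label == 0) = 2;                 % dummyvar needs positive integers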

ILIAS TOUGUI

Hello, and thank you for this toolset. However, I got an error while applying it to my dataset with a target variable of either 1 or 0.

Error using dummyvar (line 101)
Each element of GROUP must be a positive integer.

I noticed that dummyvar doesn't accept 0. How do I fix this problem?

Thank you.

Jingwei Too

Dear Muhammad Tariq Sadiq,

I haven't published any articles related to this toolbox, but you can press "F1" in MATLAB for more information. Since the toolbox is built on MATLAB built-in functions, the descriptions and details can be found in MATLAB. You can also search Google Scholar for more information. Hope this helps.

Muhammad Tariq Sadiq

@Jingwei Too can you please share the articles you used to implement these methods? I need details of these methods.

Jingwei Too

Dear Ali,

My toolbox is limited to classification tasks. You will need to find other relevant code for regression.

Ali

Is it only limited to classification problems? Can I apply it to regression problems?

Muhammad Tariq Sadiq

I tried to run the Main file and got the following error; kindly help me remove it:

Default value is not a member of type "nntype.performance_fcn".
Error using nnetParamInfo (line 28)
Conversion to struct from double is not possible.

Error in patternnet>get_info (line 91)
nnetParamInfo('performFcn','Performance Function','nntype.performance_fcn','crossentropy',...

Error in patternnet (line 41)
if isempty(INFO), INFO = get_info; end

Error in jNN (line 6)
h1=Hiddens(1); net=patternnet(h1);

Error in Main (line 21)
NN=jNN(feat,label,kfold,Hiddens,Maxepochs);

Taiwo Fasae

Nice toolset, thanks! However, I noticed in some functions that the neural net isn't being initialized before every fold training. Do you think this will change the results? My hunch is that using the 'train' function on an existing neural net (with existing parameters) will give a different result.
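
Re-initializing the weights at the top of each fold, as suggested, could look like this inside the cross-validation loop (a sketch, not the toolbox's current code):

net = init(net);                       % fresh random weights per fold
net = train(net, xtrain', dummyvar(ytrain)');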

Umit Isikdag

Andres Moran Duran

Matlab

Tee Wei Hown

MATLAB Release Compatibility
Created with R2018a
Compatible with any release
Platform Compatibility
Windows macOS Linux
