
Simple Machine Learning Algorithms for Classification

version 1.0.5 (6.5 KB) by Jingwei Too
Simple and easy to implement. The machine learning algorithms include KNN, SVM, LDA, NB, RF, and DT.

94 Downloads

Updated 22 Jul 2020

This toolbox contains six widely used machine learning algorithms:
(1) K-nearest Neighbor (KNN)
(2) Support Vector Machine (SVM)
(3) Decision Tree (DT)
(4) Discriminant Analysis Classifier (DA)
(5) Naive Bayes (NB)
(6) Random Forest (RF)

The "Main" script shows examples of how to use these machine learning programs with the benchmark data set.

The displayed results include:
(1) Accuracy for each fold in k-fold cross-validation
(2) Average accuracy over the k folds
(3) Confusion matrix
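
A minimal usage sketch in MATLAB, assuming the Fisher iris data as a stand-in benchmark; the jKNN call mirrors the one quoted in the comments below, while the parameter values and the interpretation of the output are assumptions:

load fisheriris                       % example data: meas (features), species (labels)
feat  = meas;                         % feature matrix (samples x features)
label = species;                      % class labels
kfold = 10;                           % number of cross-validation folds
k     = 5;                            % number of neighbors for KNN
KNN   = jKNN(feat, label, k, kfold);  % expected to report per-fold accuracy, average accuracy, and confusion matrix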

**********************************************************************************************************************************
Details of the machine learning methods can be found in the following papers:
[1] Too, J., Abdullah, A.R. and Saad, N.M., 2019. Classification of hand movements based on discrete wavelet transform and enhanced feature extraction. Int. J. Adv. Comput. Sci. Appl, 10(6), pp.83-89.
DOI: http://dx.doi.org/10.14569/IJACSA.2019.0100612

[2] Too, J., Abdullah, A.R. and Mohd Saad, N., 2019. Binary competitive swarm optimizer approaches for feature selection. Computation, 7(2), p.31.
DOI: https://doi.org/10.3390/computation7020031

Cite As

Too, Jingwei, et al. “Classification of Hand Movements Based on Discrete Wavelet Transform and Enhanced Feature Extraction.” International Journal of Advanced Computer Science and Applications, vol. 10, no. 6, The Science and Information Organization, 2019, doi:10.14569/ijacsa.2019.0100612.

Too, Jingwei, et al. “Binary Competitive Swarm Optimizer Approaches for Feature Selection.” Computation, vol. 7, no. 2, MDPI AG, June 2019, p. 31, doi:10.3390/computation7020031.

Comments and Ratings (15)

Chang hsiung

Amit DOegar

Nice work, we appreciate it. In random forest, when the features are numeric and the response is categorical, an error occurs in confusionmat:
Error using confusionmat (line 71)
G and GHAT need to be the same type.
Error in jRF (line 28)
con=confusionmat(ytest,pred);
Kindly advise.
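
A minimal sketch of the likely type mismatch, assuming the predictions come back as a cell array of character vectors while ytest is categorical (the data and variable values here are hypothetical): confusionmat requires both label inputs to be of the same type.

ytest = categorical({'A'; 'B'; 'A'; 'B'});    % true labels as categorical
pred  = {'A'; 'A'; 'A'; 'B'};                 % predicted labels as a cell array
% confusionmat(ytest, pred)                   % fails: G and GHAT differ in type
con = confusionmat(ytest, categorical(pred))  % works once both inputs are categorical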

djim djim

I have got an error:
Undefined function 'fitcknn' for input arguments of type 'double'.

Error in jKNN (line 12)
Model=fitcknn(feat,label,'NumNeighbors',k,'Distance','euclidean');

Error in Main (line 18)
KNN=jKNN(feat,label,k,kfold);

Could you help?

Financeo Putra

Excuse me, is there a way to display the confusion matrix for each fold?

Financeo Putra

Thank you very much for your answer, Jingwei. I hope you don't mind if I ask another question. I already display the sensitivity and specificity values using classperf. My question is: how do I display the sensitivity and specificity values for each fold, and is it possible to display the confusion matrix for each fold? Thank you very much for your reply.

Jingwei Too

Dear Financeo Putra,
If you do 5-fold cross-validation, the data is split into 80% training and 20% testing, and the test is performed 5 times. However, dividing the data into 80% training and 20% testing and then performing 10-fold cross-validation is not possible, since 10-fold cross-validation splits the data into 90% training and 10% testing and performs the test 10 times.
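
A minimal sketch of those split sizes, using cvpartition on example data (the data set and the use of cvpartition here are assumptions; the toolbox may partition the data differently internally):

load fisheriris
cv = cvpartition(species, 'KFold', 5);        % stratified 5-fold partition
for i = 1:cv.NumTestSets
    % each fold trains on ~80% of the samples and tests on ~20%
    fprintf('Fold %d: %d train / %d test\n', i, sum(cv.training(i)), sum(cv.test(i)));
end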

Financeo Putra

Thank you for your reply, Jingwei, I really appreciate it. What I mean is: is it possible to divide the data into 80% training and 20% testing and then perform 10-fold validation with your code? I already tried holdout, but it only performs one test. Can I do 10-fold validation with a data partition? Thanks for your answer, Jingwei, I really do appreciate it.

Jingwei Too

Dear Financeo Putra,
My source code is only applicable when you wish to apply k-fold cross-validation. Moreover, cross-validation gives you more comprehensive results than the hold-out method.

Financeo Putra

Excuse me, could you tell me how to split the data into training and testing sets (80:20) with your code?

fatma yasar

Jingwei Too

Dear SACHIN PATEL,
I do not provide a program for plotting. You need to use a scatter plot or other relevant code.
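
One hedged example of such a scatter plot, assuming the Fisher iris data and plotting the first two features colored by class with gscatter (the data set and feature choice are assumptions for illustration only):

load fisheriris
gscatter(meas(:,1), meas(:,2), species);      % first two features, colored by class
xlabel('Feature 1'); ylabel('Feature 2');
title('Class scatter of two features');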

SACHIN PATEL

How can I plot the LDA result? Please include the code.

Rasool Reddy

good code, thanks

Tee Wei Hown

MATLAB Release Compatibility
Created with R2018a
Compatible with any release
Platform Compatibility
Windows macOS Linux
