Thanks a lot for this well-commented code; it really helped my understanding. I have a question about the sparsity constraint imposed in the RBM, i.e. in the pretrainRBM function: when updating the hidden biases according to the sparsity constraint, why do you multiply the gradients by 2?
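A hedged guess at the factor of 2, assuming the code uses the common squared sparsity penalty (the symbols below are my notation, not necessarily the code's): if the penalty on hidden unit j is

    L_sparse = lambda * (rho - qhat_j)^2

where rho is the target activation and qhat_j is the mean activation of unit j over the batch, then the gradient with respect to the hidden bias b_j contains

    dL_sparse/db_j = -2 * lambda * (rho - qhat_j) * (dqhat_j/db_j)

so the 2 would simply fall out of differentiating the square. If the code instead uses a cross-entropy sparsity penalty, the 2 would have a different origin, so it would be good to hear from the author.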
Can anyone tell me how to implement a wrapper feature-selection method with support vector machines? I've been trying to use the following code snippet, but it always returns a single feature (the first one in the case of forward selection, the last one in the case of backward selection). Can anyone explain why this is happening, or give another example demonstrating feature selection with an SVM? Many thanks in advance.
%% FISHERIRIS DATA
load fisheriris;                      % provides species (150x1 cell array of labels)
X = randn(150,20);                    % 20 synthetic features, one row per observation
groups = ismember(species,'setosa');  % binary labels: setosa vs. the rest
X = scaleData(X);                     % scale data to the range [0,1]
%% CROSS VALIDATION
cc = cvpartition(groups,'k',10);      % 10-fold partition on the binary labels
%% SVM TRAINING AND TESTING FOR FEATURE SELECTION
opts = statset('display','iter');
fun = @(Xtrain,Ytrain,Xtest,Ytest)...
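The snippet cuts off at the criterion definition, so here is a hedged sketch of how the wrapper is usually completed with sequentialfs. The misclassification-count criterion and the svmtrain/svmclassify calls follow the standard Statistics Toolbox pattern, not necessarily the original code:

```matlab
% Criterion: cross-validated misclassification count (standard sequentialfs form)
fun = @(Xtrain,Ytrain,Xtest,Ytest)...
    sum(svmclassify(svmtrain(Xtrain,Ytrain),Xtest) ~= Ytest);

% Forward selection over the columns of X, evaluated on the 10-fold partition cc
[fs,history] = sequentialfs(fun,X,groups,'cv',cc,'options',opts);
selected = find(fs);   % column indices of the selected features
```

One possible reason for getting a single feature: sequentialfs stops adding (or removing) features as soon as the criterion stops improving, so with nearly uninformative features (e.g. random X, as above) forward selection can legitimately terminate after the first pick.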