
TabZim


Comments and Ratings by TabZim
Updated: 07 Mar 2014. File: Deep Neural Network (Author: Masayuki Tanaka). Provides deep learning tools for deep belief networks (DBNs).

Thanks a lot for enhancing our understanding with this well-commented code. I have a query regarding the sparsity constraint imposed on the RBM, i.e. in the pretrainRBM function. When updating the hidden biases according to the sparsity constraint, why have you multiplied the gradients by 2?

dsW = dsW + SparseLambda * 2.0 * bsxfun(@times, (SparseQ-mH)', svdH)';

dsB = dsB + SparseLambda * 2.0 * (SparseQ-mH) .* sdH;

This does not match any update equation given by Lee et al. Could you please elaborate on this? Many thanks!
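One possible reading (my own assumption, not confirmed by the code's author): Lee et al. use a cross-entropy-style sparsity penalty, but if the code instead penalizes the squared deviation of the mean hidden activation from the target SparseQ, the factor of 2 falls straight out of the chain rule:

    L_sparse = \lambda \sum_j (q - \hat{q}_j)^2,
    \hat{q}_j = \frac{1}{m} \sum_{i=1}^{m} \sigma(W_j x^{(i)} + b_j),

    \frac{\partial L_sparse}{\partial b_j}
      = -2 \lambda \, (q - \hat{q}_j) \, \frac{\partial \hat{q}_j}{\partial b_j}.

With \hat{q}_j playing the role of mH and the derivative term the role of sdH, this matches SparseLambda * 2.0 * (SparseQ - mH) .* sdH up to the sign convention, which depends on whether the gradient is added to or subtracted from the bias update.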

Updated: 14 Jul 2011. File: Feature Selection using Matlab (Author: Dimitrios Ververidis). Selects the subset of features that maximizes the Correct Classification Rate.

Can anyone tell me how to implement a wrapper with support vector machines? I have been trying to use the following code snippet for the purpose, but it always returns a single feature (the first one in the case of forward selection and the last one in the case of backward selection). Can anyone explain why this happens, or give another example demonstrating the feature selection process using an SVM? Many thanks in advance.

%% FISHERIRIS DATA

load fisheriris
X = randn(150,20);              % start from 20 noise features
X(:,1:4) = meas;                % columns 1-4 carry the real measurements
y = ismember(species,'setosa'); % binary labels: setosa vs. the rest
X = scaleData(X);               % to scale data in range [0,1]

%% CROSS VALIDATING

cc = cvpartition(y,'k',10);

%% SVM TRAINING AND TESTING FOR FEATURE SELECTION

opts = statset('display','iter');
% Criterion: misclassification count on the test fold. The labels are
% logical, so compare with ~= rather than strcmp (strcmp applied to
% logical arrays always returns false, making the criterion constant).
fun = @(Xtrain,Ytrain,Xtest,Ytest)...
    sum(Ytest ~= svmclassify(svmtrain(Xtrain,Ytrain),Xtest));

[fs,history] = sequentialfs(fun,X,y,'cv',cc,'options',opts,'nfeatures',3);

%% END OF CODE
