
Highlights from
Deep Learning Toolbox

  • F=makeLMfilters
    Returns the LML filter bank of size 49x49x48 in F.
  • X=flipall(X)
  • allcomb(varargin)
    ALLCOMB - All combinations
  • caeapplygrads(cae)
  • caebbp(cae)
  • caebp(cae, y)
  • caedown(cae)
  • caenumgradcheck(cae, x, y)
  • caesdlm(cae, opts, m)
  • caetrain(cae, x, opts)
  • caeup(cae, x)
  • cnnapplygrads(net, opts)
  • cnnbp(net, y)
  • cnnff(net, x)
  • cnnnumgradcheck(net, x, y)
  • cnnsetup(net, x, y)
  • cnntest(net, x, y)
  • cnntrain(net, x, y, opts)
  • dbnsetup(dbn, x, opts)
  • dbntrain(dbn, x, opts)
  • dbnunfoldtonn(dbn, output...
    DBNUNFOLDTONN Unfolds a DBN to a NN
  • expand(A, S)
    EXPAND Replicate and tile each element of an array, similar to repmat.
  • f=tanh_opt(A)
  • flicker(X,fps)
  • fliplrf(x)
    FLIPLR Flip matrix in left/right direction.
  • flipudf(x)
    FLIPUD Flip matrix in up/down direction.
  • im2patches(im,m,n)
  • isOctave()
    detects if we're running Octave
  • max3d(X, M)
  • myOctaveVersion()
    return OCTAVE_VERSION or 'undefined' as a string
  • nnapplygrads(nn)
    NNAPPLYGRADS updates weights and biases with calculated gradients
  • nnbp(nn)
    NNBP performs backpropagation
  • nnchecknumgrad(nn, x, y)
  • nneval(nn, loss, train_x,...
    NNEVAL evaluates performance of neural network
  • nnff(nn, x, y)
    NNFF performs a feedforward pass
  • nnpredict(nn, x)
  • nnsetup(architecture)
    NNSETUP creates a Feedforward Backpropagate Neural Network
  • nntest(nn, x, y)
  • nntrain(nn, train_x, trai...
    NNTRAIN trains a neural net
  • nnupdatefigures(nn,fhandl...
    NNUPDATEFIGURES updates figures during training
  • normalize(x, mu, sigma)
  • patches2im(patches,n,m)
  • r=visualize(X, mm, s1, s2)
  • randp(P,varargin)
    RANDP - pick random values with relative probability
  • rbmdown(rbm, x)
  • rbmtrain(rbm, x, opts)
  • rbmup(rbm, x)
  • rnd(x)
  • saesetup(size)
  • saetrain(sae, x, opts)
  • scaesetup(cae, x, opts)
  • scaetrain(scae, x, opts)
  • sigm(P)
  • sigmrnd(P)
  • softmax(eta)
  • test_cnn_gradients_are_nu...
  • test_example_CNN
    ex1 Train a 6c-2s-12c-2s Convolutional neural network
  • test_example_DBN
    ex1 train a 100 hidden unit RBM and visualize its weights
  • test_example_NN
  • test_example_SAE
    ex1 train a 100 hidden unit SDAE and use it to initialize a FFNN
  • test_nn_gradients_are_num...
  • whiten(X, fudgefactor)
  • x=randcorr(n,R)
    RANDCORR Generates correlated random variables
  • zscore(x)
  • caeexamples.m
    mnist data
  • runalltests.m
  • View all files
4.4 | 25 ratings | 740 Downloads (last 30 days) | File Size: 14.1 MB | File ID: #38310

Deep Learning Toolbox



24 Sep 2012 (Updated 12 May 2014)

Deep Belief Nets, Stacked Autoencoders, Convolutional Neural Nets and more. With examples.


File Information

A Matlab toolbox for Deep Learning.
Deep Learning is a new subfield of machine learning that focuses on learning deep hierarchical models of data. It is inspired by the human brain's apparent deep (layered, hierarchical) architecture. A good overview of the theory of Deep Learning is Learning Deep Architectures for AI.

For a more informal introduction, see the following videos by Geoffrey Hinton and Andrew Ng.

The Next Generation of Neural Networks (Hinton, 2007)
Recent Developments in Deep Learning (Hinton, 2010)
Unsupervised Feature Learning and Deep Learning (Ng, 2011)
If you use this toolbox in your research please cite:

Prediction as a candidate for learning deep hierarchical models of data (Palm, 2012)

Directories included in the toolbox
NN/ - A library for Feedforward Backpropagation Neural Networks

CNN/ - A library for Convolutional Neural Networks

DBN/ - A library for Deep Belief Networks

SAE/ - A library for Stacked Auto-Encoders

CAE/ - A library for Convolutional Auto-Encoders

util/ - Utility functions used by the libraries

data/ - Data used by the examples

tests/ - Unit tests to verify the toolbox is working

For references on each library check
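As a quick orientation, the NN/ library can be driven along the lines of the bundled test_example_NN. The following is a sketch, not a verbatim copy: it assumes the MNIST data shipped in data/ (train_x as N-by-784 pixels, train_y as N-by-10 one-hot labels) and the function names listed above.

```matlab
% Minimal training run (sketch, following the pattern of test_example_NN).
load mnist_uint8;                  % data file shipped in data/
train_x = double(train_x) / 255;   % scale pixel values to [0, 1]
test_x  = double(test_x)  / 255;
train_y = double(train_y);
test_y  = double(test_y);

nn = nnsetup([784 100 10]);        % 784-100-10 feedforward network
opts.numepochs = 1;                % one full sweep through the data
opts.batchsize = 100;              % gradient averaged over 100 samples
nn = nntrain(nn, train_x, train_y, opts);

[er, bad] = nntest(nn, test_x, test_y);  % er: error rate, bad: misclassified indices
```

The same setup/train/test pattern recurs across the other libraries (dbnsetup/dbntrain, saesetup/saetrain, and so on).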

Required Products MATLAB
MATLAB release MATLAB 7.11 (R2010b)
Comments and Ratings (33)
01 Feb 2015 Aik Hong

Hi, I tried to run test_example_DBN.m and I get the error below:

??? Attempted to access lmisys(5); index out of bounds because numel(lmisys)=4.

Error in ==> lmiunpck at 23
rs=lmisys(4); rv=lmisys(5); % row sizes of LMISET,LMIVAR

Error in ==> nnsetup at 26

Error in ==> dbnunfoldtonn at 10
nn = nnsetup(size);

Is something not right there? I didn't change anything in the code. Or is it something related to my Matlab version?

Comment only
30 Jan 2015 Po Sheng Wang

When I run "test_example_CNN", I get the error below:

assert(~isOctave() || compare_versions(OCTAVE_VERSION, '3.8.0', '>='), ['Octave 3.8.0 or greater is required for CNNs as there is a bug in convolution in previous versions. See Your version is ' myOctaveVersion]);

It seems my Octave is too old. How can I update it? Is it enough to just download and install the new version? Thanks.

Comment only
10 Jan 2015 Chris McCormick

I found the CNN library very informative for helping me learn more of the basics of Convolutional Neural Networks.

It's well written, though lacking in comments and documentation.

I wrote a post explaining the CNN example along with documented / commented versions of most of the CNN functions:

Comment only
05 Jan 2015 Hazem

I can't understand the example, what is this data and what the example used for?

24 Dec 2014 qu

Is there someone who would like to tell me how to install it?

19 Oct 2014 Hanan Shteingart

Poor documentation makes this hardly usable.

For example:

function scae = scaesetup(cae, x, opts)
x = x{1};

The code starts straight away without explaining any of the parameters. What is x? opts? cae? You can look at the example code, but it is hard to reverse engineer.

13 Aug 2014 Umer

29 Jul 2014 Tarek El-Gaaly

I just started using this code and was puzzled by the following (please excuse my newbie questions):

How come the errors on MNIST in the examples are less than state of the art? I get ~0.07 error on the 2-layer DBN-NN in the example. State-of-the-art, as far as I am aware, is higher than this.

Also, when visualizing the dbn.rbm{2}.W' layer I see pretty much garbage. There is no structure to the weights like in dbn.rbm{1}.W'. What has to be done to enable higher-level structure learning?

20 Jul 2014 Jane Shen

How do I download this valuable toolbox? Is GitHub the only way?

18 Jul 2014 BO

nntest.m has an error: it will give all 1s for expected, rather than the column index for each row of y.

function [er, bad] = nntest(nn, x, y)
labels = nnpredict(nn, x);
expected = I;
bad = find(labels ~= expected);
er = length(bad) / size(x, 1);
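If the shipped nntest really reads as quoted, a possible fix can be sketched as follows: derive the expected class from the one-hot rows of y (instead of the undefined I), matching the class-index convention nnpredict uses. This is my reading of the bug report, not the author's patch.

```matlab
function [er, bad] = nntest(nn, x, y)
    labels = nnpredict(nn, x);       % predicted class index per sample
    [~, expected] = max(y, [], 2);   % column index of the 1 in each one-hot row
    bad = find(labels ~= expected);  % samples where prediction and label disagree
    er  = numel(bad) / size(x, 1);   % misclassification rate
end
```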

17 Jul 2014 Bruce Ferguson

It does not run under version R2014a. All tests crash.

Comment only
11 Jul 2014 Nadith

The best, cleanest code that anyone could get at the moment... :) Thanks a lot. I'm working on extending the network capabilities for my own work.

23 Jun 2014 Junhong YE  
10 Jun 2014 satya narayana  
10 Jun 2014 ming

10 Jun 2014 Xiaotian Wang  
03 Jun 2014 BUITEMS Quetta

How can I download this toolbox? I really need it.

17 May 2014 Yao

17 May 2014 Yao

Comment only
11 May 2014 Rania

How can I download this file, please?

Comment only
01 May 2014 Litao Shen

I have a problem using nnbp: it references nn.e, but nnsetup does not create an e field on nn. What should I do?
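As far as I can tell, nn.e is not meant to exist after nnsetup; it is assigned during the forward pass, so nnff must run before nnbp. A sketch of the intended call order (the nn.e assignment inside nnff is my reading of the code, not documented behaviour):

```matlab
nn = nnff(nn, x, y);   % forward pass: fills activations nn.a{...} and sets nn.e = y - nn.a{end}
nn = nnbp(nn);         % backpropagation: reads nn.e to compute the gradients
nn = nnapplygrads(nn); % updates weights and biases from the computed gradients
```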

23 Apr 2014 Kazuhito Sato

Thanks for your code.
This code is very useful and exciting for me.

03 Apr 2014 ted p teng

13 Mar 2014 Yong Ho

Thanks for your code! But when I execute cnnexamples in the CNN folder after changing the cnn.layers kernel size from 5 to 4, I get an error in cnnbp.m at line 37: "Array dimensions must match for binary array op."
Please check this error message. Thanks.

15 Feb 2014 Andrew Diamond

Just tried to run the 2nd DBN example and it failed. First, the assert at line 6 of rbmtrain.m failed. From what I can see, the assert should be == 0, not ~= 0 (numbatches should be an integer).

Secondly, the example failed at nnff line 14. From what I can tell, the last size in dbn.sizes should be 10, as that is the y (output layer) size.
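The commenter's first point amounts to a one-line change in rbmtrain.m. A sketch, assuming numbatches is computed from opts.batchsize as elsewhere in the toolbox:

```matlab
numbatches = size(x, 1) / opts.batchsize;
% The check should pass when numbatches IS an integer, i.e. the remainder is zero:
assert(rem(numbatches, 1) == 0, 'numbatches not integer');
```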

Comment only
31 Jan 2014 Géraud

Thanks for your code! I would like to know if there is more documentation than the few examples that are given, particularly for cnn.layer in the CNN toolbox?

04 Dec 2013 Al

01 Jun 2013 random22

How do I find the probability of a label being selected with this code? I am new to neural networks and am just trying this toolbox out.

Comment only
12 Feb 2013 Johnathan  
27 Nov 2012 Jeff

02 Nov 2012 Ahmed

Well written code that saved me a lot of time.

24 Sep 2012 Sebastien PARIS

Updates

12 May 2014 Changed to use GitHub