Code covered by the BSD License  

Rating: 4.6 (5 ratings) | 134 downloads (last 30 days) | File size: 1.06 MB | File ID: #25247

myCNN


06 Sep 2009 (Updated )

myCNN is a MATLAB implementation of convolutional neural networks (CNNs).


File Information
Description

The first CNN appeared in the work of Fukushima in 1980 and was called the Neocognitron. The basic architectural ideas behind CNNs (local receptive fields, shared weights, and spatial or temporal subsampling) allow such networks to achieve some degree of shift and deformation invariance and at the same time reduce the number of training parameters.
Since 1989, Yann LeCun and co-workers have introduced a series of CNNs under the general name LeNet, which, contrary to the Neocognitron, use supervised training. In this case, the major advantage is that the whole network is optimized for the given task, which makes this approach usable for real-world applications.
LeNet has been successfully applied to character recognition, generic object recognition, face detection and pose estimation, obstacle avoidance in an autonomous robot, etc.
The myCNN class allows one to create, train, and test generic convolutional networks (e.g., LeNet), as well as more general networks with the following features:
- any directed acyclic graph can be used for connecting the layers of the network;
- the network can have any number of arbitrarily sized input and output layers;
- the neuron’s receptive field (RF) can have an arbitrary stride (the step of the local RF tiling), which means that in the S-layer RFs can overlap, and in the C-layer the stride can differ from 1;
- any layer or feature map of the network can be switched between trainable and non-trainable modes (and back), even during training;
- a new layer type: softmax-like M-layer.
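As a back-of-the-envelope illustration of the receptive-field stride arithmetic from the list above (a plain Python sketch, not part of the myCNN API): a layer that tiles an input of extent N with RFs of extent n at stride s produces floor((N - n) / s) + 1 positions per dimension.

```python
def rf_output_size(input_size, rf_size, stride):
    # Number of receptive-field positions when tiling one dimension
    # of the input: floor((input_size - rf_size) / stride) + 1.
    return (input_size - rf_size) // stride + 1

# LeNet-5-style first convolution: 32x32 input, 5x5 RF, stride 1 -> 28x28
assert rf_output_size(32, 5, 1) == 28
# Classic subsampling (S-)layer: 28x28 maps, 2x2 RF, stride 2 -> 14x14
assert rf_output_size(28, 2, 2) == 14
# Overlapping RFs in an S-layer: 28x28 maps, 3x3 RF, stride 2 -> 13x13
assert rf_output_size(28, 3, 2) == 13
```

With stride 1 this reduces to the familiar "valid" convolution size N - n + 1; strides greater than 1 simply skip RF positions.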
The archive contains the myCNN class source (with comments) and a simple example of LeNet5 creation and training.

All updates and new releases can be found here: http://sites.google.com/site/chumerin/projects/mycnn

Acknowledgements

Cnn Convolutional Neural Network Class inspired this file.

MATLAB release: MATLAB 7.2 (R2006a)
Other requirements: For proper training, the example needs the MNIST dataset, which can be downloaded from Yann LeCun's website http://yann.lecun.com/exdb/mnist/index.html or, in MATLAB format, from my web page http://sites.google.com/site/chumerin/projects/mycnn
Comments and Ratings (9)
18 Oct 2011 ben kamy

Hi,
Can anyone tell me whether it is possible to implement a face recognition application using CNNs together with eigenfaces? I don't know much about CNNs. If possible, please send me a reply.

17 Dec 2010 syaiza syaiza

Hi, I need to know how to handle alphabetic characters (A-Z) in your code in addition to the numeric characters (0-9). Any suggestions?

25 Oct 2010 José Méndez

Hi.

I'm testing your implementation for some work on face recognition, but I have to use more than 10 classes. I wonder if you can help me make this modification to the code. I have been studying the code today, but I can't find what to modify to use 40 classes. I hope you can help me, please.

Greetings.

27 Jan 2010 Evgeniy Mzokov

You did a great job! Thanks )

16 Nov 2009 Ryan

This requires the Neural Network Toolbox, and this should be mentioned. For example, mapstd() is a function from that toolbox which is used.
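For readers without the toolbox: mapstd rescales each row of a matrix to a target mean and standard deviation (0 and 1 by default). A rough Python equivalent might look like the sketch below; the function name mapstd_like and its defaults are assumptions for illustration, not toolbox API.

```python
import numpy as np

def mapstd_like(x, mean=0.0, std=1.0):
    # Hypothetical stand-in for the toolbox's mapstd: rescale each ROW
    # of x to the target mean and standard deviation (sample std, N-1).
    x = np.asarray(x, dtype=float)
    row_mean = x.mean(axis=1, keepdims=True)
    row_std = x.std(axis=1, ddof=1, keepdims=True)
    return (x - row_mean) / row_std * std + mean

# Each row of the result has (approximately) zero mean and unit std.
y = mapstd_like(np.array([[1.0, 2.0, 3.0],
                          [10.0, 20.0, 40.0]]))
```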

26 Oct 2009 Amiin  
09 Oct 2009 Nikolay Chumerin

[to Birkan Tunc]
Thank you for your response. Concerning your question...
The problem is in the way the standard 2D convolution is done. Usually, when you convolve a matrix A (of format, let's say, N-by-M) with a kernel k (of format n-by-m, where n<=N and m<=M), you get a result C of format (N-n+1)-by-(M-m+1), as returned by conv2(A, k, 'valid').
So all kernels, except scalars (1-by-1), produce a convolution result of smaller format than the convolved matrix (A). Of course, you can use the 'same' parameter of the conv2 MATLAB function, e.g.,
>> C = conv2(A, k, 'same');
and you will get a result of the same format as the input (size(C) == size(A)).
But in this case you need to adapt the training routines as well. Theoretically it's a piece of cake, but technically it's not a trivial task (one has to handle the borders properly during the backward convolution). Anyway, you can try. It would be nice to see the results of your work.
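The size arithmetic above can be checked quickly with SciPy, whose convolve2d mirrors conv2's 'full'/'same'/'valid' modes (a Python sketch, since the discussion itself is about MATLAB's conv2):

```python
import numpy as np
from scipy.signal import convolve2d  # counterpart of MATLAB's conv2

A = np.ones((6, 8))   # "image" of format N-by-M = 6-by-8
k = np.ones((3, 3))   # kernel of format n-by-m = 3-by-3

# 'full' zero-pads maximally: (N+n-1)-by-(M+m-1)
assert convolve2d(A, k, mode='full').shape == (8, 10)
# 'valid' keeps only fully overlapping positions: (N-n+1)-by-(M-m+1)
assert convolve2d(A, k, mode='valid').shape == (4, 6)
# 'same' matches the input format, as with conv2(A, k, 'same')
assert convolve2d(A, k, mode='same').shape == (6, 8)
```

The 'valid' shape shrinking by (n-1, m-1) per layer is exactly why an image-in, same-size-image-out network needs either 'same' convolutions or explicit border handling in the backward pass.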

08 Oct 2009 Birkan Tunc

I have a few questions: I need a CNN for image processing instead of classification. That means I feed the network an image and want to get another image, of the same size as the input, as the single output. How can I manage that? Is it possible by making simple modifications to this code? Are there any references for this kind of CNN?

I'll appreciate any kind of help.

02 Oct 2009 Mihail Sirotenko

A respectable job has been done in this implementation of a CNN. There is progress in comparison with my version: in particular, a more flexible CNN architecture, the ability to use arbitrary kernel weight dimensions, and other useful things.

Updates
09 Sep 2009

- fixed some bugs related to system dependencies;
- added ChangeLog and content.m;
- more comments in the example.

23 Sep 2009

see ChangeLog inside the archive.

28 Sep 2009

- the first attempt to make the M-layer trainable;
- new private functions unfold.m and unfold2.m added;
- soft_max now supports n-dimensional arrays (not only 2D and 3D);
- improved myCNN object description.

30 Sep 2009

- a graphical demo (demo_myCNN) added (thanks to Mihail Sirotenko, submission 24291);
- an example of a pretrained network (myLeNet5-example.mat) added;
- the myCNN constructor fixed (loading from file).
