How do I train a neural network with a genetic algorithm and backpropagation?

Hello, I want to train my neural network using a hybrid model of backpropagation and a genetic algorithm. Is it possible to use the two on a neural network for extremely high performance and better results in less time? Is any such model available in MATLAB?

 Accepted Answer

I have never found an efficient use of GA for training a fixed topology NN. The only successful adaptive topology NNs I have designed had a single hidden layer with a variable number of elliptical or radial basis functions. However, they were not designed using GA.
I posted a fixed topology tansig GA design recently.
However, the design was more illustrative than useful.
My view is that GAs probably excel when the net topology is more complex than the MATLAB feedforward and feedback defaults; in particular, when the number of layers, nodes, and connections are all variable.
If there is an efficient way to combine GA and backprop I am not familiar with it. (Which doesn't necessarily mean that it doesn't exist).
Good Luck.
PS If you find a good reference, PLEASE let us know.
Thanks in advance,
Greg
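[Editor's note] To make the hybrid idea from the question concrete, here is one common scheme, sketched in Python/NumPy rather than MATLAB: a genetic algorithm searches the weight space for a good starting point, and plain backpropagation then fine-tunes the best individual. This is an illustrative sketch under assumed settings (network size, population size, mutation rate are all arbitrary choices for the example), not a toolbox feature or the answerer's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 tanh network trained on XOR; all weights flattened into one vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
N_W = 2*2 + 2 + 2*1 + 1   # W1, b1, W2, b2 -> 9 parameters

def unpack(w):
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:9]

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((out - y) ** 2)

# --- GA phase: evolve a population of weight vectors ---
pop = rng.normal(0, 1, (40, N_W))
for gen in range(200):
    fit = np.array([mse(p) for p in pop])
    elite = pop[np.argsort(fit)[:10]]        # elitism: keep the 10 best intact
    children = []
    while len(children) < 30:
        a, b = elite[rng.integers(10, size=2)]
        mask = rng.random(N_W) < 0.5         # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0, 0.1, N_W)     # Gaussian mutation
        children.append(child)
    pop = np.vstack([elite, children])

w = pop[np.argmin([mse(p) for p in pop])].copy()

# --- Backprop phase: fine-tune the best individual by gradient descent ---
lr = 0.1
for step in range(2000):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    d_out = 2 * (out - y) / len(X)           # dMSE/d_out
    d_h = (d_out @ W2.T) * (1 - h**2)        # tanh derivative
    grad = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0),
                           (h.T @ d_out).ravel(), d_out.sum(0)])
    w -= lr * grad

print(f"final MSE: {mse(w):.4f}")
```

The GA supplies global exploration (it is indifferent to gradients and local minima), while backprop supplies fast local convergence; whether the combination beats a multi-start backprop search, as discussed below, is an empirical question.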

11 Comments

Dear Sir, thank you for your response. I was wondering whether the momentum term used in neural network training can find a global minimum in the error space. I think it solves the problem of backpropagation getting stuck in local minima? Can you please comment on it?
No.
Momentum prevents the weights from changing too quickly, thereby reducing the probability of flying past a local minimum too fast and zig-zagging back and forth across it without converging.
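[Editor's note] The damping effect described here is easy to demonstrate on an ill-conditioned quadratic, where plain gradient descent oscillates across the narrow axis. A minimal, generic sketch (not MATLAB's implementation; the learning rate and momentum constant are arbitrary choices for the example):

```python
import numpy as np

# f(x) = 0.5 * x' A x with very different curvatures along the two axes,
# the classic setting where gradient descent zig-zags across the valley.
A = np.array([[10.0, 0.0], [0.0, 1.0]])

def grad(x):
    return A @ x

def descend(lr, mc, steps=100):
    """Gradient descent with momentum constant mc; returns final distance to 0."""
    x = np.array([1.0, 1.0])
    v = np.zeros(2)
    for _ in range(steps):
        v = mc * v - lr * grad(x)   # velocity accumulates a running average
        x = x + v
    return float(np.linalg.norm(x))

plain    = descend(lr=0.19, mc=0.0)   # near the stability limit: oscillates
momentum = descend(lr=0.19, mc=0.5)   # same step size, oscillation damped
print(plain, momentum)
```

With the same learning rate, the momentum run ends far closer to the minimum, because the velocity term averages out the sign-flipping component of the gradient; neither variant gains any ability to escape to a different, global minimum.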
Can you please tell me, sir, how this can be done with the MATLAB neural network functions?
I do not have access to the article. However, the concept has merit, especially if the target is multi-dimensional.
Do you know what assumptions are made, dimensions of input and target, etc?
Greg
I have sent you the link; you can check it now.
I missed your last post. Sorry.
I just downloaded the reference and hope to get to it sometime this week.
Greg
Sorry, I will not have the time to spend on this right now.
If you just want to find a non-optimal, but good, single-hidden-layer model, my double loop search over the number of hidden nodes (outer loop) and random number states (inner loop), which yields random trn/val/tst data divisions and random initial weights, has withstood the ravages of time.
I have posted zillions of examples in both the NEWSGROUP and ANSWERS. Try searching on the combination
Hmin:dH:Hmax Ntrials
Hope this helps.
Greg
PS I have posted a genetic approach recently, however it is not as good as the double loop search.
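[Editor's note] The double loop search described above can be sketched generically: an outer loop over candidate hidden-layer sizes Hmin:dH:Hmax, an inner loop over Ntrials random states (each giving a fresh data division and fresh initial weights), keeping the network with the lowest validation error. This Python/NumPy sketch uses a toy 1-D regression and a hand-rolled gradient-descent trainer; a MATLAB version would instead create and train a toolbox network inside the loops, and all sizes here are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression target
x = np.linspace(-2, 2, 60)[:, None]
t = np.sin(2 * x)

def train_net(H, x_trn, t_trn, steps=2000, lr=0.05):
    """Train a 1-H-1 tanh net with plain gradient descent; return its weights."""
    W1 = rng.normal(0, 1, (1, H)); b1 = rng.normal(0, 1, H)
    W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)
    n = len(x_trn)
    for _ in range(steps):
        h = np.tanh(x_trn @ W1 + b1)
        e = (h @ W2 + b2) - t_trn
        d_out = 2 * e / n                       # dMSE/d_out
        d_h = (d_out @ W2.T) * (1 - h**2)
        W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(0)
        W1 -= lr * (x_trn.T @ d_h); b1 -= lr * d_h.sum(0)
    return W1, b1, W2, b2

def val_mse(params, x_val, t_val):
    W1, b1, W2, b2 = params
    out = np.tanh(x_val @ W1 + b1) @ W2 + b2
    return float(np.mean((out - t_val) ** 2))

Hmin, dH, Hmax, Ntrials = 2, 2, 8, 3
best = (np.inf, None, None)
for H in range(Hmin, Hmax + 1, dH):             # outer loop: hidden sizes
    for trial in range(Ntrials):                # inner loop: random states
        idx = rng.permutation(len(x))           # random trn/val division
        trn, val = idx[:45], idx[45:]
        params = train_net(H, x[trn], t[trn])
        err = val_mse(params, x[val], t[val])
        if err < best[0]:
            best = (err, H, params)

print(f"best H = {best[1]}, validation MSE = {best[0]:.4f}")
```

Because every trial redraws both the data division and the initial weights, the search accounts for the two main sources of run-to-run variance in backprop training, which is why a modest Ntrials per candidate H is usually enough.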


More Answers (0)


Asked: 10 Jul 2016

Commented: 28 Aug 2016
