Is there a difference between incremental training using ADAPT and TRAIN in Neural Network Toolbox 6.0 (R2008a)?

I have found that ADAPT (using "trains" as trainFcn) can be used to perform incremental training. Furthermore the documentation (under "Training Styles - Batch Training") states:
".Incremental training can only be done with adapt; train can only perform batch training."
The documentation also mentions that the TRAIN function with the trainFcn "trainc" performs incremental training. From the description, it appears to work identically to ADAPT, with the advantage of additional stopping conditions (performance goal reached, maximum number of epochs, maximum time).
I would like to know whether TRAIN with "trainc" as trainFcn really performs incremental training, even though the documentation states that TRAIN can only perform batch training.

Accepted Answer

MathWorks Support Team on 27 Jun 2009
Adaptation and training of networks serve different purposes.
Training is the problem where there is a fixed set of known input and target vectors, and the solution is a network that properly maps inputs to targets. The solution network does not continue learning once it has been trained.
Adaptation is the problem where there is an unlimited sequence of input and target vectors. The solution is a network with initial conditions and an adaptive algorithm. The network continues learning as it is used, because it is assumed that the nature of the input/target signals will change over time, or will differ in unknown ways across the contexts in which the network is used.
Incremental learning can be used for either training or adaptation. For training it is rarely the most efficient method; batch learning is almost always more efficient, because at each training step the whole dataset is used to determine the weight changes. However, incremental training can be done (often less efficiently) by repeatedly updating the network for one vector at a time, in some order. Examples of incremental training functions are TRAINC (which cycles over the vectors) and TRAINR (which presents the vectors in random order).
Note that if TRAINC or TRAINR is called with data defining multiple time series, it will update the network for one whole time series at a time, cycling through (or randomly picking) another time series at each training step. So in this case "incremental" does not mean that just one input/target vector is applied.
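As a minimal sketch of the above (the newp call, data values, and parameter settings are illustrative assumptions about the R2008a-era toolbox API, not taken from this thread), incremental training with TRAIN and "trainc" might look like:

```matlab
% Sketch: incremental training via TRAIN with trainFcn 'trainc',
% assuming the R2008a-era Neural Network Toolbox API (newp, train, sim).
P = [-1 0 1 2; 2 1 -1 0];     % four input vectors (one per column)
T = [0 1 1 0];                % corresponding targets

net = newp(P, T);             % create a perceptron sized for this data
net.trainFcn = 'trainc';      % cycle through the vectors one at a time
net.trainParam.epochs = 20;   % stopping condition not available to ADAPT

net = train(net, P, T);       % weights updated after each vector, but
                              % training still stops at the usual criteria
Y = sim(net, P);              % evaluate the trained network
```

The point of the sketch is that even though the update is per-vector (incremental), TRAIN still treats the data as a fixed set and stops when a goal, epoch limit, or time limit is reached.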
In contrast, adaptive algorithms are usually incremental, because the network receives data incrementally when actually in use. A true batch adaptive algorithm would have to collect a dataset that grows without bound as the network is used, which is almost always impractical. (There are adaptive algorithms that fall between strictly incremental and batch, but we have not released one yet.) TRAINS updates the network incrementally according to the timesteps of the data. Calling ADAPT updates a network for a time sequence and also returns the network's outputs for that period. Unlike in training, the network's outputs during adaptation are relevant: the output produced as the network learns is expected to be used.
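For comparison, a hedged sketch of adaptation with ADAPT and "trains" (again assuming the R2008a-era API; the newlin arguments, delay line, and learning rate are illustrative choices, and cell arrays denote a time sequence rather than concurrent vectors):

```matlab
% Sketch: adaptation via ADAPT with adaptFcn 'trains', assuming the
% R2008a-era Neural Network Toolbox API (newlin, adapt).
P = {-1 0 1 2 1 0};            % input signal, one timestep per cell
T = { 0 1 2 3 2 1};            % target signal, one timestep per cell

net = newlin(P, T, [0 1], 0.1);  % linear layer, input delays [0 1],
                                 % learning rate 0.1 (assumed values)
net.adaptFcn = 'trains';         % incremental, timestep-by-timestep

[net, Y, E] = adapt(net, P, T);  % Y: outputs produced WHILE learning
                                 % E: errors at each timestep
```

Note the contrast with training: ADAPT returns Y, the outputs generated during learning, because in adaptation those outputs are meant to be used as the network runs.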
  1 Comment
Greg Heath on 2 Sep 2016
What are these systems? How are they related? Is there any reason why they should have exactly the same input/output relationship?
Also, you might want to consider input variable reduction.


More Answers (0)

Release

R2008a
