How can I use the positive part subplus(x) as a neuron transfer function (instead of tansig(x) or logsig(x))?

Hello,
I want to use the nonlinearity subplus(x) = [x]+ and train a neural network with backpropagation. I understand that I must use a differentiable function, so I could use y = log(1+exp(x)) instead. I have two questions:
1. How can I create a transfer function such as y = log(1+exp(x)) instead of logsig or tansig?
2. Is there another way to do that?
Thanks a lot, Elia
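P.S. A minimal numeric sketch of the idea (assuming logsig from the Neural Network Toolbox and subplus from the Curve Fitting Toolbox are on the path), showing that y = log(1+exp(x)) is a smooth surrogate for subplus whose derivative is logsig:

x  = linspace(-5, 5, 1001);
sp = log(1 + exp(x));             % softplus: smooth surrogate for subplus
dn = gradient(sp, x);             % numerical derivative of softplus
max(abs(dn - logsig(x)))          % near 0: d/dx log(1+exp(x)) = logsig(x)
max(abs(sp - subplus(x)))         % log(2) at x = 0, shrinking away from it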

Answers (1)

Greg Heath on 29 Mar 2015
At the command line, use the commands lookfor, help, doc, and type on the functions hardlim, perceptron, and adapt.
The perceptron uses the step transfer function hardlim. Because hardlim is not differentiable, backpropagation via train cannot be used; use the learning function adapt instead.
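For example, a minimal sketch of that route on toy AND data (assumes the Neural Network Toolbox; con2seq just converts the matrices to cell sequences so adapt updates the weights after each sample):

X = [0 0 1 1; 0 1 0 1];          % four 2-D input points (columns)
T = [0 0 0 1];                   % AND targets
net = perceptron;                % hardlim transfer function, learnp updates
net = configure(net, X, T);
Xs = con2seq(X);                 % cell sequences -> incremental (adaptive) mode
Ts = con2seq(T);
for pass = 1:10                  % a few adaptive passes over the data
    net = adapt(net, Xs, Ts);
end
net(X)                           % predictions; should match T after convergence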
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 Comments
Greg Heath on 30 Mar 2015
I don't understand why anyone would want to use subplus in a NNET.
Will you please explain your reason?
By the way, subplus is differentiable everywhere except at zero; its derivative is hardlim.
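A quick finite-difference check of that claim (away from the kink at x = 0, where subplus is not differentiable):

x = [-3 -0.5 0.25 2];                           % test points away from the kink
h = 1e-6;
d = (subplus(x + h) - subplus(x - h)) / (2*h);  % central differences
isequal(round(d), hardlim(x))                   % 1: slope of subplus = hardlim(x)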

