Can neural net weights be negative?
I have a question about neural net weights. My understanding is that, by default, they add up to one. Can they be negative? (If they can't be negative, it seems to me I would need to make the input data roughly proportional to the target output. If they can be negative, then I can just let the system adapt to the proportionality.) Thanks!
Shashank Prasanna on 21 Jul 2014
Weights can be whatever the training algorithm determines them to be; there is no sign constraint. In the simple case of a perceptron (a one-layer NN), the weights define the slope (normal direction) of the separating hyperplane, and each weight can be positive or negative.
Take a look at some examples in the neural network toolbox:
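To illustrate the point above, here is a minimal sketch (in Python/NumPy rather than the toolbox) of a perceptron trained with the classic update rule on a toy problem whose true separating plane has weights (1, -1). The data, seed, and epoch count are all made up for illustration; the learned second weight comes out negative because nothing constrains the weights' signs.

```python
import numpy as np

# Toy data: label is +1 when x1 > x2, else -1. The true separating
# plane has direction (1, -1), so a correct solution needs a negative weight.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(X[:, 0] > X[:, 1], 1, -1)

w = np.zeros(2)   # weights -- no sign constraint, and they need not sum to 1
b = 0.0           # bias

for _ in range(50):                  # perceptron learning rule
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:   # misclassified -> move plane toward point
            w += yi * xi
            b += yi

print("learned weights:", w)         # first weight positive, second negative
```

The training data here is already "proportional" to nothing in particular; the algorithm simply adapts the signs and magnitudes of the weights to separate the classes.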