
Neural network performance

`perf = crossentropy(net,targets,outputs,perfWeights)`

`perf = crossentropy(___,Name,Value)`

`perf = crossentropy(net,targets,outputs,perfWeights)` calculates a network performance given targets and outputs, with optional performance weights and other parameters. The function returns a result that heavily penalizes outputs that are extremely inaccurate (`y` near `1-t`), with very little penalty for fairly correct classifications (`y` near `t`). Minimizing cross-entropy leads to good classifiers.

The cross-entropy for each pair of output-target elements is calculated as:

```
ce = -t .* log(y)
```

The aggregate cross-entropy performance is the mean of the individual values: `perf = sum(ce(:))/numel(ce)`.
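The two formulas above can be sketched outside MATLAB as well. The following is a minimal NumPy translation (not the toolbox implementation itself; the function name `crossentropy_perf` is an assumption for illustration), showing that confidently wrong outputs are penalized far more than near-correct ones:

```python
import numpy as np

def crossentropy_perf(targets, outputs):
    """Sketch of the documented formulas:
    ce = -t .* log(y), then perf = sum(ce(:)) / numel(ce)."""
    t = np.asarray(targets, dtype=float)
    y = np.asarray(outputs, dtype=float)
    ce = -t * np.log(y)           # elementwise cross-entropy
    return ce.sum() / ce.size     # mean over all elements

# 1-of-N targets: near-correct outputs vs. badly wrong outputs
t    = np.array([[1.0, 0.0], [0.0, 1.0]])
good = np.array([[0.9, 0.1], [0.1, 0.9]])
bad  = np.array([[0.1, 0.9], [0.9, 0.1]])
assert crossentropy_perf(t, good) < crossentropy_perf(t, bad)
```

Note that where `t` is 0 the term contributes nothing, so only the output at the true class position is penalized in the 1-of-N case.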

Special case (N = 1): If an output consists of only one element, then the outputs and targets are interpreted as binary encoding. That is, there are two classes with targets of 0 and 1, whereas in 1-of-N encoding, there are two or more classes. The binary cross-entropy expression is:

```
ce = -t .* log(y) - (1-t) .* log(1-y)
```
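The binary (N = 1) expression can be sketched the same way. This NumPy version (an illustrative translation, not toolbox code; the helper name is hypothetical) penalizes both the true-class term and the complementary term, so a confident wrong prediction costs much more than a near-correct one:

```python
import numpy as np

def binary_crossentropy_perf(targets, outputs):
    """Sketch of the binary formula:
    ce = -t .* log(y) - (1-t) .* log(1-y), averaged over elements."""
    t = np.asarray(targets, dtype=float)
    y = np.asarray(outputs, dtype=float)
    ce = -t * np.log(y) - (1.0 - t) * np.log(1.0 - y)
    return ce.mean()

# Near-correct vs. confidently wrong predictions for targets [1, 0]
t = np.array([1.0, 0.0])
assert binary_crossentropy_perf(t, np.array([0.9, 0.1])) < \
       binary_crossentropy_perf(t, np.array([0.1, 0.9]))
```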

`perf = crossentropy(___,Name,Value)` supports customization according to the specified name-value pair arguments.
