MATLAB: Am I computing cross entropy incorrectly?

Tags: cross entropy, Deep Learning Toolbox, error function, neural network

I am working on a neural network and would like to use cross entropy as my error function. I noticed from a previous question that MATLAB added this functionality starting with R2013b. I decided to test the crossentropy function by running the simple example provided in the documentation. The code is reprinted below for convenience:
[x,t] = iris_dataset;        % 4x150 inputs, 3x150 one-hot class targets
net = patternnet(10);        % pattern-recognition net, 10 hidden neurons
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)
When I run this code, I get perf = 0.0367. To verify this result, I ran the code:
ce = -mean(sum(t.*log(y)+(1-t).*log(1-y)))
which resulted in ce = 0.1100. Why are perf and ce unequal? Do I have an error in my calculation?

Best Answer

If the c classes are mutually exclusive, the classifier target probabilities are the certain values 0 or 1, and each target vector must sum to 1.
If the corresponding classifier uses a softmax output transfer function, output estimates are bounded by the open range (0,1) and sum to 1.
If the classes are not mutually exclusive (e.g., tall, dark, handsome), the 0-or-1 classifier target probabilities do not have to sum to 1.
If the corresponding classifier uses a logsig output transfer function, output estimates are bounded by the open range (0,1) but are not constrained to have a unit sum.
A useful performance function is the crossentropy between outputs and targets.
For mutually exclusive targets and a softmax output, the corresponding form for crossentropy is
Xent1 = -sum( t.*log(y))
For non-mutually exclusive targets and a logsig output, the corresponding form for crossentropy is
Xent2 = -sum( t.*log(y) + (1-t).*log(1-y) )
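To make the two forms concrete, here is a small hand-worked sketch; the numbers are made up for illustration and are not part of the original example:
% Toy data (hypothetical): 3 mutually exclusive classes, 2 samples
t = [ 1 0; 0 1; 0 0 ]                 % one-hot targets, columns sum to 1
y = [ 0.7 0.2; 0.2 0.7; 0.1 0.1 ]     % softmax-like outputs, columns sum to 1
Xent1 = mean(-sum(t.*log(y)))         % -log(0.7) = 0.3567 for each sample
Xent2 = mean(-sum(t.*log(y)+(1-t).*log(1-y))) % 0.6852, adds the (1-t) terms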
For your example I get
clear all, clc
[ x, t ] = iris_dataset;
[ O, N ] = size(t)                     % [ 3 150 ]: O classes, N samples
minmax0 = repmat([0 1],O,1)            % expected per-row [ min max ]
checkt1 = max(abs(minmax(t)-minmax0))  % [ 0 0 ]  targets are 0 or 1
checkt2 = max(abs(sum(t)-ones(1,N)))   % 0        target columns sum to 1
net = patternnet(10);
rng(0)                                 % fix the RNG so results are reproducible
[ net, tr, y ] = train(net,x,t);
checky1 = max(abs(minmax(y)-minmax0))
% checky1 = [ 2.4214e-4 1.8807e-3 ]    outputs lie in the open range (0,1)
checky2 = max(abs(sum(y)-ones(1,N)))   % 2.2204e-16  softmax columns sum to 1
perf = crossentropy(net,t,y)           % 0.033005
Xent1 = mean(-sum(t.*log(y)))          % 0.049552
Xent3 = mean(-sum((1-t).*log(1-y)))    % 0.049464
Xent2 = mean(-sum(t.*log(y)+(1-t).*log(1-y))) % 0.099015
Unfortunately, none of the following reveals the formula:
help crossentropy
doc crossentropy
type crossentropy
and the example in the website documentation incorrectly uses Xent2, which is only valid for non-mutually-exclusive classes.
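Since the implementation is not documented, one can at least test numerically which normalization crossentropy uses. This check is my addition, not part of the original answer; it reuses the variables from the listing above:
% If perf is the element-wise mean of the full binary cross entropy,
% it should equal Xent2 divided by O
Xent2all = mean(mean(-(t.*log(y)+(1-t).*log(1-y)))) % 0.033005 = perf
checkNorm = abs(O*perf - Xent2)                     % ~0, i.e. perf = Xent2/O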
If you search on crossentropy in the comp.ai.neural-nets newsgroup, you should find many posts on the topic.
Bottom Line: your Xent2-style formula is correct; it is evidently what crossentropy computes, except averaged over all O*N matrix elements rather than over the N samples, which is why perf = ce/O (your 0.1100/3 is approximately your 0.0367, just as my 0.099015/3 = 0.033005). Apart from that factor, your values differ from mine only because of the random weight initialization; if you use rng(0) before training, they should match.
Hope this helps.
Thank you for formally accepting my answer
Greg