MATLAB: Neural Network Regeneration Based on Resulting Weights and Biases

neural network

I have been playing with the NN toolbox (patternnet). It may be a bit absurd, but I am trying to rebuild a feedforward NN by hand from the trained net's weights. The model is quite simple: 90 inputs, a single hidden layer with 10 nodes, and an output layer with 8 nodes, so 8 classes in total.
What I did is extract all the weight vectors (including the biases) and, for each node, multiply by the inputs and sum. I used tansig as the first layer's activation and softmax as the second layer's. However, the result is quite different from the trained net's.
In the comparison (plot not shown here), the error rate of my manual version is noticeably higher. If anyone can spot what I am doing wrong, please let me know.
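In matrix form, what I am trying to reproduce for one input column x is (just a compact restatement of the loops below):

a1 = tansig( net.IW{1,1}*x  + net.b{1} );    % 10x1 hidden activations
y  = softmax( net.LW{2,1}*a1 + net.b{2} );   % 8x1 class probabilities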
Here's my code:
layer1     = net.IW{1,1};    % 10x90 input-to-hidden weights
layer2     = net.LW{2,1};    % 8x10 hidden-to-output weights
layer1Bias = net.b{1,1};     % 10x1
layer2Bias = net.b{2,1};     % 8x1
% prepend each bias vector as an extra first column of its weight matrix
layer11 = [layer1Bias layer1];    % 10x91
layer22 = [layer2Bias layer2];    % 8x11

row    = size(testSet,1);    % number of test samples
L1_net = zeros(1,10);
L2_net = zeros(1,8);
L2_out = zeros(row,8);
% now try my own feedforward
for j = 1:row
    % put a leading 1 on the input so the bias column is picked up
    eachInput = [1 testSet(j,:)];
    % weighted sum for each node in layer 1
    for i = 1:10
        L1_net(i) = eachInput*layer11(i,:)';
    end
    L1_out = tansig(L1_net);
    % put the bias term at the front again, as before
    L2_in = [1 L1_out];
    % weighted sum for each node in layer 2
    for i = 1:8
        L2_net(i) = L2_in*layer22(i,:)';
    end
    % softmax, as net.layers{2}.transferFcn says
    intSum      = sum(exp(L2_net));
    L2_out(j,:) = exp(L2_net)/intSum;
    % L2_out(j,:) = softmax(L2_net); % didn't work: the toolbox softmax is
    % column-wise (S-by-Q), so a 1x8 row is read as 8 one-element columns
    % and comes back as all ones; softmax(L2_net')' would match.
end
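
For reference, the output I am comparing against is just the toolbox's own forward pass:

yRef = net(testSet');    % 8 x row, one column of class probabilities per sample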

Best Answer

You didn't take into account the default normalization of inputs and targets followed by the denormalization of the output.
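Something like this sketch (untested; it assumes you kept the default processing functions that training stored on the net, which include mapminmax) shows where those steps belong:

x  = testSet';    % 90 x Q, one column per sample
% apply every stored input-processing step, in order
xn = x;
for k = 1:numel(net.inputs{1}.processFcns)
    xn = feval(net.inputs{1}.processFcns{k}, 'apply', xn, ...
               net.inputs{1}.processSettings{k});
end
% your two-layer forward pass, vectorized (bsxfun adds the bias per column)
a1 = tansig(bsxfun(@plus, net.IW{1,1}*xn, net.b{1}));
yn = softmax(bsxfun(@plus, net.LW{2,1}*a1, net.b{2}));
% reverse every stored output-processing step, in reverse order
y = yn;
for k = numel(net.outputs{2}.processFcns):-1:1
    y = feval(net.outputs{2}.processFcns{k}, 'reverse', y, ...
              net.outputs{2}.processSettings{k});
end
% sanity check against the toolbox's own forward pass
yRef = net(x);
max(max(abs(y - yRef)))    % should be on the order of eps

If that difference is essentially zero, your per-sample loop only needs the same 'apply' before layer 1 and 'reverse' after layer 2.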
Hope this helps.
Thank you for formally accepting my answer.
Greg