I am trying to simulate the outputs of a neural network myself, for later translation to Java so I can run it on a mobile device. For this I wrote the following simulation code for a network with two hidden layers and a tangent-sigmoid (tansig) transfer function at all layers:
function [ Results ] = sim_net( net, input )
y1 = tansig(net.IW{1} * input + net.b{1});      % hidden layer 1
y2 = tansig(net.LW{2,1} * y1 + net.b{2});       % hidden layer 2
Results = tansig(net.LW{3,2} * y2 + net.b{3});  % output layer
end
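Since the end goal is a Java port, the same forward pass can be sketched in plain Java. This is a minimal sketch under the assumption that the weight matrices and bias vectors have already been exported from the trained net (the names iw1, lw21, lw32, b1, b2, b3 are my own, not MATLAB's); tansig(x) is mathematically identical to tanh(x):

```java
// Manual forward pass for a 2-hidden-layer tansig network.
// Assumes weights/biases were exported from the trained MATLAB net:
//   iw1  ~ net.IW{1},   b1 ~ net.b{1}
//   lw21 ~ net.LW{2,1}, b2 ~ net.b{2}
//   lw32 ~ net.LW{3,2}, b3 ~ net.b{3}
public class SimNet {

    // tansig is mathematically tanh
    static double[] tansig(double[] v) {
        double[] out = new double[v.length];
        for (int i = 0; i < v.length; i++) out[i] = Math.tanh(v[i]);
        return out;
    }

    // computes w * x + b
    static double[] affine(double[][] w, double[] x, double[] b) {
        double[] out = new double[w.length];
        for (int i = 0; i < w.length; i++) {
            double s = b[i];
            for (int j = 0; j < x.length; j++) s += w[i][j] * x[j];
            out[i] = s;
        }
        return out;
    }

    static double[] simNet(double[][] iw1, double[] b1,
                           double[][] lw21, double[] b2,
                           double[][] lw32, double[] b3,
                           double[] input) {
        double[] y1 = tansig(affine(iw1, input, b1));
        double[] y2 = tansig(affine(lw21, y1, b2));
        return tansig(affine(lw32, y2, b3));
    }
}
```

Note that this reproduces only the raw layer computations; any input/output pre- or post-processing the MATLAB net performs would have to be replicated separately.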
The sim_net function is then compared against MATLAB's own sim function using the following code:
clc
clear all
net = feedforwardnet([20 20]);
net.divideParam.trainRatio = 75/100; % Adjust as desired
net.divideParam.valRatio = 15/100; % Adjust as desired
net.divideParam.testRatio = 10/100; % Adjust as desired
net.inputs{1}.processFcns = {}; % no preprocessing
net.outputs{2}.processFcns = {};
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
net.layers{3}.transferFcn = 'tansig';
% Train and Apply Network
[x,t] = simplefit_dataset;
[net,tr] = train(net,x,t);
for i=1:length(x)
    disp(i) = sim_net(net,x(i));
    disp2(i) = sim(net,x(i));
end
plot(disp)
hold on
plot(disp2)
legend('our code','matlabs code')
The plot of the two outputs:
However, a quick inspection using the following edited plot reveals that MATLAB's results are offset by 5 and also scaled by a factor of 5:
plot(disp)
hold on
plot((disp2-5)/5+0.1)
legend('our code','matlabs code')
However, MATLAB's network shouldn't even be able to produce values above 1 when tansig is the last transfer function, should it?
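Since tansig(x) is mathematically tanh(x), a tiny sanity check in Java (the eventual translation target) illustrates the point: the transfer function itself is bounded in (-1, 1), so outputs outside that range must come from something applied after the output layer, such as a default output post-processing mapping, rather than from the layer computation itself.

```java
// Sanity check: tanh is mathematically bounded in (-1, 1), so a bare
// tansig output layer cannot produce values above 1. Observed outputs
// outside that range imply some rescaling step outside the transfer
// function itself.
public class TansigRange {
    public static void main(String[] args) {
        for (double x : new double[]{-5, -1, 0, 1, 5}) {
            System.out.println("tanh(" + x + ") = " + Math.tanh(x));
        }
    }
}
```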
Best Answer