MATLAB: How to calculate the NN outputs manually

Tags: neural network, neural network manual calculations, neural networks, nn toolbox

Can anyone help me by explaining how to manually calculate test outputs using trained weights and biases? It does not seem to give the correct answers when I directly substitute my inputs into the transfer function equations; the answers are different from what I get from the MATLAB NN toolbox. Also, how is it possible to get a large number as an output (e.g. 100) when the output node has a transfer function? For example, the output of the "logistic" transfer function is always between 0 and 1.

Best Answer

If you use a squashing function on the output, then yes, it is impossible to get a result of 100 at the output. If you need outputs outside [0,1] or [-1,1], which are the typical ranges of many squashing functions, I suggest using a linear transfer function on the output (or a rectified linear unit).
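For instance, here is a quick way to see this in base MATLAB (a minimal sketch; the anonymous functions below simply stand in for the toolbox's logsig and purelin):
logistic = @(z) 1./(1 + exp(-z));   % logistic squashing function
logistic([-10 0 10])                % ~[0.000045 0.5 0.99995] -- never leaves (0,1)
linear = @(z) z;                    % linear (purelin-style) output
linear(100)                         % 100 is reachable with a linear output layer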
As for your main question, here is an example of how to calculate outputs manually if you have trained weights and biases. Suppose you had an input x that is 100-by-1 and 1000 hidden layer neurons (so a weight matrix w1 that is 100-by-1000 and bias b1 that is 1000-by-1).
Then, the input to the hidden layer is
z1 = w1'*x+b1;
and the output of the hidden layer is
h1 = f(z1); %where f is the hidden activation function (e.g. logistic, tanh, ReLU)
Next, if you have a single neuron in the output layer, you would have a second weight matrix w2 that is 1000-by-1 and a scalar bias b2. The output of the whole network is then given by
z2 = w2'*h1+b2;
h2 = g(z2); %where g is the output activation function, not necessarily the same as f()
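Putting the steps together, here is a minimal self-contained sketch of the full forward pass. The random weights, the dimensions, and the logistic/linear activation choices are illustrative assumptions only; with a trained network you would substitute your own weights, biases, and transfer functions.
% Manual forward pass for a 100-input, 1000-hidden-neuron, 1-output network
x  = rand(100,1);      % input vector, 100-by-1 (placeholder values)
w1 = rand(100,1000);   % hidden-layer weights, 100-by-1000 (placeholder values)
b1 = rand(1000,1);     % hidden-layer biases, 1000-by-1
w2 = rand(1000,1);     % output-layer weights, 1000-by-1
b2 = rand;             % output-layer bias, scalar

f = @(z) 1./(1 + exp(-z));   % hidden activation (logistic, as an example)
g = @(z) z;                  % output activation (linear, so large outputs are possible)

z1 = w1'*x + b1;    % hidden-layer input, 1000-by-1
h1 = f(z1);         % hidden-layer output, 1000-by-1
z2 = w2'*h1 + b2;   % output-layer input, scalar
h2 = g(z2)          % network output, scalar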
Hope this helps!