Solved – normalizing data for neural network

neural-networks, normalization

I'm working on a neural network with backpropagation for indoor localization. The inputs to the network are Received Signal Strengths (RSSs) and the output is a coordinate (x, y). I have normalized both the inputs and the outputs for training.

I used this equation for normalization:

normalized value = newMin + (old value - oldMin) * (newMax - newMin) / (oldMax - oldMin)

The new range is [0, 1]; the old range depends on the recorded values of the RSSs, x, and y.
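As a sketch, min–max scaling to [0, 1] can be written like this (the function and variable names here are illustrative, and the example RSS range of -90 to -30 dBm is an assumption, not taken from the question):

```python
import numpy as np

def normalize(values, old_min, old_max, new_min=0.0, new_max=1.0):
    """Min-max scale `values` from [old_min, old_max] to [new_min, new_max]."""
    values = np.asarray(values, dtype=float)
    return new_min + (values - old_min) * (new_max - new_min) / (old_max - old_min)

# Example: RSS readings recorded between -90 dBm and -30 dBm
rss = np.array([-90.0, -60.0, -30.0])
scaled = normalize(rss, old_min=-90.0, old_max=-30.0)  # maps to 0.0, 0.5, 1.0
```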

For the localization process I need the error of the neural network to be measured in meters.
How can I de-normalize the result of the neural network (the coordinates)?

I tried using this equation:

old value = normalized value * (oldMax - oldMin) + oldMin

Is this correct?
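For a target range of [0, 1], that inverse is correct, and it round-trips with the forward transform. A quick sanity check (variable names and the 25 m range are illustrative):

```python
import numpy as np

def denormalize(norm_values, old_min, old_max):
    """Invert min-max scaling from [0, 1] back to [old_min, old_max]."""
    norm_values = np.asarray(norm_values, dtype=float)
    return norm_values * (old_max - old_min) + old_min

# Round-trip check for an x coordinate recorded between 0 m and 25 m
x = 13.7
x_norm = (x - 0.0) / (25.0 - 0.0)
x_back = denormalize(x_norm, old_min=0.0, old_max=25.0)  # recovers ~13.7
```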

Best Answer

In theory, you don't need to normalise your inputs, since the activation function squashes them anyway. In practice, however, it's very useful to normalise both input and output tensors to [0,1] or [-1,1] (for regression) for training and testing. After normalisation, you need to back-transform the network's output in order to make predictions on the original scale for unseen data.

If you want to monitor the error in metres during the training phase, then you must use a function to untransform the predictions (as you have already done).
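One way to do this, sketched below with illustrative names and an assumed 20 m x 10 m room, is to de-normalize both the predicted and the true (x, y) coordinates and then take the Euclidean distance, which is the localization error in metres:

```python
import numpy as np

def localization_error_m(pred_norm, true_norm, xy_min, xy_max):
    """Euclidean error in metres between de-normalized predicted and true
    (x, y) coordinates. xy_min/xy_max hold the recorded minima and maxima
    of x and y that were used for the original min-max scaling."""
    xy_min = np.asarray(xy_min, dtype=float)
    xy_max = np.asarray(xy_max, dtype=float)
    pred = np.asarray(pred_norm, dtype=float) * (xy_max - xy_min) + xy_min
    true = np.asarray(true_norm, dtype=float) * (xy_max - xy_min) + xy_min
    return np.linalg.norm(pred - true, axis=-1)

# Example: a 20 m x 10 m room; prediction is off by a quarter of the x range
err = localization_error_m([0.5, 0.5], [0.25, 0.5], xy_min=[0, 0], xy_max=[20, 10])
# err -> 5.0 metres
```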