Solved – nnet package – is it necessary to scale data

neural networks

Could you tell me whether it is necessary to scale the data?
I am using this package for prediction.
Unscaled data gives me the same prediction performance as scaled data.

Best Answer

Normalizing the data is not required for neural network models, but it is strongly advised: without it, training can take longer (the weights have to compensate for differences in scale across features), and you might also get different or worse results (see e.g. this question for more details).
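To see what "the weights compensate for scale differences" means in practice, here is a minimal sketch (my own illustration, not from the answer below) that fits the same small net on raw and on manually standardized iris predictors; `scale()` does essentially what caret's `preProcess = c('center', 'scale')` does:

```r
library(nnet)

x       <- iris[, 1:4]
xScaled <- scale(x)   # subtract column means, divide by column SDs

# Same seed and architecture for both fits, so only the input scaling differs
set.seed(1)
fitRaw    <- nnet(x,       class.ind(iris$Species), size = 2, softmax = TRUE, trace = FALSE)
set.seed(1)
fitScaled <- nnet(xScaled, class.ind(iris$Species), size = 2, softmax = TRUE, trace = FALSE)

# The fitted weights differ: the net trained on raw data has to absorb the
# differing feature ranges into its input-to-hidden weights
head(coef(fitRaw))
head(coef(fitScaled))
```

The iris features happen to have fairly similar ranges already, which is one reason you can see comparable accuracy with and without scaling here.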

Obtaining the same predictive performance with and without normalization is possible, but the underlying models can still differ. For example, here is a simple example using nnet through caret::train with and without normalization; it yields different nets and different performance across the parameter grid, while still achieving similar optimal performance on the CV partitions:

    library(caret)
    m  <- train(x = iris[,1:4], y = iris[,5],
                method = 'nnet', metric = 'Kappa', 
                tuneGrid = expand.grid(size=1:8, decay=3**(-6:1)),
                trControl = trainControl(method = 'repeatedcv', 
                                        number = 10, 
                                        repeats = 10, 
                                        returnResamp = 'final'))

    mScaled <- train(x = iris[,1:4], y = iris[,5],
                        method = 'nnet', metric = 'Kappa', 
                        preProcess = c('center', 'scale'),
                        tuneGrid = expand.grid(size=1:8, decay=3**(-6:1)),
                        trControl = trainControl(method = 'repeatedcv', 
                                                number = 10, 
                                                repeats = 10, 
                                                returnResamp = 'final'))

    m$finalModel        # compare the fitted weights of the two nets
    mScaled$finalModel
    plot(m)             # tuning results over the size/decay grid
    plot(mScaled)
    boxplot(data.frame(m$resample$Kappa, mScaled$resample$Kappa))  # CV Kappa distributions