Neural Networks – How to Use Batch Norm for Input Standardization

batch normalization, neural networks, standardization

I need to train a model on an un-normalized dataset, and I cannot standardize it directly (subtract the mean and divide by the standard deviation), but I do have the mean and standard deviation for each feature. I am therefore thinking of using the Keras BatchNormalization layer to do the normalization automatically: placing it directly after the input layer, freezing its gamma and beta weights, and replacing its running mean and variance with my own statistics. However, I have no clue how to change the moving_mean_initializer and moving_variance_initializer in the Keras BN layer.
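
For concreteness, a minimal sketch of the statistics I have, with hypothetical names and values for a 3-feature dataset; note that the BN layer tracks the moving variance, not the standard deviation:

import numpy as np

# Hypothetical per-feature statistics (illustrative values only).
feature_means = np.array([5.0, -1.2, 300.0], dtype=np.float32)
feature_stds = np.array([2.0, 0.5, 40.0], dtype=np.float32)

# The BN layer stores the moving *variance*, so the std must be squared.
feature_vars = feature_stds ** 2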

Best Answer

You should be able to set them to what you want using constant initializers.

Example:

import tensorflow as tf

# Use a constant initializer to seed the moving statistics.
init_obj = tf.keras.initializers.Constant(
    value=<value>  # your per-feature mean (or variance) array
)

And pass this object as the value of the corresponding parameter (moving_mean_initializer or moving_variance_initializer).
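
Putting it together, here is a minimal sketch of the whole setup, reusing the hypothetical feature_means and feature_vars arrays from the question. Constant is documented with scalar values, but it also accepts an array whose size matches the weight's shape, which is what is needed here:

import numpy as np
import tensorflow as tf

# Hypothetical per-feature statistics (illustrative values only).
feature_means = np.array([5.0, -1.2, 300.0], dtype=np.float32)
feature_vars = np.array([2.0, 0.5, 40.0], dtype=np.float32) ** 2

inputs = tf.keras.Input(shape=(3,))
bn = tf.keras.layers.BatchNormalization(
    # Drop the learned affine transform entirely (equivalent to
    # freezing gamma at 1 and beta at 0).
    center=False,
    scale=False,
    # Seed the running statistics with the known values.
    moving_mean_initializer=tf.keras.initializers.Constant(feature_means),
    moving_variance_initializer=tf.keras.initializers.Constant(feature_vars),
    # Keep the seeded statistics from being updated during fit().
    trainable=False,
)
# training=False forces the layer to use the stored moving statistics.
x = bn(inputs, training=False)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

One caveat: BatchNormalization divides by sqrt(variance + epsilon), and epsilon defaults to 1e-3, so the scaling is not exactly 1/std; pass a smaller epsilon to the layer if that matters for your features.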