Solved – Neural network – meaning of weights

Tags: neural-networks, weights

I am using a feed-forward NN. I understand the concept, but my question is about the weights. How can you interpret them, i.e. what do they represent, or how can they be understood (besides being just function coefficients)? I have found something called the "space of weights", but I am not quite sure what it means.

Best Answer

Individual weights represent the strength of connections between units. If the weight from unit A to unit B has greater magnitude (all else being equal), it means that A has greater influence over B (i.e. to increase or decrease B's level of activation).
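A minimal numerical sketch of this point (all numbers hypothetical): if we nudge unit A's activation, unit B's output moves more when the A-to-B weight has larger magnitude, all other weights and inputs held fixed.

```python
import numpy as np

# Unit B receives input from three units; index 0 is unit A.
activations = np.array([0.9, 0.2, 0.5])

def b_output(weights, acts):
    # B's activation: weighted sum of incoming activations, passed
    # through a logistic (monotonic) activation function.
    return 1.0 / (1.0 + np.exp(-(weights @ acts)))

weak   = np.array([0.1, 0.4, 0.4])   # small weight from A to B
strong = np.array([2.0, 0.4, 0.4])   # large weight from A, all else equal

# Nudge A's activation and compare how much B's output moves.
bumped = activations + np.array([0.1, 0.0, 0.0])
effect_weak   = b_output(weak, bumped)   - b_output(weak, activations)
effect_strong = b_output(strong, bumped) - b_output(strong, activations)
print(effect_strong > effect_weak)   # True: A influences B more
```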

You can also think of the set of incoming weights to a unit as measuring what that unit 'cares about'. This is easiest to see at the first layer. Say we have an image processing network. Early units receive weighted connections from input pixels. The activation of each unit is a weighted sum of pixel intensity values, passed through an activation function. Because the activation function is monotonic, a given unit's activation will be higher when the input pixels are similar to the incoming weights of that unit (in the sense of having a large dot product). So, you can think of the weights as a set of filter coefficients, defining an image feature. For units in higher layers (in a feedforward network), the inputs aren't from pixels anymore, but from units in lower layers. So, the incoming weights are more like 'preferred input patterns'.
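The dot-product intuition can be demonstrated directly. This toy example (the filter and patches are made up for illustration) treats a first-layer unit's nine incoming weights as a flattened 3x3 filter, and compares two input patches of equal overall intensity: one that resembles the filter, one that doesn't.

```python
import numpy as np
rng = np.random.default_rng(0)

# Hypothetical first-layer unit: its 9 incoming weights act as a
# flattened 3x3 image filter (normalized to unit length).
weights = rng.normal(size=9)
weights /= np.linalg.norm(weights)

# Two input patches with the same norm (same "total intensity"):
aligned = 2.0 * weights                   # patch that looks like the filter
other = rng.normal(size=9)
other *= 2.0 / np.linalg.norm(other)      # unrelated pattern, same norm

def activation(patch):
    # Monotonic activation applied to the weighted sum (dot product).
    return np.tanh(weights @ patch)

# By Cauchy-Schwarz, the patch proportional to the weights maximizes
# the dot product, so it produces the highest activation.
print(activation(aligned) > activation(other))   # True
```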

Not sure about your original source, but if I were talking about 'weight space', I'd be referring to the set of all possible values of all weights in the network.
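Under that reading, every assignment of values to the network's weights is one point in a high-dimensional space. A small sketch (the layer sizes are arbitrary): collecting all weights of a tiny feed-forward net into a single vector gives its coordinates in weight space.

```python
import numpy as np

# A tiny feed-forward net: 4 inputs -> 3 hidden -> 1 output.
# Weight matrices only (biases omitted for simplicity).
shapes = [(4, 3), (3, 1)]
rng = np.random.default_rng(1)
layers = [rng.normal(size=s) for s in shapes]

# Concatenating every weight yields one point in weight space,
# here R^15 (4*3 + 3*1 = 15 dimensions).
point = np.concatenate([w.ravel() for w in layers])
print(point.shape)   # (15,)
```

Training can then be pictured as moving this point through weight space to reduce the loss.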