Solved – How to change a weight/bias with gradient

Tags: backpropagation, gradient descent, intercept, neural networks, weights

After watching 3Blue1Brown's tutorial series, and an array of others, I'm attempting to make my own neural network from scratch.

So far, I'm able to calculate the gradient for each of the weights and biases.

Now that I have the gradient, how am I supposed to correct my weight/bias?

Should I:

  1. Add the gradient and the original value?
  2. Multiply the gradient and the original value?
  3. Something else? (Most likely answer)

In addition to this, I've been hearing the term "learning rate" tossed around, and that it defines the magnitude of the 'step' taken when descending toward the minimum cost. I figured this may also play a major role in reducing the cost.

Best Answer

After you've found the gradients, which we'll call $dW,db$ as shorthand for the gradients of the cost function with respect to the weights and biases respectively, you can do a variety of things to your weights and biases $W,b$. It depends on the specific optimization algorithm you are using. The simplest optimization algorithm is vanilla gradient descent. In this case, you apply the updates: $$W\rightarrow W-\alpha\, dW$$ $$b\rightarrow b-\alpha\, db$$ where $\alpha$ is your learning rate. So the answer to your question is option 3: you scale the gradient by the learning rate and *subtract* it from the original value, because the gradient points in the direction of steepest *increase* of the cost.

The learning rate governs how quickly and how well your model learns. A learning rate that is too small may force your model to take many steps to converge, while a learning rate that is too big may make your model overshoot the optimum and never converge at all. Finding a good learning rate is an iterative process.
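To make the update rule concrete, here is a minimal sketch of one vanilla gradient descent step in NumPy, applied repeatedly to a toy quadratic cost $(W-3)^2$ whose minimum is at $W=3$ (the function name, the toy cost, and the value of $\alpha$ are illustrative choices, not anything from the answer):

```python
import numpy as np

def gradient_descent_step(W, b, dW, db, alpha=0.1):
    """One vanilla gradient descent update: subtract the gradient
    scaled by the learning rate alpha."""
    W = W - alpha * dW
    b = b - alpha * db
    return W, b

# Toy example: minimise cost = (W - 3)^2, which has its minimum at W = 3.
W = np.array([0.0])
b = np.array([0.0])
for _ in range(100):
    dW = 2 * (W - 3)          # analytic gradient of (W - 3)^2
    db = np.zeros_like(b)     # no bias in this toy cost
    W, b = gradient_descent_step(W, b, dW, db, alpha=0.1)
# W is now very close to 3
```

Note that if you *added* the gradient instead (option 1), each step would move $W$ away from 3 and the cost would grow without bound.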

There are other optimization algorithms out there, including gradient descent with momentum, with Nesterov's momentum, RMSProp, Adam, Adamax, Adagrad, Adadelta, etc. Which optimization algorithm to use is also a question for the cross-validation step of learning. Generally speaking though, I find Adam to work quite well in a variety of situations. You can look here for a more in-depth description of the optimization algorithms.
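As one illustration of how these variants extend the basic rule, here is a sketch of the momentum update on the same toy quadratic as before (the function name and the hyperparameter values are assumptions for the example): instead of stepping along the raw gradient, momentum steps along an exponentially decaying average of past gradients.

```python
import numpy as np

def momentum_step(W, dW, v, alpha=0.1, beta=0.9):
    """One gradient descent step with momentum.

    v accumulates a decaying average of past gradients, which damps
    oscillations and speeds progress along consistent directions.
    """
    v = beta * v + (1 - beta) * dW
    W = W - alpha * v
    return W, v

# Same toy cost as before: (W - 3)^2, minimum at W = 3.
W = np.array([0.0])
v = np.zeros_like(W)      # velocity starts at zero
for _ in range(300):
    dW = 2 * (W - 3)
    W, v = momentum_step(W, dW, v, alpha=0.1, beta=0.9)
# W converges toward 3
```

Adam and its relatives build further on this idea by also tracking a running average of squared gradients to scale each parameter's step size individually.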