Solved – Update weight vector regardless of correctness for perceptron algorithm

perceptron

For the perceptron algorithm, what will happen if I update the weight vector for both correct and wrong predictions, instead of only for wrong predictions? What will the plot of the number of wrong predictions look like with respect to the number of passes? The perceptron algorithm I am referring to is the one proposed by Rosenblatt (1958), shown below:

[Image: pseudocode of Rosenblatt's perceptron learning algorithm]

My question is: what will happen if we remove the if condition and execute the update for all instances in each pass?

Best Answer

In the learning algorithm of the perceptron, the weights are not updated after a correct response.

The learning rule says that the weight vector $\mathbf{w}=(w_1,\dots,w_n)$ is updated according to $w_i(t+1)=w_i(t)+(d_j-y_j(t))x_{j,i}$ (see Wikipedia). So if the output $y_j(t)$ (obtained by applying the $j$th input vector $x_j$) equals the desired output $d_j$, that is $d_j=y_j(t)$, the rule becomes $w_i(t+1)=w_i(t)+0$, hence there is no change to the weight.
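In other words, removing the if condition changes nothing, because the update term $(d_j-y_j(t))x_{j,i}$ is already zero whenever the prediction is correct. A minimal sketch of this (the data, function names, and the AND-gate toy problem are my own illustration, not from the question) applies the update unconditionally to every instance:

```python
import numpy as np

def predict(w, x):
    # Heaviside step activation: 1 if w.x > 0, else 0
    return 1 if np.dot(w, x) > 0 else 0

def train(X, D, passes=10):
    """Rosenblatt-style training with the update applied to EVERY
    instance (no 'if wrong' guard). Since (d - y) = 0 on correct
    predictions, this behaves exactly like the guarded version."""
    w = np.zeros(X.shape[1])
    errors_per_pass = []
    for _ in range(passes):
        errors = 0
        for x, d in zip(X, D):
            y = predict(w, x)
            w = w + (d - y) * x   # unconditional update
            errors += int(y != d)
        errors_per_pass.append(errors)
    return w, errors_per_pass

# Toy linearly separable data (hypothetical): AND gate,
# with a constant bias input of 1 as the first feature.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
D = np.array([0, 0, 0, 1])

w, errs = train(X, D)
print(errs)  # number of wrong predictions per pass, dropping to 0
```

Plotting `errs` against the pass number would give the same curve as the standard algorithm: on linearly separable data it fluctuates for a few passes and then stays at zero once a separating weight vector is found.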