Let's say I have a neural network in matrix form. Inputs, hidden-layer activations and outputs are represented by column vectors, while the weights are matrices of size [outputRows; inputRows].
Now, let's say I'd like to handle multiple inputs and outputs without having to iterate through the columns of individual vectors. For the forward pass this works out without any trouble: stacking several inputs as columns simply yields additional columns in the matrix product. But for training with multiple input-output pairs, I have so far relied on iterating through the columns. I couldn't find any really meaningful resources on backpropagation with multidimensional (batched) inputs/outputs. Is it possible and/or advisable?
Solved – multidimensional inputs, outputs and backpropagation
backpropagation, neural-networks
Best Answer
Yes, it's possible, not too difficult, and therefore also advisable. I'm giving a code example in Java for forward and backpropagation using `org.ejml.simple.SimpleMatrix`.
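As a sketch of the batched scheme the answer describes, here is a self-contained Java example: samples are stacked as columns, so one matrix product handles the whole batch in both the forward and the backward pass. Plain 2-D arrays stand in for `SimpleMatrix` so the snippet has no dependencies, and the XOR task, layer sizes, seed and learning rate are illustrative assumptions, not the author's original code.

```java
import java.util.Random;

// Batched forward and backward pass for a 2-layer sigmoid network.
// Samples are stacked as COLUMNS, so each layer's computation is a
// single matrix product over the whole batch, with no per-column loop.
// (Sketch only: plain 2-D arrays replace org.ejml.simple.SimpleMatrix
// to keep the example dependency-free; the task and sizes are made up.)
public class BatchBackprop {

    // (m x k) * (k x n) -> (m x n)
    static double[][] mult(double[][] a, double[][] b) {
        int m = a.length, k = b.length, n = b[0].length;
        double[][] c = new double[m][n];
        for (int i = 0; i < m; i++)
            for (int p = 0; p < k; p++)
                for (int j = 0; j < n; j++)
                    c[i][j] += a[i][p] * b[p][j];
        return c;
    }

    static double[][] transpose(double[][] a) {
        double[][] t = new double[a[0].length][a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                t[j][i] = a[i][j];
        return t;
    }

    // Element-wise logistic activation.
    static double[][] sigmoid(double[][] a) {
        double[][] s = new double[a.length][a[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                s[i][j] = 1.0 / (1.0 + Math.exp(-a[i][j]));
        return s;
    }

    // Trains on XOR (4 samples batched as columns) and returns
    // {initial mean squared error, final mean squared error}.
    static double[] train(int epochs, double lr) {
        double[][] X = {{0, 0, 1, 1}, {0, 1, 0, 1}}; // 2 x 4: inputs as columns
        double[][] T = {{0, 1, 1, 0}};               // 1 x 4: targets
        int hidden = 4, batch = 4;
        Random rnd = new Random(42);
        double[][] W1 = new double[hidden][2], W2 = new double[1][hidden];
        for (double[] row : W1) for (int j = 0; j < row.length; j++) row[j] = rnd.nextGaussian();
        for (double[] row : W2) for (int j = 0; j < row.length; j++) row[j] = rnd.nextGaussian();

        double first = 0, last = 0;
        for (int e = 0; e < epochs; e++) {
            double[][] H = sigmoid(mult(W1, X)); // hidden x batch
            double[][] Y = sigmoid(mult(W2, H)); // 1 x batch

            // Output delta for ALL columns at once: (Y - T) .* Y .* (1 - Y).
            double[][] d2 = new double[1][batch];
            double mse = 0;
            for (int j = 0; j < batch; j++) {
                double err = Y[0][j] - T[0][j];
                mse += err * err / batch;
                d2[0][j] = err * Y[0][j] * (1 - Y[0][j]);
            }
            if (e == 0) first = mse;
            last = mse;

            // Hidden delta: (W2^T d2) .* H .* (1 - H), again batched.
            double[][] d1 = mult(transpose(W2), d2);
            for (int i = 0; i < hidden; i++)
                for (int j = 0; j < batch; j++)
                    d1[i][j] *= H[i][j] * (1 - H[i][j]);

            // Gradients: delta times the layer input transposed; the batch
            // dimension is summed out by the matrix product itself.
            double[][] g2 = mult(d2, transpose(H));
            double[][] g1 = mult(d1, transpose(X));
            for (int j = 0; j < hidden; j++) W2[0][j] -= lr * g2[0][j] / batch;
            for (int i = 0; i < hidden; i++)
                for (int j = 0; j < 2; j++) W1[i][j] -= lr * g1[i][j] / batch;
        }
        return new double[]{first, last};
    }

    public static void main(String[] args) {
        double[] loss = train(3000, 2.0);
        System.out.println("MSE: " + loss[0] + " -> " + loss[1]);
    }
}
```

The key point is that `mult(d2, transpose(H))` and `mult(d1, transpose(X))` already sum the gradient contributions over the batch columns, so the per-column training loop disappears entirely; with EJML the same shapes map directly onto `SimpleMatrix.mult`, `transpose` and `elementMult`.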