My dataset is too large to train on all at once, so I'm using a batch learning scheme like this:
net = fitnet(50,'trainbr');
numBatches = 10;
batchSize = round(length(trainingData)/numBatches);
for iBatch = 1 : numBatches*3
    % Draw a fresh 10% sample of the data (without replacement within the batch)
    [trainingBatch,batchIndices] = datasample(trainingData',batchSize,'Replace',false);
    outcomesBatch = trainingOutcomes(batchIndices);
    % Continue training the same network on the new batch
    [net,tr] = train(net,trainingBatch',outcomesBatch);
end
Essentially I'm sampling (without replacement) 10% of the data each time to train the neural net with (of course, since each batch is drawn independently, there's no guarantee I'll actually cover all the data by the end of training). I'm confused because I often see that when the network begins learning on a new batch, the training and test MSEs go up from the 0th epoch to the 1st epoch; i.e. the fit of the network before it started training on that new batch was better than after it began training on it.
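For what it's worth, the expected coverage can be estimated. Assuming each of the 30 batches is an independent 10% draw (without replacement within a batch, but with replacement across batches, as in the loop above), a rough sketch of the miss probability is:

```matlab
% Probability that a given training example is never sampled across
% numBatches*3 = 30 independent batches, each covering 10% of the data.
% This is an approximation: it treats batches as fully independent draws.
pMissOneBatch = 1 - 1/10;          % chance a given point is not in one batch
pNeverSeen    = pMissOneBatch^30;  % chance it is missed by all 30 batches
```

So on the order of a few percent of the data is expected to go unseen, which is small but nonzero.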
I'm trying to just update/adjust the weights on each batch, but is it possible they're being overwritten? If so, how do I just tweak the weights from batch to batch without overwriting them?
Best Answer