MATLAB: Using backpropagation on a pre-trained neural network

autoencoder, backpropagation, Deep Learning Toolbox, MATLAB, neural network

I am developing a project about autoencoders (based on the work of G. Hinton), and I have a neural network that is pre-trained with some MATLAB scripts I have already developed. Now I need to perform a fine-tuning stage through backpropagation, and I am trying to use the 'Neural Network Toolbox'. All my data are already pre-processed (zero mean, unit variance, and so on), so I don't need any further pre-processing or post-processing. How do I disable them? In addition, I want the backpropagation stage to start from the weights and biases found during pre-training (which are stored in some matrices).
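My guess is that the toolbox's default processing functions ('removeconstantrows' and 'mapminmax') are what has to be removed, perhaps by clearing them right after creating the network and before calling configure, but I am not sure this is correct:
% my guess (untested): clear the default processing functions so the
% already-normalized data go straight into the network; doing this
% before configure should keep the weight dimensions consistent
net.inputs{1}.processFcns = {};
net.outputs{end}.processFcns = {};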
The full code I have written so far is:
% set the network hidden layers' size and the training function
net = feedforwardnet([layer1, layer2, layer3, layer4, layer3, layer2, layer1], 'traincgp');
% the training data are stored as rows in Dtrain matrix
net = configure(net, Dtrain', Dtrain');
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% initialize weights and biases to the value found with pre-training
% Note: the weights are symmetric but the biases are not
net.IW{1} = vishid';
net.LW{8,7} = vishid;
net.LW{2,1} = hidpen';
net.LW{7,6} = hidpen;
net.LW{3,2} = hidpen2';
net.LW{6,5} = hidpen2;
net.LW{4,3} = hidtop';
net.LW{5,4} = hidtop;
net.b{1} = hidrecbiases';
net.b{2} = penrecbiases';
net.b{3} = penrecbiases2';
net.b{4} = toprecbiases';
net.b{5} = topgenbiases';
net.b{6} = hidgenbiases2';
net.b{7} = hidgenbiases';
net.b{8} = visbiases';
net.trainParam.epochs = 200;
% set the transfer functions: logsig for every layer
% except for the last one which is linear
for ii = 1:net.numLayers-1
    net.layers{ii}.transferFcn = 'logsig';
end
net.layers{net.numLayers}.transferFcn = 'purelin';
% train the net
net = train(net, Dtrain', Dtrain');
% save the new weights
weight_i1_b = (net.IW{1})';
weight_12_b = (net.LW{2,1})';
weight_23_b = (net.LW{3,2})';
weight_34_b = (net.LW{4,3})';
weight_45_b = (net.LW{5,4})';
weight_56_b = (net.LW{6,5})';
weight_67_b = (net.LW{7,6})';
weight_78_b = (net.LW{8,7})';
hidrecbiases_b = (net.b{1})';
penrecbiases_b = (net.b{2})';
penrecbiases2_b = (net.b{3})';
toprecbiases_b = (net.b{4})';
topgenbiases_b = (net.b{5})';
hidgenbiases2_b = (net.b{6})';
hidgenbiases_b = (net.b{7})';
visbiases_b = (net.b{8})';
% save everything to file
% ...
The problem is that it is not working the way I expect: the error goes down during training, but the result (the reconstruction of an image) is worse than what I obtain with pre-training alone.
Can anyone tell me what I am missing here? Thank you!

Best Answer

After many attempts I found the solution: for some reason (which was not clear to me at first) the reconstruction script was not working correctly after the fine-tuning, even though it worked well when reconstructing the data after pre-training alone.
So the solution is actually quite simple: given an input 'x', the output produced by the network 'net' is just
y = net(x);
and this reconstruction is much better than the one obtained by just pre-training.
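My best guess about why the manual script failed: by default the toolbox attaches processing functions ('removeconstantrows' and 'mapminmax') to the network's input and output, so a hand-written reconstruction that only multiplies the stored weight matrices skips a step that net(x) performs automatically. A quick sanity check (just a sketch; here x is any pre-processed input column) is to compare the two outputs:
% compare the toolbox output with a manual forward pass through the
% stored weights; a large difference points to the processing functions
y_net = net(x);            % output with all stored processing applied
a = x;                     % manual pass: raw weights and biases only
for ii = 1:net.numLayers
    if ii == 1
        z = net.IW{1} * a + net.b{1};
    else
        z = net.LW{ii, ii-1} * a + net.b{ii};
    end
    a = feval(net.layers{ii}.transferFcn, z);   % logsig or purelin
end
max(abs(y_net - a))        % near zero only if no processing is attached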