MATLAB: Transfer Learning on Unet: Image Input Size not matching for layers

Tags: cnn, error, image input, neural network, transfer learning, unet

Hello,
I am trying to retrain a couple of layers of the U-Net architecture with new data. However, some of the layers have a different input size and are therefore giving me an error. How do I change the input to those layers without breaking the U-Net architecture?
load SimNet1.mat
Net2 = SimNet1; %renaming U-Net
analyzeNetwork(Net2)
plot(Net2)
layers = Net2.Layers;
lgraph = layerGraph(layers)
lgraph = connectLayers(lgraph,'Encoder-Stage-1-ReLU-2','Decoder-Stage-4-DepthConcatenation/in2')
lgraph = connectLayers(lgraph,'Encoder-Stage-2-ReLU-2','Decoder-Stage-3-DepthConcatenation/in2')
lgraph = connectLayers(lgraph,'Encoder-Stage-3-ReLU-2','Decoder-Stage-2-DepthConcatenation/in2')
lgraph = connectLayers(lgraph, 'Encoder-Stage-4-ReLU-2','Decoder-Stage-1-DepthConcatenation/in2')
figure;
plot(lgraph)
%Real US Directories - Transfer Learning Images (260)
segDir = fullfile('<file location>'); % Segmentation (pixel label) images
USDir = fullfile('<file location>'); % Input ultrasound (US) images
imds = imageDatastore(USDir); %DataStore of input training images - ultrasound images
classNames = ["bone","background"]; %labels
labelIDs = [1 0];
pxds = pixelLabelDatastore(segDir,classNames,labelIDs);
larray = convolution2dLayer([1 1],2,'NumChannels',64,'Name','NewFinalConvLayer'); % 1x1 conv, 2 filters (one per class); numFilters is the second positional argument
lgraph = replaceLayer(lgraph,'Final-ConvolutionLayer',larray);
larray2 = pixelClassificationLayer('Name','NewPixelClassificationLayer','Classes',["bone" "background"]);
lgraph = replaceLayer(lgraph,'Segmentation-Layer',larray2);
larray3 = softmaxLayer('Name','NewSoftMaxLayer');
lgraph = replaceLayer(lgraph,'Softmax-Layer',larray3);
%Error is occurring for the next two layers
larray4 = convolution2dLayer([3 3],64,'NumChannels',128,'Name','NewDecoderStage41layer'); % 3x3 conv, 64 filters
lgraph = replaceLayer(lgraph,'Decoder-Stage-4-Conv-1',larray4);
larray5 = convolution2dLayer([3 3],64,'NumChannels',64,'Name','NewDecoderStage42Layer'); % 3x3 conv, 64 filters
lgraph = replaceLayer(lgraph,'Decoder-Stage-4-Conv-2',larray5);
figure;
plot(lgraph)
options = trainingOptions('adam','InitialLearnRate', 3e-4, ...
'MaxEpochs',100,'MiniBatchSize',15, ...
'Plots','training-progress','Shuffle','every-epoch');
ds = pixelLabelImageDatastore(imds,pxds) %returns a datastore based on the input image data (imds - US images)
%and pxds (required network output - segmentations)
TLNet7 = trainNetwork(ds,lgraph,options)
save TLNet7

Best Answer

Based on the information about imageSize, numClasses, and 'EncoderDepth' for Net2 in your question, add 'Padding','same' as a Name-Value pair argument to the NewFinalConvLayer, NewDecoderStage41layer, and NewDecoderStage42Layer as follows:
larray = convolution2dLayer([1 1],2,'NumChannels',64,'Name','NewFinalConvLayer','Padding','same'); % 1x1 conv, 2 filters (one per class)
lgraph = replaceLayer(lgraph,'Final-ConvolutionLayer',larray);
larray2 = pixelClassificationLayer('Name','NewPixelClassificationLayer','Classes',["bone" "background"]);
lgraph = replaceLayer(lgraph,'Segmentation-Layer',larray2);
larray3 = softmaxLayer('Name','NewSoftMaxLayer');
lgraph = replaceLayer(lgraph,'Softmax-Layer',larray3);
% The two layers that were previously causing the error
larray4 = convolution2dLayer([3 3],64,'NumChannels',128,'Name','NewDecoderStage41layer','Padding','same'); % 3x3 conv, 64 filters
lgraph = replaceLayer(lgraph,'Decoder-Stage-4-Conv-1',larray4);
larray5 = convolution2dLayer([3 3],64,'NumChannels',64,'Name','NewDecoderStage42Layer','Padding','same'); % 3x3 conv, 64 filters
lgraph = replaceLayer(lgraph,'Decoder-Stage-4-Conv-2',larray5);
analyzeNetwork(lgraph)
This maintains the same U-Net structure with the replaced layers and should no longer produce the size-mismatch error.
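For intuition on why 'Padding','same' fixes the mismatch: with the default 'Padding',0 a 3-by-3 convolution shrinks each spatial dimension by two pixels, so the decoder feature map no longer matches the encoder feature map routed into the depth concatenation. A minimal sketch of the size arithmetic (the 388-by-388 activation size below is only an assumed example, not taken from SimNet1):
% Example size arithmetic for one decoder 3x3 convolution
act  = [388 388];             % assumed spatial size entering the convolution
filt = [3 3];                 % filter size used in the decoder stages
validOut = act - filt + 1;    % default 'Padding',0  -> 386x386, mismatch at the depth concatenation
sameOut  = act;               % 'Padding','same'     -> 388x388, sizes still agree
fprintf('valid: %dx%d, same: %dx%d\n', validOut, sameOut)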