The parameters of any layer (in this case, your custom layer) should be defined when you create the layer array, and they are stored as properties of the layer.
For example, look at the "examplePreluLayer" example from the documentation, which you can open with:
>> edit examplePreluLayer.m
You might notice that there are two input arguments that define the layer:
1. numChannels
2. name (an optional second input)
Also notice that "examplePreluLayer" stores a property "Alpha", which the constructor initializes as:
>> layer.Alpha = rand([1 1 numChannels]);
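To see how those pieces fit together, here is a condensed sketch of what a custom layer like "examplePreluLayer" looks like. The exact description string and method bodies may differ slightly from the shipped example, but the structure (a classdef inheriting from nnet.layer.Layer, a learnable "Alpha" property, and a constructor taking numChannels and an optional name) follows the documented pattern:

```matlab
classdef examplePreluLayer < nnet.layer.Layer
    properties (Learnable)
        % Learnable scaling coefficient, one per channel
        Alpha
    end
    methods
        function layer = examplePreluLayer(numChannels, name)
            % Set the optional layer name if it was provided
            if nargin == 2
                layer.Name = name;
            end
            layer.Description = "PReLU with " + numChannels + " channels";
            % Initialize Alpha with one random value per channel
            layer.Alpha = rand([1 1 numChannels]);
        end
        function Z = predict(layer, X)
            % PReLU: pass positives through, scale negatives by Alpha
            Z = max(X, 0) + layer.Alpha .* min(0, X);
        end
    end
end
```

The key point is that numChannels is consumed once, at construction time, to size the "Alpha" property; it is not passed again at training time.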
Now, if we want to use this "examplePreluLayer" in our network, we would define:
>> layers = [ ...
imageInputLayer([28 28 1])
convolution2dLayer(5,20)
batchNormalizationLayer
examplePreluLayer(20)
fullyConnectedLayer(10)
softmaxLayer
classificationLayer];
Notice that the line of code above is the step where we would define our layers and pass in any extra parameters to them when necessary.
(e.g. above, I defined "examplePreluLayer" with numChannels = 20)
On the other hand, the "trainNetwork" function only takes the data ("X" & "Y"), the layer array ("layers") that was already defined, and the training options ("options"):
>> trainedNet = trainNetwork(X,Y,layers,options)