MATLAB: In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network

Tags: configure, Deep Learning Toolbox, neural network, train

In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network? For example, using the "simpleclassInputs" example dataset in the "Neural Net Pattern Recognition" app, I modified the net to add a second hidden layer (hiddenLayerSize = [7 7]). I am using the training function 'trainlm', and I would like to set "mu", "mu_dec", and "mu_inc" to different values for each hidden layer.
How can I do this?
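For reference, here is roughly the setup I am describing (a minimal sketch; the parameter values shown are just the defaults):

[x,t] = simpleclass_dataset;        % same data as the app example
net = patternnet([7 7]);            % two hidden layers of 7 neurons each
net.trainFcn = 'trainlm';           % Levenberg-Marquardt training
net.trainParam.mu     = 0.001;      % currently these apply to the whole net,
net.trainParam.mu_dec = 0.1;        % but I want to use different values
net.trainParam.mu_inc = 10;         % for each hidden layer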

Best Answer

From examples of regression using fitnet([5 5]) and classification using patternnet([5 5]), I've deduced the following:
1. The learning rate, lr, and momentum constant, mc, can be specified for each set of layer weights between layer i and layer j.
2. For fitnet with default TRAINLM, the training properties listed in net.trainParam are fixed for all layer weights. In particular, mu, mu_dec, mu_inc and mu_max.
3. For patternnet with default TRAINSCG, the training properties listed in net.trainParam are fixed for all layer weights. In particular, sigma and lambda.
For example
clear all, clc
[x,t] = simpleclass_dataset;
[ I N ] = size(x)
[ O N ] = size(t)
net = patternnet([ 5 5 ]);
rng(0)
net = configure(net,x,t)
% ---SNIP
layerWeights = net.layerWeights
% layerWeights =
%
%     []                  []                  []
%     [1x1 nnetWeight]    []                  []
%     []                  [1x1 nnetWeight]    []
layerWeights21 = layerWeights{2,1}
% layerWeights21 =
%
% Neural Network Weight
%
% delays: 0
% initFcn: (none)
% initSettings: .range
% learn: true
% learnFcn: 'learngdm'
% learnParam: .lr, .mc
% size: [5 5]
% weightFcn: 'dotprod'
% weightParam: (none)
% userdata: (your custom info)
%
layerWeights21learnParam = layerWeights21.learnParam
% layerWeights21learnParam =
%
% Function Parameters for 'learngdm'
%
% Learning Rate lr: 0.01
% Momentum Constant mc: 0.9
% So, the specifications have the form
%
% net.layerWeights{2,1}.learnParam.lr = lr21 ;
% net.layerWeights{3,2}.learnParam.lr = lr32 ;
%
% net.layerWeights{2,1}.learnParam.mc = mc21 ;
% net.layerWeights{3,2}.learnParam.mc = mc32 ;
trainParam = net.trainParam
% trainParam =
%
%
% Function Parameters for 'trainscg'
%
% Show Training Window Feedback showWindow: true
% Show Command Line Feedback showCommandLine: false
% Command Line Frequency show: 25
% Maximum Epochs epochs: 1000
% Maximum Training Time time: Inf
% Performance Goal goal: 0
% Minimum Gradient min_grad: 1e-006
% Maximum Validation Checks max_fail: 6
% Sigma sigma: 5e-005
% Lambda lambda: 5e-007
%
% Resulting in specifications of the form
net.trainParam.sigma = sigma0 ;
net.trainParam.lambda = lambda0 ;
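Putting this together for your two-hidden-layer TRAINLM case, a minimal sketch could look like the following. The numeric values are only illustrative, and the performFcn line is an assumption for newer toolbox versions where patternnet defaults to crossentropy, which TRAINLM typically cannot use. Note that mu, mu_dec and mu_inc can only be set once for the whole net, whereas lr and mc can be set per set of layer weights.

[x,t] = simpleclass_dataset;
net = patternnet([7 7]);              % two hidden layers of 7 neurons
net.trainFcn   = 'trainlm';           % Levenberg-Marquardt
net.performFcn = 'mse';               % TRAINLM needs a squared-error performance
net = configure(net,x,t);

% TRAINLM parameters are net-wide, not per layer
net.trainParam.mu     = 0.005;
net.trainParam.mu_dec = 0.1;
net.trainParam.mu_inc = 10;

% Learning rate and momentum constant, per set of layer weights
net.layerWeights{2,1}.learnParam.lr = 0.01;
net.layerWeights{3,2}.learnParam.lr = 0.02;
net.layerWeights{2,1}.learnParam.mc = 0.9;
net.layerWeights{3,2}.learnParam.mc = 0.8;

[net, tr] = train(net,x,t);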
Hope this helps.
Thank you for formally accepting my answer.
Greg.