MATLAB: Is the code correct for a multi-input Time Delay Neural Network

Tags: artificial neural network, Deep Learning Toolbox, neural network, time delay neural network

Hi; I am working on a neural network project, but I do not have any background in it. I need to design a TDNN (time delay neural network) with 2 layers: the first layer has 20 inputs and 1 output, and the second layer has one input and one output. Every FIR filter in each input channel includes 40 delay blocks and consequently 40 weights. The output of the first layer should be compared with the target signal; in fact I have 1 target for 20 inputs. I have used the code below to design the TDNN. I want to know whether the code is correct for the network explained above.

%Create a new network
global net;
net = network;
%20 inputs, one per channel
net.numInputs = 20;
%2 layers
net.numLayers = 2;
%All layers biased
net.biasConnect = ones(net.numLayers,1);
%Connect all 20 inputs to the first layer
net.inputConnect(1,1:20) = 1; %Inputs 1-20 to layer 1
%Interconnect the hidden layers
net.layerConnect(2,1) = 1; %Connect Layer 1 to Layer 2
%Assign Output Nodes
net.outputConnect = [0 1]; %Set Layer 2 as the output.
%Assign Target Nodes for Training
net.targetConnect = [0 1]; %Set Layer 2 as the target output.
%Set the layer properties for layer 1
net.layers{1}.size = 10; %HiddenLayerSize; I did not understand what this is, so I chose 10 myself
net.layers{1}.transferFcn = 'tansig';
%Set the layer properties for layer 2
net.layers{2}.size = 1; %I did not understand this
net.layers{2}.transferFcn = 'purelin';
net.layers{2}.initFcn = 'initwb';
%Tapped delay lines: 40 taps (delays 0 through 39) per input channel, to match the 40 weights described above
d1 = 0:39; %my own choice
for i = 1:20
    net.inputWeights{1,i}.delays = d1; %Delays from input i to layer 1
end
%Network Performance Function
net.performFcn = 'mse'; %mse = mean squared error
net.adaptFcn = 'trains';
%Learning function for the input weights of all 20 channels
for i = 1:20
    net.inputWeights{1,i}.learnFcn = 'learngdm';
end
%Learning function for the layer weights (only layer 1 -> layer 2 exists) and the biases
net.layerWeights{2,1}.learnFcn = 'learngdm';
for i = 1:2
    net.biases{i}.learnFcn = 'learngdm';
end
% Training
net.trainFcn = 'trainlm';
% Initialization
net.initFcn = 'initlay';
for i = 1:2
    net.layers{i}.initFcn = 'initnw';
end
net = init(net);
net.trainParam.epochs = 1500;
view(net)
I also have a question about the train syntax for this network:
tdnn_net = train(net, X, Y);
As I mentioned before, the network has 20 inputs, 1 output, and 1 target. Are 20×1000 for X and 1×1000 for Y the suitable sizes (where 1000 is the length of the data)?
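For instance, I mean data shaped like this (dummy random values, only to show the sizes I have in mind; I am not sure this plain matrix form is what train expects):
X = rand(20, 1000); %20 input signals, 1000 samples each
Y = rand(1, 1000);  %1 target signal, 1000 samples
tdnn_net = train(net, X, Y);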

Best Answer

1. I find it hard to believe that you need 20 inputs to predict one output.
2. You can find much simpler basic code (see the timedelaynet sketch after this list) using
help timedelaynet
doc timedelaynet
3. Find the statistically significant nonnegative lags of the 20 crosscorrelation functions of the target with each input (a sketch follows this list). Search the NEWSGROUP and ANSWERS using
greg timedelaynet nncorr
4. Based on the strength and number of each set of crosscorrelations, choose
a. A subset of inputs
b. A common subset of input delays
5. Use net.divideFcn = 'divideblock'
6. Loop over candidate values for the number of hidden nodes, h = Hmin:dH:Hmax
7. For each value of h, use an inner loop over i = 1:Ntrials random weight initializations to design numH*Ntrials (typically ~ 10*10 = 100) candidate designs (see the loop sketch after this list). The search in (3) will help.
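Regarding item 2, here is a minimal sketch of what a timedelaynet version could look like. It assumes the 20 signals are stored as a 20-by-1000 matrix rawX and the target as a 1-by-1000 vector rawT (those variable names are mine), and that the 20 channels are fed in together as one 20-element input; the delay range 0:39 and the 10 hidden nodes only mirror the values in the question, they are not recommendations:
x = con2seq(rawX);                        %1-by-1000 cell array of 20-by-1 input vectors
t = con2seq(rawT);                        %1-by-1000 cell array of scalar targets
net = timedelaynet(0:39, 10, 'trainlm');  %input delays, hidden layer size, training function
net.divideFcn = 'divideblock';            %item 5: contiguous train/val/test blocks
[Xs, Xi, Ai, Ts] = preparets(net, x, t);  %shift the data to fill the input delay states
[net, tr] = train(net, Xs, Ts, Xi, Ai);
y = net(Xs, Xi, Ai);
perf = mse(net, Ts, y);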
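Regarding item 3, a rough sketch of the lag search with nncorr, following the pattern used in the NEWSGROUP posts mentioned above. rawX (20-by-N inputs), rawT (1-by-N target), and the 95% significance threshold 1.96/sqrt(N) are my own assumptions; check help nncorr for the exact calling syntax in your toolbox version:
[I, N] = size(rawX);                              %I = 20 inputs, N samples
zt = (rawT - mean(rawT)) / std(rawT, 1);          %standardize the target
sigthresh = 1.96 / sqrt(N);                       %rough 95% significance level (assumption)
siglags = cell(I, 1);
for i = 1:I
    zx = (rawX(i,:) - mean(rawX(i,:))) / std(rawX(i,:), 1);
    xcorrxt = nncorr(zx, zt, N-1, 'biased');      %correlations at lags -(N-1):(N-1)
    poslags = xcorrxt(N:end);                     %keep the nonnegative lags 0:(N-1)
    siglags{i} = find(abs(poslags) >= sigthresh) - 1; %significant nonnegative lags for input i
end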
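Regarding items 5-7, a sketch of the double loop over hidden-layer sizes and random initializations, keeping the design with the lowest validation error. It assumes x and t are the sequence cell arrays built with con2seq as in the timedelaynet sketch; Hmin, dH, Hmax, Ntrials, the delay range 0:39, and the use of tr.best_vperf as the selection criterion are illustrative assumptions, not an exact prescription:
Hmin = 1; dH = 1; Hmax = 10;                     %candidate hidden-node counts (assumption)
Ntrials = 10;                                    %random initializations per candidate
bestvperf = Inf;
for h = Hmin:dH:Hmax
    for i = 1:Ntrials
        net = timedelaynet(0:39, h, 'trainlm');
        net.divideFcn = 'divideblock';           %item 5
        [Xs, Xi, Ai, Ts] = preparets(net, x, t);
        net = configure(net, Xs, Ts);            %size the weights for this data
        net = init(net);                         %fresh random weights each trial
        [net, tr] = train(net, Xs, Ts, Xi, Ai);
        if tr.best_vperf < bestvperf             %keep the lowest validation MSE so far
            bestvperf = tr.best_vperf;
            bestnet = net;
        end
    end
end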
Hope this helps.
Thank you for formally accepting my answer
Greg