I have a project to evaluate a neural network tool for Thalassemia disease using MATLAB software. Please help me understand how I can do it.
MATLAB: How to evaluate a neural network tool (such as BP and MLP) in MATLAB.
Related Solutions
If the net
1. MUST be a 2-hidden-layer feedforward MLP (FFMLP)
2. MUST use the 1st hidden layer to recognizably separate the 4 classes
3. MUST be trained via BackProp
Try versions of the following:
1. For each class, design a 136-1-1 two-class classifier using patternnet(1). Since the initial weights are random, design many (10 each?) and choose the best.
2. Use the weights of the 4 selected classifiers for the first layer weights of a 136-4-4 four-class classifier using patternnet([4 H]).
3. Choose a good value for H by trial and error.
4. If you cannot freeze the 1st layer weights with learning rates of zero, store the hidden layer outputs of the two-class classifiers to train the last two layers of the four-class classifier.
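A minimal MATLAB sketch of the steps above, assuming a recent toolbox with patternnet; the names x (136-by-N inputs) and trueclass (1-by-N class indices in 1:4) are hypothetical placeholders for your data:

```matlab
% Sketch (assumed data): x = 136 x N inputs, trueclass = 1 x N indices in 1:4
Ntrials = 10;                          % random-weight designs per class
W1 = zeros(4, 136); b1 = zeros(4, 1);  % best 1st-layer weights and biases
for c = 1:4
    tb = double(trueclass == c);       % two-class target for class c
    t2 = [tb; 1 - tb];
    bestperf = Inf;
    for k = 1:Ntrials                  % step 1: many 136-1-1 designs
        net1 = patternnet(1);
        net1 = train(net1, x, t2);
        perf = perform(net1, t2, net1(x));
        if perf < bestperf             % keep the best of the Ntrials
            bestperf = perf;
            W1(c,:) = net1.IW{1,1};
            b1(c)   = net1.b{1};
        end
    end
end
H = 10;                                % step 3: choose H by trial and error
t4 = full(ind2vec(trueclass));         % 4 x N one-hot targets
net4 = patternnet([4 H]);              % step 2: four-class classifier
net4 = configure(net4, x, t4);
net4.IW{1,1} = W1;                     % transplant the 4 selected hidden units
net4.b{1}    = b1;
net4.inputWeights{1,1}.learn = false;  % try to freeze them (the step 4
net4.biases{1}.learn = false;          % fallback applies if this fails)
net4 = train(net4, x, t4);
```

This is a sketch, not a definitive recipe: the choices of Ntrials, H, and the performance measure are all up for experiment.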
Hope this helps.
Greg
From a strictly coding perspective:
p = [ 119 88 82 204 10 ; 79 59 73 150 10 ; 53 55 68 103 20 ];
t = [ 1 1 1 1 0 ];
net = newff( [ 0 255 ; 0 1 ], [ 5, 1 ], { 'logsig' 'tansig' } );
% ERROR! The range matrix (first input) needs 3 rows, one per input variable, not 2
% Warning: NEWFF used in an obsolete way.
% See help for NEWFF to update calls to the new argument list.
%
% GEH: To be sure of the defaults for this VERY obsolete function, run it again with the range-matrix error corrected and the ending semicolon removed. Then compare properties with the more recent, but also obsolete, version:
net = newff( p, t, 5,{'logsig' 'tansig'});
>> help newff
newff Create a feed-forward backpropagation network. Obsoleted in R2010b NNET 7.0. Last used in R2010a NNET 6.0.4. The recommended function is feedforwardnet.
% I don't agree with the recommendation. Use patternnet for classification/pattern-recognition and fitnet for regression/curvefitting. They both call feedforwardnet. The fitnet/patternnet/feedforwardnet trio have replaced the obsolete newfit/newpr/newff trio.
% The more recent version of newff automatically removes constant rows from the input and output matrices, normalizes inputs and outputs to [-1,1], divides the data into train/validation/test subsets with division ratios 0.7/0.15/0.15, and uses tansig (i.e., tanh) transfer functions for both hidden and output layers.
However, for the ancient version you are using, there is no automatic row removal, normalization, or data division. Furthermore, the default hidden and output transfer functions are tanh and linear (purelin).
If you cannot get the more recent version of newff (preferably newpr for classification) or, better yet, the current patternnet, then do the following:
1. Use mapminmax to normalize inputs to [-1,1].
2. Use odd-polarity tansig hidden transfer functions.
3. Use logsig output transfer functions for [0,1] probabilistic outputs.
4. To avoid overfitting, try to choose the number of hidden nodes, H, so that the total number of unknown weights, Nw, is much less than the number of training equations, Ntrneq.
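Steps 1-3 might look like this for the old range-matrix newff syntax (a sketch, assuming p is an I-by-N input matrix and t an O-by-N {0,1} target matrix):

```matlab
% Sketch: manual normalization + transfer functions for the ancient newff
[pn, ps] = mapminmax(p);                 % step 1: each input row -> [-1,1]
H = 5;                                   % hidden nodes; keep H <= Hub (step 4)
R = repmat([-1 1], size(pn,1), 1);       % ranges of the normalized inputs
net = newff(R, [H size(t,1)], {'tansig' 'logsig'});  % steps 2 and 3
net = train(net, pn, t);
y = sim(net, pn);                        % outputs in [0,1] via logsig
% new data must reuse the SAME mapping: xnew_n = mapminmax('apply', xnew, ps)
```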
For example, suppose you did not divide the data into trn/val/tst subsets:
[ I N ] = size(p) % [ 3 5 ]
[ O N ] = size(t) % [ 1 5 ]
Ntrn = N % 5
Ntrneq = N*O % 5
and the number of unknown weights is
Nw = (I+1)*H+(H+1)*O
then there is no overfitting (Nw < Ntrneq) if H <= Hub (upper bound) where
Hub = -1 + ceil( (Ntrneq-O) / (I+O+1) ) % 0
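Checking the arithmetic with the example's sizes (I = 3, O = 1, N = 5):

```matlab
I = 3; O = 1; N = 5;                        % sizes from the example above
Ntrneq = N*O                                % = 5 training equations
Hub = -1 + ceil( (Ntrneq - O)/(I + O + 1) ) % = 0, so only H = 0 avoids overfitting
H = 0;
Nw = (I+1)*H + (H+1)*O                      % = 1 < Ntrneq = 5, OK
```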
Choosing H = 0 leads to a multinomial logistic regression model (see Wikipedia: logistic regression).
net = newff( [ -1 1; -1 1; -1 1 ], [ [], 1 ], { 'logsig' });
or the later version
net = newff( p, t,[],{'logsig'} );
Hope this helps.
Thank you for formally accepting my answer.
Greg
P.S. For a real world problem you will need a hidden layer and much more data to prevent overfitting.
P.P.S. Try one or more of the MATLAB classification data sets in
help nndatasets
For example, the crab gender classification dataset has 6 inputs and 2 outputs. If you show me your solutions, I'll show you mine!
help crab_dataset