MATLAB Answers

Neural network for bag-of-visual-words giving pretty bad training/testing results.

Preetham Manjunatha on 7 Nov 2017
Dear All:
I created a feature matrix using the bagOfFeatures and encode functions (bag of visual words) from the Computer Vision Toolbox; it is 500 x 14404 (Xtrain = 10793, Xval = 1204, and XTest = 2407 samples). There are 14 classes, so the target matrix is 14 x 14404. When I use MATLAB's default multiclass SVM via the code snippet below:
categoryClassifier = trainImageCategoryClassifier(imdsTrainRandomized, bag_Train_BoFOri);
[confMatTr,knownLabelIdxTr,predictedLabelIdxTr,scoreTr] = evaluate(categoryClassifier, imdsTrainRandomized);
[confMatVl,knownLabelIdxVl,predictedLabelIdxVl,scoreVl] = evaluate(categoryClassifier, imdsValRandomized);
[confMatTs,knownLabelIdxTs,predictedLabelIdxTs,scoreTs] = evaluate(categoryClassifier, imdsTestRandomized);
I get a testing accuracy of around 70%, with almost 70% precision and recall. That's pretty decent for the given dataset.
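A minimal sketch of how that ~70% figure can be read off the confusion matrix returned by evaluate (the matrix is row-normalized, so its diagonal holds each class's recall):

```matlab
% Average per-class test accuracy from the normalized confusion matrix;
% evaluate() normalizes each row over the known-label samples, so the
% diagonal entries are per-class recall values.
avgAccuracyTest = mean(diag(confMatTs));
fprintf('Mean per-class test accuracy: %.1f%%\n', 100*avgAccuracyTest);
```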
For the neural network, I am using patternnet with [800 800 900] and other combinations with 2 and 3 hidden layers. Unfortunately, the training accuracy is about 18-20%, whereas the test accuracy is only around 8-12%. I tried various hidden-unit counts for 2 and 3 hidden layers (combinations up to 5:20:800-1000 hidden units), but the testing accuracy is < 10%. Compared to the default SVM, these results are very low.
I really appreciate any help.
Thanks,
Below is my code snippet for training/validating/testing NN:
% Feature matrix (transposed) and labels/targets (transposed)
load 500VocSize_StronFeat_0p8.mat
% Create/concatenate the feature and target matrices
Xtrain = featureMatrixTrain_BoFOri;
Xval = featureMatrixVal_BoFOri;
XTest = featureMatrixTest_BoFOri;
Ttrain = targetTrain;
Tval = targetVal;
TTest = targetTest;
x = [Xtrain; Xval; XTest]';
t = [Ttrain; Tval; TTest]';
x = double(x);
% Hidden layer sizes
pairs = [800 800 900];
net = patternnet(pairs);
% Network options
%*********************************************************************
% Choose input and output pre/post-processing functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};
% Setup division of data for training, validation, testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideind'; % Divide data by index
net.divideMode = 'sample';   % Divide up every sample
%
net.divideParam.trainInd = 1 : size(Xtrain,1);
net.divideParam.valInd = size(Xtrain,1) + 1 : ...
size(Xtrain,1) + size(Xval,1);
net.divideParam.testInd = size(Xtrain,1) + size(Xval,1) + 1 :...
size(Xtrain,1) + size(Xval,1) + ...
size(XTest,1);
% Training function and parameters
net.trainFcn = 'trainscg'; % alternatives: 'trainlm' 'traingdx' 'traingda' 'traingdm' 'traingd'
net.trainParam.epochs = 50000;
net.trainParam.goal = 1e-6 ;
net.trainParam.showCommandLine = true;
net.trainParam.show = 25;
net.trainParam.max_fail = 50000;
net.trainParam.min_grad=1e-7;
net.trainParam.lr = 0.01;
% Change the transfer function for all hidden layers (the output layer defaults to 'softmax')
for i = 1:size(pairs,2)
net.layers{i}.transferFcn = 'tansig';
end
% Turn on/off the nntraintool window
net.trainParam.showWindow = true;
% Train the network
[net,tr] = train(net,x,t);
% Test the Network
y = net(x);
ActualTrainind  = vec2ind(t(:,tr.trainInd));
PredictTrainind = vec2ind(y(:,tr.trainInd));
percentErrorsTrain = sum(ActualTrainind ~= PredictTrainind)/numel(ActualTrainind); % fraction misclassified
ActualTestind  = vec2ind(t(:,tr.testInd));
PredictTestind = vec2ind(y(:,tr.testInd));
percentErrorsTest = sum(ActualTestind ~= PredictTestind)/numel(ActualTestind); % fraction misclassified
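The validation-set error can be computed the same way (a sketch mirroring the train/test pattern above, using the validation indices stored in the training record tr):

```matlab
% Validation-set error, following the same pattern as train/test above
ActualValind  = vec2ind(t(:,tr.valInd));
PredictValind = vec2ind(y(:,tr.valInd));
percentErrorsVal = sum(ActualValind ~= PredictValind)/numel(ActualValind); % fraction misclassified
```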

  4 Comments

Preetham Manjunatha on 7 Nov 2017
Yes, I have tried the iris dataset with net = patternnet([15 15 20]); and get precision/recall around 95%. Also, one major change: instead of 'divideind', I used 'dividerand'. The rest remains the same.
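For reference, the 'dividerand' variant mentioned here swaps the index-based split for a random one; a minimal sketch (the 70/15/15 ratios are illustrative assumptions, not the exact settings used):

```matlab
net.divideFcn = 'dividerand';       % random split instead of 'divideind'
net.divideParam.trainRatio = 0.70;  % assumed ratio, for illustration only
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
```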


Answers (0)
