classifier

FIR
FIR on 7 Apr 2012
Answered: Agustin on 17 Nov 2014
I have a dataset of size 100x6. I want to classify it and compute the accuracy using random forest and MLP. I have already classified it using SVM and kNN, but I don't know how to do it with MLP and random forest. Please help.

Accepted Answer

Greg Heath
Greg Heath on 11 Apr 2012
For Neural Net classification, see the documentation for patternnet and the classification demo example.
Hope this helps.
Greg
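For illustration, a minimal patternnet sketch along the lines of the documentation example (using fisheriris here, since the original 100x6 dataset is not posted; patternnet expects one sample per column and one-hot targets):
load fisheriris
X = meas';                              % 4x150: one sample per column
[~,~,labels] = unique(species);         % map class names to integer indices 1..3
T = full(ind2vec(labels'));             % 3x150 one-hot target matrix
net = patternnet(10);                   % MLP with one hidden layer of 10 neurons
net = train(net,X,T);                   % default 70/15/15 train/val/test division
Y = net(X);                             % network outputs (class scores)
pred = vec2ind(Y);                      % predicted class index per sample
accuracy = mean(pred' == labels)        % resubstitution accuracy, for illustration only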

More Answers (4)

Agustin
Agustin on 17 Nov 2014
I have written the following code to do cross-validation using TreeBagger (I use the fisheriris dataset):
load fisheriris
X = meas;
y = species;
% data partition
cp = cvpartition(y,'KFold',10); % 10 folds
% prediction function
classF = @(XTRAIN,ytrain,XTEST)(predict(TreeBagger(50,XTRAIN,ytrain),XTEST));
% misclassification error
misclassError = crossval('mcr',X,y,'predfun',classF,'partition',cp);
I hope it is useful.

Ilya
Ilya on 7 Apr 2012
If you have Statistics Toolbox and MATLAB R2009a or later, you can use TreeBagger. Please read the documentation and take a look at the examples. Follow up with a specific question if something remains unclear.
For MLP, take a look at the Neural Network Toolbox.
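For reference, a minimal TreeBagger train-and-predict sketch (using fisheriris for illustration; the number of trees is an arbitrary choice here):
load fisheriris
B = TreeBagger(50,meas,species);              % random forest with 50 classification trees
labels = predict(B,meas);                     % cell array of predicted class names
resubAccuracy = mean(strcmp(labels,species))  % resubstitution accuracy, optimistic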
  2 Comments
FIR
FIR on 10 Apr 2012
How do I add cross-validation to this TreeBagger call? Please help.
B = TreeBagger(ntrees,X,Y)
Greg Heath
Greg Heath on 11 Apr 2012
See the pattern recognition and classification demos in the Neural Network Toolbox.
Hope this helps.
Greg



Ilya
Ilya on 10 Apr 2012
You can use the out-of-bag error as an unbiased estimate of the generalization error. Train TreeBagger with 'oobpred' set to 'on' and call the oobError method.
If you insist on using cross-validation, do 'doc crossval' and follow examples there.
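For example, a minimal sketch of the out-of-bag approach (using the older 'oobpred' syntax that appears elsewhere in this thread):
load fisheriris
B = TreeBagger(50,meas,species,'oobpred','on'); % keep out-of-bag predictions
err = oobError(B);                              % OOB error as a function of the number of trees
err(end)                                        % OOB estimate of the generalization error with all 50 trees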
  2 Comments
FIR
FIR on 11 Apr 2012
I have this code:
load fisheriris
groups = species;
cvFolds = crossvalind('kfold', groups, 10); %# get indices of 10-fold CV
cp = classperf(groups);
for k=1:10
b = TreeBagger(10,meas,species,'oobpred','on');
cp = classperf(groups,b)
end
but I get this error:
Error using TreeBagger/subsref (line 884)
Subscripting into TreeBagger using () is not allowed.
Error in classperf (line 219)
gps = varargin{1}(:);
Error in yass (line 10)
cp = classperf(groups,b)
Please help.
Ilya
Ilya on 11 Apr 2012
First, this thread has become convoluted. If you want to post another question, I suggest that you post it as a new question, not as an answer to your own old question.
Second, I suggested that you look at function crossval, not crossvalind. You can cross-validate using crossvalind too, but crossval offers an API that reduces the amount of code you need to write.
Third, whether you choose to use crossval or crossvalind, please take a look at the examples and follow them closely. In particular, the 2nd example for crossval here (http://www.mathworks.com/help/toolbox/stats/crossval.html) shows what you need to do. You would need to replace the function handle classf in that example with a function that has two lines of code in it: 1) train a TreeBagger on Xtrain and Ytrain, and 2) predict labels for Xtest using the trained TreeBagger.
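A sketch of that structure, with the two-line prediction function written as its own file (the names, the number of trees, and the fisheriris data are illustrative):
% classf.m
function yfit = classf(Xtrain,ytrain,Xtest)
b = TreeBagger(50,Xtrain,ytrain);   % 1) train a TreeBagger on the training fold
yfit = predict(b,Xtest);            % 2) predict labels for the test fold
end

% then, for example:
load fisheriris
cp = cvpartition(species,'KFold',10);
err = crossval('mcr',meas,species,'predfun',@classf,'partition',cp)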



Richard Willey
Richard Willey on 11 Apr 2012
I did a webinar a couple of years ago titled "Computational Statistics: An Introduction to Classification with MATLAB". You can watch the recorded webinar online. The demo code and data sets are available on MATLAB Central.
  3 Comments
Richard Willey
Richard Willey on 11 Apr 2012
I'm not sure what you mean by RBF. Radial basis functions?
Greg Heath
Greg Heath on 11 Apr 2012
Search the Newsgroup using
heath newrb design
Hope this helps.
Greg

