Neural Network Classification: cost function, regularization parameter, availability of the hidden layers...

How do I derive the cost function J from an NN classification model?
How can I set a regularization parameter? Is this done through the toolbox's weights capability?
Are the hidden-layer values available after modeling?

Accepted Answer

Greg Heath on 28 May 2015
You cannot derive a cost function from a model.
You specify the cost function for a design. If you are new at this, start by accepting the NN Toolbox defaults. First, check out the relatively simple examples in the classification documentation:
help patternnet
doc patternnet
Then search the NEWSGROUP and ANSWERS for posted examples. For example
greg patternnet
The default transfer functions are tansig (hidden layer) and softmax (output layer).
The "cost" (performance) function is crossentropy.
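As a sketch of those defaults (assuming the Neural Network Toolbox and its bundled iris_dataset demo data; the hidden-layer size of 10 is just an illustration), a minimal patternnet design might look like:

```matlab
% Minimal patternnet sketch using toolbox defaults (hedged example).
[x, t] = iris_dataset;       % demo inputs (4xN) and one-hot targets (3xN)
net = patternnet(10);        % 10 hidden units; tansig hidden, softmax output
net = train(net, x, t);      % default training; crossentropy performance
y = net(x);                  % class posterior estimates
perf = perform(net, t, y)    % crossentropy over all data
```

With no arguments, patternnet uses its default hidden-layer size, so accepting the defaults really can be a one-line change to an existing script.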
Answers to many questions on terminology and details can be obtained using the commands help, doc, and type, e.g.,
help patternnet
doc patternnet
type patternnet
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 Comment
Richard Palmer on 28 May 2015
Thanks for your response. I still have some questions. I believe I know the formulae for the cost functions of the models in the toolkit. For instance, the cost function for regularized linear regression is

J(theta) = ( sum_i (h(x_i) - y_i)^2 + lambda * sum_j theta_j^2 ) / (2m)

I would like to retrieve the theta coefficients, and I want to know how to set the lambda (regularization) parameter.
The unregularized cost function will give different values for the training/validation/test sets. Can the crossentropy function be decomposed into its constituent parts?
If this is in the documentation, I apologize. I haven't found the information at this point.
Again, thanks for replying.
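A hedged sketch addressing these follow-ups, not taken from the thread itself: in the toolbox, performance functions expose a regularization weight via net.performParam.regularization (documented for mse; recent crossentropy versions accept the same parameter); the fitted "theta" values live in net.IW, net.LW, and net.b; hidden-layer activations can be recomputed from those weights; and per-set performance can be obtained by evaluating perform on the index sets in the training record tr.

```matlab
% Hedged sketch: regularization, weight retrieval, hidden activations,
% and per-set performance (assumes NN Toolbox and iris_dataset).
[x, t] = iris_dataset;
net = patternnet(10);
net.performParam.regularization = 0.1;   % perf = (1-r)*err + r*msw
[net, tr] = train(net, x, t);

W1 = net.IW{1,1};  b1 = net.b{1};        % "theta" for the hidden layer
W2 = net.LW{2,1};  b2 = net.b{2};        % "theta" for the output layer

a1 = tansig(W1*x + b1);                  % hidden-layer values
y  = softmax(W2*a1 + b2);                % NOTE: ignores the net's default
                                         % input preprocessing (mapminmax),
                                         % so it may differ from net(x)

% Decompose performance over the training/validation/test subsets
yAll      = net(x);
perfTrain = perform(net, t(:,tr.trainInd), yAll(:,tr.trainInd));
perfVal   = perform(net, t(:,tr.valInd),   yAll(:,tr.valInd));
perfTest  = perform(net, t(:,tr.testInd),  yAll(:,tr.testInd));
```

Note that the toolbox's regularization parameter is a mixing ratio between the error term and the mean squared weights, not the lambda of the textbook formula above; the two are related but not numerically identical.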


