This challenge is to return the weight updates dWH and dWP, given X, WH, WP, and EPY, using ReLU on the hidden layer and Softmax on the output layer. The test cases accumulate dWP and dWH to train neural nets for a Counter, a Subtractor, and a Mux. Each test case has four output classes; ReLU performs well on multi-class outputs.
[dWP,dWH]=Neural_Back_Propagation_ReLU(X,WH,WP,EPY)
The MATLAB LaTeX code for generating the back-propagation chart is included in the template.
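A minimal sketch of one possible implementation. The variable conventions here are assumptions (X a bias-augmented input column vector, WH and WP the hidden- and output-layer weight matrices, EPY the softmax output error P - Y); the test suite's exact conventions may differ:

```matlab
function [dWP, dWH] = Neural_Back_Propagation_ReLU(X, WH, WP, EPY)
    % Forward pass through the hidden layer
    ZH = WH * X;            % hidden pre-activation (assumed shape)
    H  = max(ZH, 0);        % ReLU activation

    % With softmax + cross-entropy loss, the output-layer
    % pre-activation gradient is the error EPY itself.
    dWP = EPY * H';         % output-layer weight gradient

    % Backpropagate through the output weights and the ReLU
    dH  = WP' * EPY;        % gradient w.r.t. hidden activations
    dZH = dH .* (ZH > 0);   % ReLU derivative: 1 where ZH > 0
    dWH = dZH * X';         % hidden-layer weight gradient
end
```

A caller would accumulate these gradients over the training set and subtract a learning-rate multiple from WH and WP each epoch.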