gradient descent for custom function
I have four equations:
1) e = m - y
2) y = W_3 * h
3) h = z + W_2 * z + f
4) f = W_1 * x
I want to update W_1, W_2, and W_3 to minimize the cost function J = e^T e by gradient descent.
x is an input, y is the output, and m is the desired value for each sample in the dataset.
I would like to do
W_1 = W_1 - eta * grad_{W_1} J
W_2 = W_2 - eta * grad_{W_2} J
W_3 = W_3 - eta * grad_{W_3} J
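For concreteness, the updates I have in mind would look something like this sketch for a single sample (the sizes nx/nz/ny, the random data, eta, and the iteration count are placeholder assumptions; the gradients follow from the chain rule applied to J = e^T e):

nx = 4; nz = 3; ny = 2;            % assumed sizes: x is nx-by-1, z is nz-by-1, m is ny-by-1
x = randn(nx,1); z = randn(nz,1); m = randn(ny,1);
W1 = randn(nz,nx); W2 = randn(nz,nz); W3 = randn(ny,nz);
eta = 1e-3;                        % assumed learning rate
for k = 1:1000                     % assumed iteration count
    f = W1*x;                      % equation 4
    h = z + W2*z + f;              % equation 3
    y = W3*h;                      % equation 2
    e = m - y;                     % equation 1; J = e'*e
    gW3 = -2*e*h';                 % dJ/dW_3
    gh  = -2*W3'*e;                % dJ/dh, shared by the next two gradients
    gW2 = gh*z';                   % dJ/dW_2 (h depends on W_2 through W_2*z)
    gW1 = gh*x';                   % dJ/dW_1 (h depends on W_1 through f = W_1*x)
    W1 = W1 - eta*gW1;             % gradient-descent updates
    W2 = W2 - eta*gW2;
    W3 = W3 - eta*gW3;
end

For a whole dataset, the per-sample gradients would be summed or averaged before each update.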
Going through the documentation, I found that you can train standard neural networks. But note that I have some custom functions, so I guess I would need one of the built-in optimization functions instead.
Any ideas?
2 Comments
Matt J on 24 Apr 2024
"x is an input, y is the output, and m is the desired value for each sample in the dataset"
It looks like z is also an input; it is not given by any of the other equations.
Answers (2)
Matt J on 24 Apr 2024 (edited)
"so I guess I would need one of the built-in optimization functions instead."
No, not necessarily. Your equations can be implemented with fullyConnectedLayers and additionLayers.
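A minimal sketch of that construction, assuming the Deep Learning Toolbox; the layer names, the sizes nx/nz/ny, and the bias-freezing name-value pairs are assumptions on my part (freezing each bias at zero makes every fullyConnectedLayer a pure linear map W*input, matching the equations):

nx = 4; nz = 3; ny = 2;   % assumed sizes of x, of z/h/f, and of y
fc = @(n,name) fullyConnectedLayer(n, "Name", name, ...
         "BiasInitializer", "zeros", "BiasLearnRateFactor", 0);
lgraph = layerGraph([featureInputLayer(nx, "Name", "x"); fc(nz, "W1")]); % f = W1*x
lgraph = addLayers(lgraph, featureInputLayer(nz, "Name", "z"));
lgraph = addLayers(lgraph, fc(nz, "W2"));                                % W2*z
lgraph = addLayers(lgraph, additionLayer(3, "Name", "add"));             % h = z + W2*z + f
lgraph = addLayers(lgraph, fc(ny, "W3"));                                % y = W3*h
lgraph = connectLayers(lgraph, "z", "W2");
lgraph = connectLayers(lgraph, "z", "add/in1");
lgraph = connectLayers(lgraph, "W2", "add/in2");
lgraph = connectLayers(lgraph, "W1", "add/in3");
lgraph = connectLayers(lgraph, "add", "W3");
net = dlnetwork(lgraph);

Training this against m with a mean-squared-error loss and plain SGD (for instance trainingOptions("sgdm", "Momentum", 0), with trainnet on recent releases) then performs essentially the W = W - eta*grad updates asked for; the averaging in MSE versus J = e'*e only rescales the effective learning rate.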