Neural net backpropagation as per EasyNN-plus
Hi all. I'm not sure if anyone has heard of EasyNN-plus, but I'm trying to build something similar in MATLAB. According to the FAQ, EasyNN-plus uses a backpropagation neural net with a logistic transfer function, i.e. 1.0 / (1.0 + e^(-net input)). I'm trying to design something in MATLAB that does the same. It doesn't seem to be possible in the Deep Network Designer, as this doesn't allow data input of the type used in the Regression Learner (for example, 9 inputs with one output), i.e. neither image nor time-series input. I may have completely missed what is already possible, so if someone could point me in the right direction it would be most appreciated.
SPG
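One possible starting point, assuming the Deep Learning Toolbox is installed: the shallow-network functions (fitnet / train) accept exactly this kind of numeric data (9 inputs, 1 output) and can use the logistic ('logsig') transfer function with gradient-descent backpropagation. This is a minimal sketch; the hidden-layer size of 10 and the dummy data are illustrative choices, not anything specified by EasyNN-plus.

```matlab
% Minimal sketch of an EasyNN-plus-style net in MATLAB
% (requires Deep Learning Toolbox; sizes and data are placeholders)

X = rand(9, 100);          % 9 inputs  x 100 samples (dummy data)
T = sum(X, 1);             % 1 output  x 100 samples (dummy target)

net = fitnet(10);          % shallow regression net, 10 hidden neurons
net.layers{1}.transferFcn = 'logsig';   % logistic: 1 / (1 + exp(-n))
net.trainFcn = 'traingd';  % plain gradient-descent backpropagation

[net, tr] = train(net, X, T);   % train with backprop
Y = net(X);                     % predictions for the training inputs
```

By default fitnet trains with Levenberg-Marquardt ('trainlm'); 'traingd' is set here only to match the plain backpropagation that EasyNN-plus describes.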
Shujaat
on 27 Sep 2024
I am writing to inquire about the recent integration of new technologies into MATLAB. Specifically, I am interested in learning more about any advancements or updates related to machine learning, AI, or real-time data processing that have been incorporated into MATLAB’s toolboxes. Additionally, I would appreciate any information on upcoming features or collaborations with emerging technologies. Could you please provide some details or direct me to relevant resources?