How can one utilize a dropout layer in a neural network during prediction?

I was hoping to use dropout layers at prediction time with an LSTM network in order to get confidence intervals.
Apparently, dropout layers only randomly set connections to 0 during training time.
From the dropout reference:
"A dropout layer randomly sets input elements to zero with a given probability. At training time, the layer randomly sets input elements to zero given by the dropout mask rand(size(X))<Probability, where X is the layer input and then scales the remaining elements by 1/(1-Probability). This operation effectively changes the underlying network architecture between iterations and helps prevent the network from overfitting. A higher number results in more elements being dropped during training. At prediction time, the output of the layer is equal to its input."
This explains why repeated calls to predictions with the same input result in the same output.
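The training-time behavior quoted above can be sketched in a few lines of MATLAB (an illustrative example only; the variable names `X`, `Probability`, and `Y` are assumptions, not toolbox code):

```matlab
% Sketch of the training-time dropout operation described in the docs:
% zero elements where rand(size(X)) < Probability, scale the rest.
X = randn(4, 3);                      % example layer input
Probability = 0.5;                    % dropout probability
mask = rand(size(X)) < Probability;   % dropout mask from the quoted formula
Y = X;
Y(mask) = 0;                          % drop the masked elements
Y = Y / (1 - Probability);            % scale survivors by 1/(1-Probability)
```

At prediction time the built-in layer skips all of this and simply returns `X`, which is why repeated `predict` calls are deterministic.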
Has anyone come up with a workaround?
Thank you for your help,
-Dino
  1 Comment
Michael Phillips on 12 Mar 2021
Hi Dino - did you ever create a custom dropout layer that works during network testing? If so would you be willing to share it? Thanks!


Answers (1)

Sourav Bairagya
Sourav Bairagya on 10 Feb 2020
Usually dropout layers are used during training to avoid overfitting of the neural network. Currently, 'dropoutLayer' of Deep Learning Toolbox doesn't perform dropout during prediction. If you want to use dropout during prediction, you can write a custom dropout layer which performs dropout in both the 'forward' and 'predict' methods.
You can leverage this link to get an idea about writing custom layers:
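A minimal sketch of such a custom layer is below. This is an assumption-laden example, not toolbox code: the class name `mcDropoutLayer` is made up, and you should verify the custom-layer template against the current documentation before relying on it. Because a custom layer's `forward` method falls back to `predict` when not defined, putting the dropout in `predict` makes it active both during training and at prediction time:

```matlab
% Hypothetical custom dropout layer that also drops at prediction time,
% enabling Monte Carlo dropout for uncertainty estimates.
classdef mcDropoutLayer < nnet.layer.Layer
    properties
        Probability   % dropout probability
    end
    methods
        function layer = mcDropoutLayer(probability, name)
            layer.Name = name;
            layer.Probability = probability;
            layer.Description = "Dropout active at prediction time";
        end
        function Z = predict(layer, X)
            % Unlike the built-in dropoutLayer, apply the random mask here
            % too, so repeated predict calls give different outputs.
            mask = rand(size(X), 'like', X) >= layer.Probability;
            Z = (X .* mask) / (1 - layer.Probability);
        end
    end
end
```

With a network containing this layer, calling `predict` many times on the same input and taking quantiles of the collected outputs gives the kind of rough confidence interval the question asks about.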
