LSTM Sequence to One Regression

juan pedrosa
juan pedrosa on 26 Oct 2019
Commented: juan pedrosa on 7 Jun 2021
I'm trying to train an LSTM network for sequence-to-one regression, but I'm having problems with my dataset, even though I'm using the definition given by MathWorks here
My training set is an N-by-1 cell array where N = 2,396,493 and each sequence is an 8-by-22 double.
My response set is an N-by-R matrix where N = 2,396,493 and R = 8.
I'm using a mini-batch size of 300, and when I try to train the network this is the error output:
Error using trainNetwork (line 165)
Unable to perform assignment because the size of the left side is 8-by-300 and the size of the right side is 1-by-300.
I've tried different setups for the response set, transposing it or making it an N-by-1 cell array, with no results. I did train a sequence-to-sequence network, but I think I'll get better results with a sequence-to-one network. Any advice, please?
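For reference, here is a minimal sketch that reproduces the shapes described above (random values and a much smaller N, purely for illustration, not the real data):
N = 500; % Stand-in for the real N = 2,396,493.
XTrain = cell(N,1); % N-by-1 cell array of sequences.
for i = 1:N
    XTrain{i} = rand(8,22); % Each sequence: 8 features by 22 time steps.
end
YTrain = rand(N,8); % N-by-R response matrix, R = 8.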
[EDIT]
It seems that the mini-batch size is the problem (bug?): if the mini-batch size is set to 1, then training begins without issues.
Thank you for your time.
  1 Comment
juan pedrosa
juan pedrosa on 17 Sep 2020
It has been almost a year and the error still prevails. I've lost hope in MATLAB; I'm moving to TensorFlow.


Answers (3)

shubhan vaishnav
shubhan vaishnav on 11 Feb 2021
Please share the code.

Josephine Morgenroth
Josephine Morgenroth on 19 Apr 2021
I'm having the same issue: trying to use the sequence-to-one framework with 'OutputMode' = 'last', with no success. I have a time-series dataset with 10 features to predict 3 targets, with a total of 30 sequence/target rows. The code runs fine, but the LSTM predicts the same value for all the sequences! Has anyone seen an example where this structure was successfully used in MATLAB?
  1 Comment
Michael Hesse
Michael Hesse on 11 May 2021
Hi Josephine, I'm working on the same problem. In the last couple of days I figured out that the padding option has a huge impact on training and prediction performance. In particular, for my case, the setting 'SequencePaddingDirection', 'left' brought the breakthrough.
Hope it helps, Michael
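For concreteness, a minimal sketch of where this option goes (everything here besides 'SequencePaddingDirection' is a placeholder, not a recommendation):
options = trainingOptions('adam', ...
    'MiniBatchSize', 300, ...
    'SequencePaddingDirection', 'left'); % Pad at the start of each sequence.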



Niccolò Dal Santo
Niccolò Dal Santo on 7 Jun 2021
Hi Juan,
It would be very helpful if you could share the architecture you want to train, since this error might be caused by a mismatch between the output size of the network and the ground-truth responses. To make sure that the network outputs the expected size (8 in your case), you can use the analyzeNetwork function to check the output of the network:
R = 8;
numFirstLstmHiddenUnits = 100;
numSecondLstmHiddenUnits = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numFirstLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numSecondLstmHiddenUnits, 'OutputMode', 'last')
    regressionLayer];
% Shows that the activations of the second lstmLayer, which is the
% output size of the network, equal 8.
analyzeNetwork(layers)
In general, you should set the output size of the layer immediately preceding the regressionLayer to the expected size of the responses.
For example, if the layer preceding the regressionLayer is an lstmLayer, you should set its number of hidden units to the expected output size. The following piece of code trains a recurrent regression network with two LSTM layers. The first one has an arbitrary number of hidden units (set to 100); the second LSTM layer immediately precedes the regressionLayer, hence its number of hidden units is set to R = 8, which is the size of each output observation:
N = 1000; % Number of observations.
X = cell(1,N);
R = 8; % Input and output size.
seqLength = 22; % Sequence length.
% Create training dataset inputs.
for i = 1:N
    X{i} = rand(R,seqLength);
end
% Create training dataset responses.
Y = rand(N,R);
% The first LSTM layer can have an arbitrary number of hidden units.
numFirstLstmHiddenUnits = 100;
% Set the hidden size of the second LSTM layer to the number of
% responses per observation, since this layer is directly followed
% by the regressionLayer and that is our target output size.
numSecondLstmHiddenUnits = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numFirstLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numSecondLstmHiddenUnits, 'OutputMode', 'last')
    regressionLayer];
options = trainingOptions('sgdm', ...
    'MaxEpochs', 2, ...
    'MiniBatchSize', 300, ...
    'ExecutionEnvironment', 'cpu');
% Train the network.
net = trainNetwork(X,Y,layers,options);
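As a quick sanity check (assuming the variables from the snippet above), predict should return one row of R values per observation:
% For sequence-to-one regression, predictions come back as an
% N-by-R matrix, one row per observation.
YPred = predict(net, X, 'MiniBatchSize', 300);
size(YPred) % Expected: [1000 8]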
As an alternative, one can use a fullyConnectedLayer as the layer immediately preceding the regressionLayer, with as many neurons as the expected output size of the network. This is shown, for instance, in this example for time series forecasting. In this case the previous example would be as follows:
N = 1000; % Number of observations.
X = cell(1,N);
R = 8;
seqLength = 22;
% Create training dataset inputs.
for i = 1:N
    X{i} = rand(R,seqLength);
end
% Create training dataset responses.
Y = rand(N,R);
% Arbitrarily define the number of hidden units for the LSTM layers.
numLstmHiddenUnits = 100;
% Set the number of neurons of the fullyConnectedLayer preceding
% the regressionLayer to the expected output size.
numFullyConnectedNeurons = R;
layers = [sequenceInputLayer(R)
    lstmLayer(numLstmHiddenUnits, 'OutputMode', 'sequence')
    lstmLayer(numLstmHiddenUnits, 'OutputMode', 'last')
    fullyConnectedLayer(numFullyConnectedNeurons)
    regressionLayer];
options = trainingOptions('sgdm', ...
    'MaxEpochs', 2, ...
    'MiniBatchSize', 300, ...
    'ExecutionEnvironment', 'cpu');
% Train the network.
net = trainNetwork(X,Y,layers,options);
Thanks,
Niccolò

Release: R2019a
