Training Data type error for a CNN using trainnet function

I'm trying to use a convolution1dLayer with my sequence input data, but when I try to train the network I get this error:
"Error using trainnet
Invalid targets. Network expects numeric or categorical targets, but received a cell array."
I've looked at many examples of how the data must be structured, but even though mine is in the same format, it doesn't work.
For the predictors I'm running a test with only 4 observations, each with 4 features and 36191 time points. For the targets there are also 4 observations, each with a single target and 36191 time points.
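Concretely, here is how the data is laid out (a sketch of what I described above, with dummy values):

```matlab
% 4 observations; each cell holds one sequence: 4 features x 36191 time points
trainDataX = cell(4,1);   % predictors
trainDataY = cell(4,1);   % targets (one channel each)
for i = 1:4
    trainDataX{i} = rand(4,36191);
    trainDataY{i} = rand(1,36191);
end
```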
I can't understand why it isn't accepted; as I said, it is in the same format as many other examples. Here is the code for the CNN-LSTM network and the trainnet call:
lgraph = layerGraph();
tempLayers = [
sequenceInputLayer(4,"Name","input")
convolution1dLayer(4,32,"Name","conv1d","Padding","same")
globalAveragePooling1dLayer("Name","gapool1d")];
lgraph = addLayers(lgraph,tempLayers);
tempLayers = lstmLayer(25,"Name","lstm");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = lstmLayer(25,"Name","lstm_1");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
concatenationLayer(1,2,"Name","concat")
lstmLayer(55,"Name","lstm_2")
dropoutLayer(0.5,"Name","drop")
fullyConnectedLayer(1,"Name","fc")
sigmoidLayer("Name","sigmoid")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
lgraph = connectLayers(lgraph,"gapool1d","lstm");
lgraph = connectLayers(lgraph,"gapool1d","lstm_1");
lgraph = connectLayers(lgraph,"lstm","concat/in1");
lgraph = connectLayers(lgraph,"lstm_1","concat/in2");
plot(lgraph);
epochs = 800;
miniBatchSize = 128;
LRDropPeriod = 200;
InitialLR = 0.01;
LRDropFactor = 0.1;
valFrequency = 30;
options = trainingOptions("adam", ...
MaxEpochs=epochs, ...
SequencePaddingDirection="left", ...
Shuffle="every-epoch", ...
GradientThreshold=1, ...
InitialLearnRate=InitialLR, ...
LearnRateSchedule="piecewise", ...
LearnRateDropPeriod=LRDropPeriod, ...
LearnRateDropFactor=LRDropFactor, ...
MiniBatchSize=miniBatchSize, ...
Plots="training-progress", ...
Metrics="rmse", ...
Verbose=0, ...
ExecutionEnvironment="parallel");
CNN_LSTM = trainnet(trainDataX, trainDataY, dlnetwork(lgraph),"mse",options);
Using MATLAB R2023b.

Accepted Answer

Milan Bansal on 23 Jul 2024
Hi Zowie Silva,
I understand that you are facing an error while using the trainnet function to train a sequence-to-sequence neural network.
Use analyzeNetwork to inspect the structure of your neural network: the model loses the "T" (time) dimension after the "gapool1d" layer, which is why it expects a numeric or categorical target instead of a sequence.
Your network uses globalAveragePooling1dLayer. As per its documentation, for time series and vector sequence input (data with three dimensions corresponding to the "C" (channel), "B" (batch), and "T" (time) dimensions), the layer pools over the "T" (time) dimension, so only the "C" and "B" dimensions remain after pooling.
If you instead wish to average-pool along the channel dimension, you can create a functionLayer and replace the "gapool1d" layer with it. The following code snippet shows how to modify your model:
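This dimension behavior can be checked directly; a minimal sketch (the shapes here are illustrative, not taken from the question):

```matlab
% Sketch: show that globalAveragePooling1dLayer removes the "T" dimension
net = dlnetwork([sequenceInputLayer(32) globalAveragePooling1dLayer]);
X = dlarray(rand(32,4,100),"CBT");  % 32 channels, 4 observations, 100 time steps
Y = predict(net,X);
dims(Y)  % "CB" per the documentation: the time dimension has been pooled away
```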
lgraph = layerGraph();
% Pool along the channel dimension instead of the time dimension
avgPoolChannel = functionLayer(@(X) mean(X,1),"Name","avgPoolChannel"); % defining function layer
tempLayers = [
sequenceInputLayer(4,"Name","input")
convolution1dLayer(4,32,"Name","conv1d","Padding","same")
avgPoolChannel]; % the function layer replaces globalAveragePooling1dLayer
lgraph = addLayers(lgraph,tempLayers);
tempLayers = lstmLayer(25,"Name","lstm");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = lstmLayer(25,"Name","lstm_1");
lgraph = addLayers(lgraph,tempLayers);
tempLayers = [
concatenationLayer(1,2,"Name","concat")
lstmLayer(55,"Name","lstm_2")
dropoutLayer(0.5,"Name","drop")
fullyConnectedLayer(1,"Name","fc")
sigmoidLayer("Name","sigmoid")];
lgraph = addLayers(lgraph,tempLayers);
% clean up helper variable
clear tempLayers;
lgraph = connectLayers(lgraph,"avgPoolChannel","lstm"); % connect to function layer
lgraph = connectLayers(lgraph,"avgPoolChannel","lstm_1"); % connect to function layer
lgraph = connectLayers(lgraph,"lstm","concat/in1");
lgraph = connectLayers(lgraph,"lstm_1","concat/in2");
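The modified graph can then be trained with the same trainnet call as in the question (a sketch; it assumes the trainDataX, trainDataY, and options variables from the question):

```matlab
% Rebuild the dlnetwork from the modified layer graph and train as before
net = dlnetwork(lgraph);
CNN_LSTM = trainnet(trainDataX, trainDataY, net, "mse", options);
```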
Please refer to the documentation for globalAveragePooling1dLayer and functionLayer to learn more.
Hope this helps!
  1 Comment
Zowie Silva on 24 Jul 2024
Hi Milan.
Firstly, thank you for your support and explanation!
I've tried your code and received the following error:
"Layer 'input': Invalid input data. Invalid size of channel dimension. Layer expects input with channel dimension size 4 but received input with size 36191.
Error in trainnet (line 91)
[net,info] = deep.internal.train.trainnet(mbq, net, loss, options, ..."
But to solve it I just changed the dimensions of the cells in trainDataX and trainDataY (to 36191x4 and 36191x1, respectively).
Again, thanks a lot for the help, this problem has been quite the headache.
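For reference, the corrected layout might look like this (dummy values):

```matlab
% 4 observations; time steps in rows, channels in columns
trainDataX = cell(4,1);
trainDataY = cell(4,1);
for i = 1:4
    trainDataX{i} = rand(36191,4);  % T-by-C predictors
    trainDataY{i} = rand(36191,1);  % T-by-1 targets
end
```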
