GAN: "The differentiable must be a traced dlarray scalar" error

Hi,
I am implementing a Generative Adversarial Network on a 271x30x3 .mat dataset. I am getting an error from the dlgradient function while trying to train the network:
"The differentiable must be a traced dlarray scalar."
I am attaching the code here. Please help me.
augmenter = imageDataAugmenter( ...
    'RandXReflection',true, ...
    'RandScale',[1 2]);
augimds = augmentedImageDatastore([271 30],imds,'DataAugmentation',augmenter);
filterSize = [5 3];
numFilters = 64;
numLatentInputs = 100;
layersGenerator = [
    imageInputLayer([12 1 numLatentInputs],'Normalization','none','Name','in')
    transposedConv2dLayer(filterSize,8*numFilters,'Name','tconv1')
    batchNormalizationLayer('Name','bn1')
    reluLayer('Name','relu1')
    transposedConv2dLayer(filterSize,4*numFilters,'Stride',2,'Cropping',1,'Name','tconv2')
    batchNormalizationLayer('Name','bn2')
    reluLayer('Name','relu2')
    transposedConv2dLayer(filterSize,2*numFilters,'Stride',2,'Cropping',1,'Name','tconv3')
    batchNormalizationLayer('Name','bn3')
    reluLayer('Name','relu3')
    transposedConv2dLayer([5 2],numFilters,'Stride',2,'Cropping',1,'Name','tconv4')
    batchNormalizationLayer('Name','bn4')
    reluLayer('Name','relu4')
    transposedConv2dLayer([5 2],3,'Stride',2,'Cropping',1,'Name','tconv5')
    tanhLayer('Name','tanh')
    ];
lgraphGenerator = layerGraph(layersGenerator);
dlnetGenerator = dlnetwork(lgraphGenerator);
scale = 0.2;
dropoutProb = 0.5;
layersDiscriminator = [
    imageInputLayer([271 30 3],'Normalization','none','Name','in')
    convolution2dLayer(filterSize,numFilters,'Stride',3,'Padding',1,'Name','conv1')
    reluLayer('Name','lrelu1')
    convolution2dLayer(filterSize,2*numFilters,'Stride',3,'Padding',1,'Name','conv2')
    batchNormalizationLayer('Name','bn2')
    reluLayer('Name','lrelu2')
    maxPooling2dLayer(2,'Stride',1,'Padding',1,'Name','mp1')
    convolution2dLayer(filterSize,4*numFilters,'Stride',3,'Padding',1,'Name','conv3')
    batchNormalizationLayer('Name','bn3')
    reluLayer('Name','lrelu3')
    maxPooling2dLayer(2,'Stride',1,'Padding',1,'Name','mp2')
    batchNormalizationLayer('Name','bn4')
    reluLayer('Name','lrelu4')
    ];
lgraphDiscriminator = layerGraph(layersDiscriminator);
dlnetDiscriminator = dlnetwork(lgraphDiscriminator);
figure
subplot(1,2,1);
plot(lgraphGenerator);
title("Generator");
subplot(1,2,2);
plot(lgraphDiscriminator);
title("Discriminator");
numEpochs = 1000;
miniBatchSize = 25;
augimds.MiniBatchSize = miniBatchSize;
validationFrequency = 100; % used below when displaying generated validation images
flipFactor = 0.3;
learnRateGenerator = 0.0002;
learnRateDiscriminator = 0.0001;
trailingAvgGenerator = [];
trailingAvgSqGenerator = [];
trailingAvgDiscriminator = [];
trailingAvgSqDiscriminator = [];
gradientDecayFactor = 0.5;
squaredGradientDecayFactor = 0.999;
ZValidation = randn(1,1,numLatentInputs,64,'single');
dlZValidation = dlarray(ZValidation,'SSCB');
f = figure;
f.Position(3) = 2*f.Position(3);
imageAxes = subplot(1,2,1);
scoreAxes = subplot(1,2,2);
lineScoreGenerator = animatedline(scoreAxes,'Color',[0 0.447 0.741]);
lineScoreDiscriminator = animatedline(scoreAxes, 'Color', [0.85 0.325 0.098]);
legend('Generator','Discriminator');
ylim([0 1])
xlabel("Iteration")
ylabel("Score")
grid on
iteration = 0;
start = tic;
% Loop over epochs.
for i = 1:numEpochs
    % Reset and shuffle the datastore.
    reset(augimds);
    augimds = shuffle(augimds);

    % Loop over mini-batches.
    while hasdata(augimds)
        iteration = iteration + 1;

        % Read a mini-batch of data.
        data = read(augimds);

        % Ignore the last partial mini-batch of the epoch.
        if size(data,1) < miniBatchSize
            continue
        end

        % Concatenate the mini-batch of data and generate latent inputs
        % for the generator network.
        X = cat(4,data{:,1}{:});
        Z = randn(12,1,numLatentInputs,size(X,4),'single');

        % Rescale the images in the range [-1 1].
        X = (single(X)/255)*2 - 1;

        % Convert the mini-batch of data to dlarray and specify the
        % dimension labels 'SSCB' (spatial, spatial, channel, batch).
        dlX = dlarray(X,'SSCB');
        dlZ = dlarray(Z,'SSCB');

        % Evaluate the model gradients and the generator state using
        % dlfeval and the modelGradients function listed at the end of
        % the example.
        [gradientsGenerator, gradientsDiscriminator, stateGenerator, scoreGenerator, scoreDiscriminator] = ...
            dlfeval(@modelGradients, dlnetGenerator, dlnetDiscriminator, dlX, dlZ, flipFactor);
        dlnetGenerator.State = stateGenerator;

        % Update the discriminator network parameters.
        [dlnetDiscriminator.Learnables,trailingAvgDiscriminator,trailingAvgSqDiscriminator] = ...
            adamupdate(dlnetDiscriminator.Learnables, gradientsDiscriminator, ...
            trailingAvgDiscriminator, trailingAvgSqDiscriminator, iteration, ...
            learnRateDiscriminator, gradientDecayFactor, squaredGradientDecayFactor);

        % Update the generator network parameters.
        [dlnetGenerator.Learnables,trailingAvgGenerator,trailingAvgSqGenerator] = ...
            adamupdate(dlnetGenerator.Learnables, gradientsGenerator, ...
            trailingAvgGenerator, trailingAvgSqGenerator, iteration, ...
            learnRateGenerator, gradientDecayFactor, squaredGradientDecayFactor);

        % Every validationFrequency iterations, display a batch of
        % generated images using the held-out generator input.
        if mod(iteration,validationFrequency) == 0 || iteration == 1
            % Generate images using the held-out generator input.
            dlXGeneratedValidation = predict(dlnetGenerator,dlZValidation);

            % Tile and rescale the images in the range [0 1].
            I = imtile(extractdata(dlXGeneratedValidation));
            I = rescale(I);

            % Display the images.
            subplot(1,2,1);
            image(imageAxes,I)
            xticklabels([]);
            yticklabels([]);
            title("Generated Images");
        end

        % Update the scores plot.
        subplot(1,2,2)
        addpoints(lineScoreGenerator,iteration, ...
            double(gather(extractdata(scoreGenerator))));
        addpoints(lineScoreDiscriminator,iteration, ...
            double(gather(extractdata(scoreDiscriminator))));

        % Update the title with training progress information.
        D = duration(0,0,toc(start),'Format','hh:mm:ss');
        title( ...
            "Epoch: " + i + ", " + ...
            "Iteration: " + iteration + ", " + ...
            "Elapsed: " + string(D))
        drawnow
    end
end
ZNew = randn(1,1,numLatentInputs,16,'single');
dlZNew = dlarray(ZNew,'SSCB');
dlXGeneratedNew = predict(dlnetGenerator,dlZNew);
I = imtile(extractdata(dlXGeneratedNew));
I = rescale(I);
figure
image(I)
axis off
title("Generated Images")
function [gradientsGenerator, gradientsDiscriminator, stateGenerator, scoreGenerator, scoreDiscriminator] = ...
    modelGradients(dlnetGenerator, dlnetDiscriminator, dlX, dlZ, flipFactor)

% Calculate the predictions for real data with the discriminator network.
dlYPred = forward(dlnetDiscriminator, dlX);

% Calculate the predictions for generated data with the discriminator network.
[dlXGenerated,stateGenerator] = forward(dlnetGenerator,dlZ);
dlYPredGenerated = forward(dlnetDiscriminator, dlXGenerated);

% Convert the discriminator outputs to probabilities.
probGenerated = sigmoid(dlYPredGenerated);
probReal = sigmoid(dlYPred);

% Calculate the score of the discriminator.
scoreDiscriminator = (mean(probReal) + mean(1-probGenerated))/2;

% Calculate the score of the generator.
scoreGenerator = mean(probGenerated);

% Randomly flip a fraction of the labels of the real images.
numObservations = size(probReal,4);
idx = randperm(numObservations,floor(flipFactor * numObservations));

% Flip the labels.
probReal(:,:,:,idx) = 1 - probReal(:,:,:,idx);

% Calculate the GAN loss.
lossGenerator = -mean(log(probGenerated));
lossDiscriminator = -mean(log(probReal)) - mean(log(1-probGenerated));

% For each network, calculate the gradients with respect to the loss.
gradientsDiscriminator = dlgradient(lossDiscriminator, dlnetDiscriminator.Learnables);
gradientsGenerator = dlgradient(lossGenerator, dlnetGenerator.Learnables,'RetainData',true);
end
function I = matRead(digitDatasetPath)
% Load the data and get the matrix from the structure.
Inp = load(digitDatasetPath);
f = fields(Inp);
I = Inp.(f{1});
end

Answers (1)

Gayathri on 9 May 2025
Edited: Gayathri on 9 May 2025
To resolve the error, call the "dlgradient" function from a separate function invoked with "dlfeval" inside "modelGradients", as shown below. Do not use the "dlfeval" function to invoke "modelGradients" itself. Note that the quantity passed to "dlgradient" must be a scalar traced dlarray, which is what the error message is referring to.
function [gradientsGenerator, gradientsDiscriminator, stateGenerator, scoreGenerator, scoreDiscriminator] = ...
%%all the code till here as it is, just illustrating the main part%%
lossGenerator = -mean(log(probGenerated));
lossDiscriminator = -mean(log(probReal)) - mean(log(1-probGenerated));
% For each network, calculate the gradients with respect to the loss.
[gradientsDiscriminator, gradientsGenerator] = dlfeval(@firstOrderGradients, ...
    lossDiscriminator, lossGenerator, dlnetDiscriminator.Learnables, dlnetGenerator.Learnables);
end

function [gradientsDiscriminator, gradientsGenerator] = firstOrderGradients(lossDiscriminator, lossGenerator, discLearnables, genLearnables)
gradientsDiscriminator = dlgradient(lossDiscriminator, discLearnables);
gradientsGenerator = dlgradient(lossGenerator, genLearnables,'RetainData',true);
end
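As a minimal, self-contained sketch of this pattern (the names "exampleGradients", "x", "y", and "loss" here are illustrative, not from the code above): "dlgradient" requires the quantity being differentiated to be a scalar traced dlarray, so the loss is reduced over all elements with mean(...,'all') before differentiation.

```matlab
% Minimal sketch of the dlfeval/dlgradient pattern (illustrative names).
x = dlarray(rand(4,4,'single'));     % data to differentiate with respect to
grad = dlfeval(@exampleGradients,x); % tracing only happens inside dlfeval

function grad = exampleGradients(x)
    y = x.^2;
    loss = mean(y,'all');            % reduce to a scalar; dlgradient requires it
    grad = dlgradient(loss,x);       % gradient of the scalar loss w.r.t. x
end
```

Applying the same idea to the code above would mean reducing lossGenerator and lossDiscriminator over all elements (for example with mean(...,'all')) so that each is a scalar before it reaches "dlgradient"; a plain mean() of a multi-dimensional dlarray averages along the first dimension only and leaves a non-scalar result.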
For more information, refer to the documentation for the "dlfeval" function.
Hope this resolves the issue!
