Poor results for neural-network based image segmentation

I am trying to segment images of animal embryos, like this one:
I would like to extract the entire oval-shaped embryo without getting any of the background or the appendages that stick off of the embryo, like in this hand-drawn mask:
I have around 350 training images of embryos that have been hand-segmented like this one, and I have trained a small convolutional neural network to try to segment these images automatically. The network has this structure:
numFilters = 64;
filterSize = 3;
numClasses = 2;

opts = trainingOptions('sgdm', ...
    'InitialLearnRate',1e-3, ...
    'MaxEpochs',5, ...
    'MiniBatchSize',4);

layers = [
    imageInputLayer([500 1000 1])
    convolution2dLayer(filterSize,numFilters,'Padding','same')
    reluLayer()
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(filterSize,numFilters,'Padding','same')
    reluLayer()
    transposedConv2dLayer(4,numFilters,'Stride',2,'Cropping','same')
    convolution2dLayer(1,numClasses)
    softmaxLayer()
    pixelClassificationLayer()
    ];
Training the network with the settings above leads to an accuracy of around 94%, but when I actually look at its performance on the training images, it is not doing a good job of removing the appendages:
This problem persists for most of the images in the training set, and I haven't even tested it on validation data because it's performing so poorly on the training set. I can't manually chop off the appendages via image erosion, because the angle and length of the appendages change, so I would need to set the image erosion parameters manually for each image, and I have hundreds of thousands of images.
What can I do to improve performance of the pixel classification network?
Thank you!

 Accepted Answer

"I can't manually chop off the appendages via image erosion"
bwlalphaclose, from this FEX download (bwlalphashape), may help:
A=imread('https://www.mathworks.com/matlabcentral/answers/uploaded_files/1514819/image.png');
A=im2gray(A);
B=imfill(A>135,'holes');          % threshold and fill interior holes
mask=~bwlalphaclose(~B,45);       % alpha-close the background to trim the appendages
imshow(imfuse(A,mask,'falsecolor'))
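If the fixed intensity threshold (135) doesn't hold up across all of your images, an automatically selected Otsu threshold may be a more robust starting point. A sketch, not part of the answer above; the filename is a placeholder, and you may need `A < 255*level` instead of `A > 255*level` depending on whether the embryo is darker or brighter than the background:

```matlab
% Sketch: pick the binarization threshold per image with Otsu's method
% instead of hard-coding it. 'embryo.png' is a placeholder filename.
A = imread('embryo.png');
A = im2gray(A);
level = graythresh(A);               % Otsu threshold, normalized to [0,1]
B = imfill(A > 255*level, 'holes');  % flip the comparison if the embryo is dark
imshow(B)
```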

5 Comments

Thank you for your answer! This is almost there, but the angles where green meets pink are slightly off, which results in sharp corners in the mask. I will keep tinkering with the FEX download you mentioned, but I don't really think any combination of built-in MATLAB functions will be able to get the smooth mask that I showed above without hand-tuning function parameters for each separate image, which is not feasible for me.
But the stem will always be in the upper left, correct? And about that width?
The stem is always in the upper left corner, correct. Here are some other example images. The widths are similar, but the angle the appendages make with the embryo, and the curvature of the embryo in that region, vary:
Here's a smoother version:
Images=["https://www.mathworks.com/matlabcentral/answers/uploaded_files/1514934/image.png",
"https://www.mathworks.com/matlabcentral/answers/uploaded_files/1514939/image.png",
"https://www.mathworks.com/matlabcentral/answers/uploaded_files/1514944/image.png",
"https://www.mathworks.com/matlabcentral/answers/uploaded_files/1514949/image.png"];
for i=1:numel(Images)
figure
getMask(Images{i});
end
function mask=getMask(Image)
    A=imread(Image);
    A=im2gray(A);
    [m,n]=size(A);
    B=bwareafilt(imfill(A<220,'holes'),1);   % threshold, fill holes, keep largest blob
    C=bwareafilt(~bwlalphaclose(~B,40),1);   % alpha-close the background to trim the stem
    b=bwboundaries(C); b=fliplr(b{1});       % boundary as (x,y) coordinates
    b=sgolayfilt(b,3,231,[],1);              % Savitzky-Golay smoothing of the boundary
    mask=poly2mask(b(:,1),b(:,2),m,n);
    mask=imerode(mask,strel('disk',21));     % erode then dilate to round off corners
    mask=imdilate(mask,strel('disk',22));
    imshow(labeloverlay(A,mask,'Transparency',0.85,'Color','spring'));
end
That looks amazing! Thank you, that solves my problem.


More Answers (1)

Maybe increase the fitting capacity of the network. Add another encoder/decoder layer and/or increase the filter size?
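For concreteness, one extra encoder/decoder stage grafted onto the original layer stack might look like this. This is a sketch, untested on your data; doubling the filter count at the deeper stage is a common convention, not a requirement:

```matlab
% Sketch: original network with a second downsampling/upsampling stage added.
numFilters = 64;
filterSize = 3;
numClasses = 2;
layers = [
    imageInputLayer([500 1000 1])
    convolution2dLayer(filterSize,numFilters,'Padding','same')
    reluLayer()
    maxPooling2dLayer(2,'Stride',2)                            % 500x1000 -> 250x500
    convolution2dLayer(filterSize,2*numFilters,'Padding','same')
    reluLayer()
    maxPooling2dLayer(2,'Stride',2)                            % 250x500 -> 125x250
    convolution2dLayer(filterSize,2*numFilters,'Padding','same')
    reluLayer()
    transposedConv2dLayer(4,2*numFilters,'Stride',2,'Cropping','same')  % back to 250x500
    reluLayer()
    transposedConv2dLayer(4,numFilters,'Stride',2,'Cropping','same')    % back to 500x1000
    reluLayer()
    convolution2dLayer(1,numClasses)
    softmaxLayer()
    pixelClassificationLayer()
    ];
```

If you have the Computer Vision Toolbox, unetLayers([500 1000 1],2) builds a complete U-Net with skip connections, which will usually outperform a hand-rolled stack like this.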


Asked: 18 Oct 2023
Edited: 19 Oct 2023
