How to extract information from a 'true color' image and interpolate?

Good afternoon,
I would like to extract the information contained in a 'true color' image. Let me explain a little. The attached image ('17p.png') shows the vertical displacement, in millimeters, of a concrete beam at a certain magnitude of vertical load. The displacement associated with each color is shown in the scale bar located to the right of the image. This bar has horizontal black lines across it to define some values. When you zoom in around a black line, the color palette is not continuous (see ZoomInBar.png). I would like to know the displacement "field" of the colored area of the beam (i.e., the displacement of each colored pixel). Then I will compare the displacement of two points (two pixels) with the data registered by two displacement transducers (see image LVDT-Points.png). I will repeat this process for multiple images. My idea is:
(1) Open, extract the metadata and show the image
imagename = [num2str(i) 'p.png']; % the attached image is a PNG
[im,cmap] = imread(imagename); % cmap is empty for truecolor images
imshow(im); % display the image
immetadata = imfinfo(imagename); % metadata to check the type of image
imageinfo(immetadata) % open the Image Information Tool
(2) Manually select some points in the scale bar (ideally the point at the center between black lines), extract the RGB value, and associate a displacement with those values (manually). The scale is constant in all the images, so I will only do this once. For example, a point at the center of the second red box from top to bottom has a displacement of (1.52745+1.4263)/2 = 1.476875 mm.
% Select the points on the bar (at the center)
disp('Select n points on the scale, then press Enter');
[scx,scy] = ginput; % note: image indexing uses (row,col) = (y,x)
scx = int32(scx);
scy = int32(scy);
% Input the displacement for each color (average of the two values
% associated with the two adjacent black lines)
prompt = "Input the displacement values (n points, use [x1; x2; ...]): ";
imscale = input(prompt);
% Extract the RGB value of the n points. The first row (RGB color) of
% sc_rgb has a displacement value equal to the first entry of imscale.
sc_rgb = zeros(length(imscale),3,'like',im); % preallocate
for j = 1:length(imscale)
    sc_rgb(j,1) = im(scy(j),scx(j),1); % red (the image is 675x1219 = scy x scx)
    sc_rgb(j,2) = im(scy(j),scx(j),2); % green
    sc_rgb(j,3) = im(scy(j),scx(j),3); % blue
end
(3) Select two points in the colored area of the beam (indicated in the LVDT-Points.png image) to extract their RGB values, and then calculate the displacement by interpolating from the color-to-displacement data determined in step (2). Since the image is 'true color', the interpolation must use the same palette. My first idea was to calculate the displacement of only those two points; however, it would be great if I could get the entire displacement field of the whole colored area.
% Select the points of interest
[x,y] = ginput;
px = int32(x);
py = int32(y);
% Extract the RGB value of the points.
p_rgb = zeros(length(px),3,'like',im); % preallocate
for k = 1:length(px)
    p_rgb(k,1) = im(py(k),px(k),1); % red (note (row,col) = (py,px), not (scy,scx))
    p_rgb(k,2) = im(py(k),px(k),2); % green
    p_rgb(k,3) = im(py(k),px(k),3); % blue
end
% Interpolate to calculate the displacement
% I do not know how to do this step.
I don't know how to interpolate an RGB value (three values: red, green, blue) against displacement (one value). I converted RGB to grayscale (inten = 0.299*sc_rgb(1,1) + 0.587*sc_rgb(1,2) + 0.114*sc_rgb(1,3)), but the dark red and dark blue (top and bottom of the scale bar) both look like black.
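One naive alternative to a grayscale conversion (just a sketch, using the sc_rgb/imscale samples from step (2) and the p_rgb points from step (3)) is a nearest-color lookup in RGB space:

```matlab
% nearest-color lookup sketch: assign each query point the displacement
% of the closest sampled scale color (Euclidean distance in RGB space)
d = sum((double(sc_rgb) - double(p_rgb(1,:))).^2, 2); % squared distance to each scale color
[~,idx] = min(d);                                     % index of the closest scale sample
displacement = imscale(idx);                          % its associated displacement value
```

This only snaps to the sampled colors rather than interpolating between them, and (as the answer below me explains) it can be badly fooled when a semitransparent overlay shifts the colors.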
It would be great to have the data associated with this image, but unfortunately I only have the images.
Thank you very much for your kind help.
All the best,
Eric

1 Comment

This is going to be a problem unless you know how the original pseudocolor image was created.
There are a few things we would ideally want to know in order to know what the displacement is based on the pseudocolored region in the image:
  • the applied colormap
  • the original grayscale image without the overlay
  • the overlay opacity
That said, we might be able to cut some corners if we knew the applied colormap accurately (it appears to just be jet()). We might be able to just use hue as a proxy for the index value, but chances are that it wouldn't be sufficiently accurate.


 Accepted Answer

As I said, the given image is the weighted sum of two things:
% compositeimage = alpha*pseudocolorimage + (1-alpha)*grayscaleimage;
In this case, we know compositeimage, and we know the set of colors from which pseudocolorimage was created, but we don't know pseudocolorimage itself. We don't know alpha or grayscaleimage (unless you know otherwise). We need to know pseudocolorimage in order to estimate the original data.
That leaves us with few options. We can try to use color information (specifically hue) on the assumption that it will be unaffected by the composition so long as the grayscale background image is neutral.
% read the image
inpict = imread('https://www.mathworks.com/matlabcentral/answers/uploaded_files/1517356/17p.png');
% crop out the rough area of interest
inpict = imcrop(inpict,[104 294 993 239]);
% get the hue of the applied colormap
maplen = 256;
CTref = jet(maplen); % the map does appear to be jet()
[Href,~,~] = rgb2hsv(CTref);
% function is not unique-valued (this is problematic)
% but luckily, most of the image stays away from those hues
plot(Href)
xlabel('index')
ylabel('hue')
% get the hue of the pseudocolor image region
[H,S,~] = rgb2hsv(inpict);
% convert the hue image into data-scale
indpict = rgb2ind(repmat(H,[1 1 3]),repmat(Href,[1 3]));
datarange = [0.0102 1.6286];
idxrange = [0 length(Href)-1];
%displacement = imrescale(indpict,idxrange,datarange); % nice and concise
displacement = rescale(indpict,datarange(1),datarange(2),... % ugh
'inputmin',idxrange(1),'inputmax',idxrange(2));
% find gray (unmapped) regions
% hue data will be invalid in these areas anyway
mask = S > 0.01;
% display it
hi = imagesc(displacement,datarange);
hi.AlphaData = mask; % hide the junk unmapped data
axis image
colorbar('location','southoutside')
colormap(jet)
As you can see, this has a couple problems. First, the applied colormap (jet()) is constant-valued on hue near its ends. Any parts of the image in those regions will be ambiguous. We could get an idea of where those ambiguous regions are and how much of a problem they actually pose. It's only a relatively small portion of the image.
Hreflim = [32 224]/256*size(Href,1); % manually picked
ambiguousregion = indpict<Hreflim(1) | indpict>Hreflim(2);
ambiguousregion = ambiguousregion & mask;
imshow(ambiguousregion)
Second, the image hue (after compression) is not entirely independent of the grayscale background. That can be seen as error in the estimated displacement image.

7 Comments

Hi DGM, many thanks for your prompt answer and thorough explanation. Two thumbs up!!! I looked over your comments. First, I tried to attach the original image (TIFF format), but it is too large (58.6 MB). Is there a way I can share it with you? Google Drive or email, maybe? As for the other two pieces of information you request, I am afraid I don't know them.
Regarding your explanation, I totally get the idea you are following. Correct me if I am wrong, but you are rescaling the colored section of the image using a color map based on 256 colors and assigning a displacement value to each pixel. The resulting image looks blurry for the reason you mentioned. The little dots you see in the image are black circles used in the test. If I followed you, with the original image you could subtract those black circles from the colored image to get the "real" displacement (the rainbow rectangle area) and hopefully a smooth result. I don't know how to do that, though. Is there a way I can share the grayscale (alpha) image? Many thanks for your help.
Cheers,
Eric
If we knew alpha (a scalar) and the original grayscale image, then we could calculate
pseudocolorimage = (compositeimage + grayscaleimage*(alpha - 1))/alpha;
(assuming they're all appropriately scaled floats, of course)
That should give us a plain opaque pseudocolor image without any background content. In reality, there may still be some artifacting from the JPG, but if you were working from TIFFs, that might not be an issue.
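If those inputs were available, the inversion above would only be a few lines. This is just a sketch; the background image and alpha value here are hypothetical placeholders, since we don't actually have them:

```matlab
% hypothetical recovery of the opaque pseudocolor image
composite  = im2double(imread('17p.png'));  % the blended image we do have
background = im2double(grayscaletiff);      % hypothetical: the original grayscale image
alpha      = 0.5;                           % hypothetical: the overlay opacity
pseudocolor = (composite + background*(alpha - 1))/alpha;
pseudocolor = min(max(pseudocolor,0),1);    % clamp rounding/JPG artifacts back into [0,1]
```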
In lieu of those missing bits of the original composition, you can try to process your TIFF using the example code to see if it's any different, but I don't know that I could do much more with it alone.
One thing to consider is that it's a possibility that what I'm attributing to hue error might actually be part of the original pseudocolor displacement map. I don't know how that map was created, so maybe that process itself is sensitive to variations in image value.
Other than that, I'm not sure of a better way to estimate the original data from a composite image like this. This is kind of a fundamental problem with making semitransparent pseudocolor visualizations of data. It's very convenient to have the data colocated on the photograph, but even by eye, the background content is a detriment to readability.
As to the data rescaling, we know visually that Y = [0.0102 1.6286] corresponds to the ends of the applied colormap. We can exploit rgb2ind() to find the closest match in the reference color table, and the result is an array of indices in the range [0 length(Href)-1]. I suppose I should fix the way I did it in the answer... Either way, we can use rescale() or imrescale() to do the simple linear rescaling from map indices to displacement values.
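As a small sketch of that rescaling step (the index values here are made up; the ranges are the ones from the answer):

```matlab
% map color-table indices linearly onto the displacement range
indpict   = [0 127 255];       % example index values as returned by rgb2ind()
datarange = [0.0102 1.6286];   % colorbar extents read off the image
idxrange  = [0 255];           % index range for a 256-color map
displacement = rescale(double(indpict), datarange(1), datarange(2), ...
    'InputMin', idxrange(1), 'InputMax', idxrange(2));
```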
DGM, thanks for all your help and guidelines. Indeed, the hue error you mentioned is actually part of the displacement field of the beam. If you look at the image, a support was placed below each of those areas, preventing the beam from moving, so the results you got seem right. I'll try to process the TIFF file as you suggested and ask for the original data to get the original color map.
Cheers,
Hello DGM,
If you have some time, I would like to ask a follow-up question related to this issue. The script you provided worked wonderfully until I found an image with dark red as the maximum displacement value. This means the complete red band is used in the displacement map (see Fig1.png). The processed image shows one single red color (the same hue value, see Fig2.png), which ends up as a single displacement value (see Fig3.png). Is there a way to extract the displacement values of the different reds? (Use a different color map, or "cut" the HSV color map to avoid the two red bands at its ends?) I don't know if this is related to the problems you mentioned in your original answer.
Many thanks for your help,
Cheers,
Erick
Oof. Well, we kind of knew this would be a problem. I came up with a workaround, but it's not pretty. There's probably a better way, but this is what I have.
Disregarding the particular behavior of jet() for a moment, we need some color component which is unaffected by a scalar alpha composition in sRGB. Ignoring extreme cases, H would normally work, but obviously brightness components (e.g., V, L, Y) will be affected by the BG content. Similarly, HSV/HSL S will be affected by anything which affects brightness. On the other hand, chroma should be brightness-invariant. So long as the alpha is constant over the image (true for a scalar alpha), we shouldn't have to worry about the offset caused by the composition, so long as C is normalized consistently.
We know that jet() is constant-valued on H near its ends. It just so happens that it's linear on C in those same regions. We can create a composite function of H and C which will yield a strictly monotonic behavior over the jet() map. Once that's possible, we can use that function value instead of H to do the mapping.
% read the image
inpict = imread('https://www.mathworks.com/matlabcentral/answers/uploaded_files/1536480/Fig1.png');
% crop out the rough area of interest
inpict = imcrop(inpict,[39 193 1002 276]);
% get the components of the applied colormap
maplen = 256;
CTref = jet(maplen); % the map does appear to be jet()
[Href,~,Vref] = rgb2hsv(CTref);
Cref = Vref - min(CTref,[],2); % HSV chroma
Cref = mat2gray(Cref); % normalize
% create some composite function of H and C
% which is overall monotonic decreasing over the map length
Fref = generatemetric(Href,Cref);
plot(Fref)
xlabel('index')
ylabel('Fref')
% get the components of the pseudocolor image region
[H,S,V] = rgb2hsv(inpict);
C = V - min(im2double(inpict),[],3); % HSV chroma
C = mat2gray(C); % normalize
% convert to our composite metric as before
F = generatemetric(H,C);
% try just using direct interpolation instead of dealing with rgb2ind()
datarange = [0.128 16.49]; % from original colorbar extents
Dref = linspace(datarange(1),datarange(2),size(Fref,1)).';
displacement = interp1(Fref,Dref,F);
% find gray (unmapped) regions
% hue data will be invalid in these areas anyway
mask = S > 0.01;
% display it
hi = imagesc(displacement,datarange);
hi.AlphaData = mask; % hide the junk unmapped data
axis image
colorbar('location','southoutside')
colormap(jet)
% rather than doing logical masking to create a strictly piecewise function
% combine H and C as a weighted sum, using a function of H as a weighting array
% this allows the transitions to be eased, unlike a logical masking approach
% the particular offset values are somewhat arbitrary
function F = generatemetric(H,C)
% normalize H
H = mat2gray(H,[0 2/3]);
% tol must be significantly <0.37 in order to
% exclude contributions from the central peak of C
tol = 0.25;
% eased composition at bottom of map (blue)
alph = max((H - 1 + tol)/tol,0);
F = alph.*(2.5-C) + (1-alph).*H;
% eased composition at top of map (red)
alph = 1 - min(H/tol,1);
F = alph.*(C-1.5) + (1-alph).*F;
% normalize output
F = mat2gray(F,[-1.5 2.5]);
end
Bear in mind that there's still the annotation mark on the image.
Note also that I'm not using rgb2ind() here. Using interp1() is probably going to be a lot more accurate here. I really should've done that in the first place.
Hi, thank you for the updated script. It works as intended! I agree, linear interpolation seems more reasonable in this case.
Thanks, you rock!!
Yeah, I'm used to these pseudocolor-to-data estimation problems being presented in a way where using rgb2ind() is appropriate because we're essentially trying to do 3D lookup. This problem is fairly unique in that we were forced to reduce it to 1D, but I just kept using rgb2ind() out of habit.


More Answers (0)

Asked: on 21 Oct 2023

Commented: DGM on 14 Nov 2023
