Camera characterization gives "symmetrical results"?

I downloaded the code from :
I modified the code for my purposes. Please find attached all the elements needed to run the script on your end. What I don't understand is why the final resulting image looks exactly like the starting image. I understand how the third-degree polynomial is calculated, but there must be something I'm missing in applying the M3 matrix back to my original image.
Any help is appreciated.
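For context, here is a sketch of how this kind of polynomial characterization is typically set up. The variable names `trgb` and `txyz` are assumptions echoing the `trgb3`/`txyz` names in the script, and the exact list of polynomial terms in the original code may differ:

```matlab
% Sketch (assumed setup): trgb is N x 3 camera RGB for the training patches,
% txyz is N x 3 measured XYZ for the same patches.
R = trgb(:,1); G = trgb(:,2); B = trgb(:,3);
% One common third-degree polynomial expansion of each RGB triplet:
trgb3 = [R G B R.*G R.*B G.*B R.^2 G.^2 B.^2 ...
         R.^3 G.^3 B.^3 R.*G.*B];      % N x 13 term matrix
M3 = txyz' / trgb3';                   % 3 x 13 matrix, least-squares fit
pxyz3 = (M3 * trgb3')';                % predicted XYZ for the training patches
```

The key point is that `M3` maps the *expanded* polynomial terms to XYZ, not the raw RGB triplets, so any image run through `M3` has to be expanded with exactly the same terms in exactly the same order.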

2 Comments

I am no mathematician, but I reasoned that once the M3 matrix is calculated and the camera responses have been converted to XYZ, why not "peek" at the XYZ values before the Lab / DeltaE color differences are computed? So that's what I did in the following: I copied the XYZ values into a temporary variable, then converted them to sRGB for display on my monitor:
M3 = txyz' / trgb3';                   % least-squares fit: XYZ = M3 * polynomial terms
pxyz3 = (M3 * trgb3')';                % predicted XYZ for the training patches
pxyz3(pxyz3 < 0) = 0;                  % clip negative XYZ values
Temp = pxyz3;                          % copy of the XYZ values to "peek" at
Original = xyz2srgb_D50(Temp);         % convert XYZ (D50) to sRGB for display
Original255 = Original * 255;          % 0-255 version (not used below)
Original_img = reshape(Original, 14, 10, 3);  % 140 patches -> 14 x 10 chart
resized_original = imresize(Original_img, [14*200, 10*200], 'nearest');  % enlarge without interpolation
rotated_image = rot90(resized_original);      % orient the chart
flipped_vertical = flipud(rotated_image);
imshow(flipped_vertical);
The result is "perfect":
Still, I'm unclear as to what I'm doing that does not give me these results when I process the image through the same M3 matrix.
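Applying M3 to a full image means expanding every pixel with the same polynomial terms, in the same order, as were used for training; a mismatch there would explain images that come out wrong. A sketch, assuming `img` is an H x W x 3 double image in [0, 1], `M3` is the fitted matrix, the term list matches the training expansion, and `xyz2srgb_D50` is the same helper used above:

```matlab
% Sketch: apply the characterization pixel-wise to a whole image.
[H, W, ~] = size(img);
rgb = reshape(img, [], 3);             % (H*W) x 3 list of pixels
R = rgb(:,1); G = rgb(:,2); B = rgb(:,3);
rgb3 = [R G B R.*G R.*B G.*B R.^2 G.^2 B.^2 ...
        R.^3 G.^3 B.^3 R.*G.*B];       % same terms, same order as training
xyz = (M3 * rgb3')';                   % predicted XYZ per pixel
xyz(xyz < 0) = 0;                      % clip negatives
out = reshape(xyz2srgb_D50(xyz), H, W, 3);  % back to image form
imshow(out);
```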
Well, ... I'll be damned. I went back to redo the RGB "patch extraction". I started from the very same RGB image I used for "validation" (DSC_0331 Assemblé Redressé 5x3.png). I ran the image through the 'patchmask' m-file. I had to massage the data in Excel to get it into 140x3 form. Then I executed the script exactly as before, using this data. And, lo and behold, it looks like I'm getting the expected result this time. Here is a screen capture of the final (beautiful!) "matched" image:
The results are not "perfect" when I inspect the row-5 'neutral patches', but it's mighty encouraging.
Now the question is "Can I replicate this experiment?", just to be sure before I brag to my Photoshop students...
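One way to put a number on "not perfect" is to compute a per-patch DeltaE between the measured and predicted values. A sketch using CIE76, assuming `txyz` (measured) and `pxyz3` (predicted) are N x 3 XYZ relative to D50, and that the Image Processing Toolbox's `xyz2lab` is available:

```matlab
% Sketch: per-patch DeltaE (CIE76) between measured and predicted colors.
lab_ref  = xyz2lab(txyz,  'WhitePoint', 'd50');   % Image Processing Toolbox
lab_pred = xyz2lab(pxyz3, 'WhitePoint', 'd50');
dE = sqrt(sum((lab_ref - lab_pred).^2, 2));       % CIE76 distance per patch
fprintf('mean dE = %.2f, max dE = %.2f\n', mean(dE), max(dE));
```

Inspecting `dE` for the neutral-patch rows would show exactly how far off they are.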


Answers (1)

Looks like there was a discrepancy between the RGB data used to train the model and the RGB data used to test the model. So much tedious work at this stage... I was trying to avoid recreating everything from scratch. I was confident my data was "right", and it's only after umpteen visits to ChatGPT and Twitter Grok, and after scraping the web for possible references, that I decided in extremis to bite the bullet and painstakingly recreate the original training data set. Hope my mistake helps someone in the future.
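A discrepancy like this can be caught early by diffing the two RGB tables directly before training. A sketch, where `trgb_train` and `trgb_test` are hypothetical names for the two 140x3 tables:

```matlab
% Sketch: sanity-check that training and validation RGB tables agree.
% trgb_train and trgb_test (both 140 x 3) are hypothetical variable names.
d = abs(trgb_train - trgb_test);
fprintf('max abs difference: %g\n', max(d(:)));
[row, ~] = find(d == max(d(:)), 1);               % locate the worst patch
fprintf('largest mismatch at patch %d\n', row);
```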

Release

R2023a

Asked:

on 25 Jan 2025

Answered:

on 27 Jan 2025
