Pages: « 1 [2] 3 4 5 »
Author Topic: More Color Checker questions WRT accurate color  (Read 16408 times)
Tim Lookingbill
Sr. Member
Posts: 1053
« Reply #20 on: November 26, 2011, 06:45:27 PM »

Wonder what light source they used to develop and define the "Standard Observer" when coming up with the Lab color model?

Was that D50? And how bright was it? 100cd/m2? 120 cd/m2?

Does the "Standard Observer" and Lab color space model know how bright my monitor and ambient light are?

I really don't see how exact, precise numbers can be maintained with all the variances involved in reproducing an image, but considering the billions of images online, we seem to do quite well.
Logged
MarkM
Full Member
Posts: 230
« Reply #21 on: November 26, 2011, 07:04:30 PM »

Quote
Wonder what light source they used to develop and define the "Standard Observer" when coming up with the Lab color model?

Was that D50? And how bright was it? 100cd/m2? 120 cd/m2?

Does the "Standard Observer" and Lab color space model know how bright my monitor and ambient light are?

The CIE D illuminants like D50 did not yet exist when the standard observer color matching functions were made.

The standard observer data is based on two sets of experiments. One from John Guild and one from David Wright. Guild's experiments matched monochromatic sources to trichromatic primaries from filtered tungsten light (2900K). Wright's experiments matched the monochromatic sources to monochromatic primaries at 650, 530, and 460 nanometers. Since they are comparing emissive sources, the illuminant is not really relevant.

How the CIE went from Guild & Wright's data to 'modern' colorimetry is an interesting story which can be read in this PDF: http://www.cis.rit.edu/research/mcsl2/research/broadbent/CIE1931_RGB.pdf

Lab space, of course, came much later, in 1976.
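Since Lab comes up repeatedly in this thread, here is what the 1976 definition actually computes. This is a minimal sketch assuming the D50 white point and the usual published constants; a real color library should be used for anything serious.

```python
# Minimal sketch of the CIE 1976 L*a*b* conversion from XYZ.
# Assumes a D50 white point; illustrative only.

D50_WHITE = (0.9642, 1.0, 0.8249)  # CIE XYZ of D50, Y normalized to 1

def _f(t):
    # Piecewise cube-root function from the CIE definition
    return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

def xyz_to_lab(xyz, white=D50_WHITE):
    fx, fy, fz = (_f(c / w) for c, w in zip(xyz, white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

# The white point itself maps to L* = 100, a* = b* = 0
print(xyz_to_lab(D50_WHITE))
```

Note that the white point is a parameter: Lab numbers are always relative to a chosen reference white, which is why the illuminant questions in this thread matter.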

Logged

Tim Lookingbill
Sr. Member
Posts: 1053
« Reply #22 on: November 26, 2011, 08:17:12 PM »

Quote
The CIE D illuminants like D50 did not yet exist when the standard observer color matching functions were made.

The standard observer data is based on two sets of experiments. One from John Guild and one from David Wright. Guild's experiments matched monochromatic sources to trichromatic primaries from filtered tungsten light (2900K). Wright's experiments matched the monochromatic sources to monochromatic primaries at 650, 530, and 460 nanometers. Since they are comparing emissive sources, the illuminant is not really relevant.

How the CIE went from Guild & Wright's data to 'modern' colorimetry is an interesting story which can be read in this PDF: http://www.cis.rit.edu/research/mcsl2/research/broadbent/CIE1931_RGB.pdf

Lab space, of course, came much later, in 1976.



Oh, yeah, I got the "Standard Observer" mixed up with Lab space. How could I have done that? Now it's clear as mud. I really don't know how the hell we got this far with this much confusion.

Oh great! Another damn white paper to make it all clear as mud.
Logged
RFPhotography
Guest
« Reply #23 on: November 27, 2011, 07:44:41 AM »


Quote
Oh great! Another damn white paper to make it all clear as mud.
   Grin

As far as I know, the current DNG spec can handle either matrix or LUT profiles.  Which type either the DNG Profile Editor or the XRite Passport software creates is unknown.

It seems to make sense that the XRite software is assuming a D65 source since the published values are for the sRGB space.  If it's doing a transform from one illuminant (the shooting condition) to D65, that will introduce errors as has been pointed out.  But what is it assuming as the shooting condition to do that transform?  The WB of the image?  If so, then would using an image to create the profile that isn't properly white balanced (i.e., against one of the neutral patches) potentially create a larger error factor?

Using these profiles as anything other than a decent starting point is really expecting too much.  24 patches isn't a lot.  To get even close to 'accuracy' would require a much larger number of patches.  The ColorMunki reads 100 patches, if memory serves; other profile-creation tools read more than that.  Even then, profiles aren't perfect.
Logged
bjanes
Sr. Member
Posts: 2714
« Reply #24 on: November 27, 2011, 07:59:49 AM »

Quote
One question that comes to mind: are DNG profiles matrix-based? If so, it's certainly understandable that it would not be able to bring all the patches into line. If it's got a lookup-table though, I would expect the individual patches to match almost by definition.

The answer is both. See the Adobe documentation here.

Regards,

Bill
Logged
bjanes
Sr. Member
Posts: 2714
« Reply #25 on: November 27, 2011, 08:20:57 AM »


Quote
As far as I know, the current DNG spec can handle either matrix or LUT profiles.  Which type either the DNG Profile Editor or the XRite Passport software creates is unknown.

The DNG profile editor can create both matrix and LUT profiles as per the Adobe tutorial.


Quote
It seems to make sense that the XRite software is assuming a D65 source since the published values are for the sRGB space.  If it's doing a transform from one illuminant (the shooting condition) to D65, that will introduce errors as has been pointed out.  But what is it assuming as the shooting condition to do that transform?  The WB of the image?  If so, then would using an image to create the profile that isn't properly white balanced (i.e., against one of the neutral patches) potentially create a larger error factor?

Actually, X-Rite publishes values for both D65 sRGB and D50 L*a*b*. One can use the Bradford chromatic adaptation model to convert from one space to another with a different illuminant, and the average error for converting from D50 to D65 is about 1.4 Delta E (Danny Pascale). The question arises: does this take into account the illuminant used to take the picture, or merely the transformation from a D50 to a D65 space when the illuminant used to take the picture was D50?
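The Bradford step is mechanical enough to sketch. The matrix and white-point values below are the commonly published ones (as on Lindbloom's site); this is an illustration, not X-Rite's actual code.

```python
# Sketch of Bradford chromatic adaptation from D50 to D65.

BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]
BRADFORD_INV = [[ 0.9869929, -0.1470543,  0.1599627],
                [ 0.4323053,  0.5183603,  0.0492912],
                [-0.0085287,  0.0400428,  0.9684867]]
D50 = (0.96422, 1.0, 0.82521)  # white-point XYZ values
D65 = (0.95047, 1.0, 1.08883)

def mul(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def bradford_adapt(xyz, src=D50, dst=D65):
    src_cone = mul(BRADFORD, src)   # source white in "cone" space
    dst_cone = mul(BRADFORD, dst)   # destination white in "cone" space
    cone = mul(BRADFORD, xyz)
    scaled = tuple(c * d / s for c, d, s in zip(cone, dst_cone, src_cone))
    return mul(BRADFORD_INV, scaled)

# Sanity check: the D50 white point adapts to the D65 white point
print(bradford_adapt(D50))
```

The ~1.4 Delta E average error quoted from Pascale is about how well adapted colors agree with directly measured ones; the arithmetic itself maps the source white to the destination white exactly by construction.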

Regards,

Bill
Logged
digitaldog
Sr. Member
Posts: 8027
« Reply #26 on: November 27, 2011, 09:26:47 AM »

Quote
My impression is that if your profile can reproduce these 24 colors to match published values then in theory other colors would fall more or less into place within an acceptable range.

More or less; maybe yes, maybe no. But the bigger issue is the assumption that nailing 24 colors somehow produces a profile or process that creates ideal, push-button color. I don't know anyone working in reproduction who has found that a profile (DNG or ICC) produces this effect on its own. Depending on the original, even in a fixed and controlled studio condition, there's a bit more work involved to produce a visual match.


Quote
Isn't that the whole point of jumping through the hoop of making custom DNG profiles? If the only advantage of custom profiles is that you can accurately reproduce a macbeth chart (and even that seems questionable at this point) why would anyone do it? Why not just use a grey card and go from there?

The point is to get closer to your goal. There's no question that good profiles do this, and there's no question in my mind that they don't achieve any guarantee of all other colors becoming visually acceptable or "accurate".

Quote
Also, all the talk of different spectral distributions is a bit beside the point. We're not trying to match spectral reflectance—I think we'd all be happy with a metameric match.

It's not beside the point when we actually have to prove something like accurate color reproduction, or prove the process isn't partially subjective in terms of a visual match (since the numbers don't produce a match in your example). Numbers that should provide a match don't, and numbers that do provide a visual match shouldn't (the beauty of a metameric match).

The OP asks why the numbers are not working completely. Eric beautifully described why. My initial point was that producing a visual match of the 24 patches is useful if you want to match 24 solid colors. It doesn't guarantee all other colors will visually match (and at this point, the OP still needs to convert to an output space and print the data, I assume; that too will complicate the match of other colors).

Profiles ARE useful. But let's not put too much faith in what they provide in something as complex as art reproduction, a difficult process.
« Last Edit: November 27, 2011, 09:42:34 AM by digitaldog » Logged

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
RFPhotography
Guest
« Reply #27 on: November 27, 2011, 09:37:33 AM »

Quote
The DNG profile editor can create both matrix and LUT profiles as per the Adobe tutorial.
Fine.  That leaves the XRite then.  

Quote
Actually, the X-rite spec publishes values for both D65 sRGB and D50 L*a*b. One can use the Bradford chromatic adaption model to convert from one space to another with a different illuminant, and the average error for converting from D50 to D65 is about 1.4 Delta E (Danny Pascale). The question arises, does this take into account the illuminant used to take the picture or merely the transformation from a D50 to D65 space when the illuminant used to take the picture was D50?

Regards,

Bill

Yes, that's right, it does.  I knew that, since I posted it earlier; forgot.  Re: the highlighted section, that's basically the question I was posing.  Is the illuminant of the actual image taken into account, and if so, how is the transformation being done from that to 6500 or 2850 (I don't say 5000 because 5000 isn't an option for creating the profile)?  Or does it simply assume the image was taken under one of the two lighting conditions that can be used to create the profile (6500 or 2850)?  Either way, the potential error factor gets larger, doesn't it?
« Last Edit: November 27, 2011, 09:42:02 AM by BobFisher » Logged
sandymc
Full Member
Posts: 235
« Reply #28 on: November 27, 2011, 09:44:33 AM »

Quote
One question that comes to mind: are DNG profiles matrix-based? If so, it's certainly understandable that it would not be able to bring all the patches into line. If it's got a lookup-table though, I would expect the individual patches to match almost by definition.

DNG profiles must have a matrix, and can optionally have tables. So far as I know however (Eric may correct me on this), the Profile Editor does not use tables for color matching.

Sandy
Logged
elolaugesen
Full Member
Posts: 189
« Reply #29 on: November 27, 2011, 03:52:13 PM »

I am not very experienced in this.  I also take images of original paintings and sometimes have problems with accurate colours.  If I open an image in ACR, it seems to me there is a potential problem: when you save the file as DNG and then go to ColorChecker Passport, you may inadvertently have changed the image.
There are a lot of defaults in ACR and it is easy to miss turning all of them off.  If one is on and you then convert to DNG and go to Passport, you are not working with the original camera colors.

The recommended procedure I was given was to convert to DNG using the Adobe DNG Converter, then drop the image into ColorChecker Passport.

I might be wrong.  But it is so easy to mess this up by missing a step.

cheers elo
Logged
elolaugesen
Full Member
Posts: 189
« Reply #30 on: November 27, 2011, 04:25:45 PM »

See this thread on ColorChecker Passport:

http://forums.dpreview.com/forums/readflat.asp?forum=1006&message=38035883&changemode=1

This is not what I was told by ...  the experts? at the Focus on Imaging exhibition where I bought the package...

Will do some more research..  (this may be why I have the occasional problem)

My actual procedure for camera calibration is:
1.  Take a bracketed image of the ColorChecker.
2.  Take all images necessary of the work.
3.  Convert all to DNG using the Adobe DNG Converter.
4.  Then open in ACR, check which ColorChecker image is best...  and drop the converted (original) image into the Passport program...

This procedure is wrong based on the thread at dpreview.

I  always use the white balance patch.....

more work.....
cheers elo

Logged
bjanes
Sr. Member
Posts: 2714
« Reply #31 on: November 28, 2011, 09:38:37 AM »

Quote
The numbers for that chart are derived from a machine (spectrophotometer) measuring the spectral response of each color, gray and black patch under a given amount of light and color-temperature illuminant, as Eric indicated. What that spectro "saw" is by the numbers and doesn't take into account humans' adaptive nature to brightness, contrast and saturation of colors in any given scene viewed.

If you want the scene as seen by the spectrophotometer to correlate with the photographic image, you will have to adjust the viewing conditions such that the adaptation of the human visual system is the same for viewing the image as for the original scene. I downloaded your corrected image and looked at it with Imatest Colorcheck. The Delta Es are small, indicating that your profile did quite a good job of reproducing the chart. When comparing your image on my calibrated monitor to my own ColorChecker, I see a good match even though my monitor is 6500K and my Solux viewing lamp is 4700K. My visual system can make the necessary accommodation even for this mismatch of color temperature.



Quote
This is much like the effects on perception when the lights are turned on in a darkened movie theatre, where our eyes immediately see a loss in contrast and richness. A spectro, if it were possible to measure from the movie screen, would still see the CCchart color patches as being the same, because the spectro is using its own light source and not the movie theatre's.

I don't think that is true. To measure the screen with a spectrophotometer, you would need an instrument that reads the actual luminance of the screen, not a reflection instrument that uses its own light source; in other words, you would need a radiance pixmap of the scene. When you turn on the theater lights, you not only change the color and brightness adaptation of the viewer; the theater lights also dilute the luminances on the screen. The effect is most marked in the shadows, where the added luminance from the theater lights is much greater than the shadow luminances produced by the projector. It is analogous to flare light, which washes out the shadows more than the highlights. If you wanted the image to appear good with the lights on, you would have to greatly increase the luminance of the projector so that the projected shadow luminances would not be overwhelmed by the theater lights.
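The flare analogy can be put in numbers. A rough sketch with illustrative values only: adding a small constant veiling luminance lifts shadow lightness far more than highlight lightness.

```python
# Adding a constant veiling luminance (theater lights, lens flare)
# raises L* much more in the shadows than in the highlights.

def lstar(Y):
    # CIE lightness from relative luminance Y in 0..1
    return 116 * Y ** (1 / 3) - 16 if Y > (6 / 29) ** 3 else 903.3 * Y

FLARE = 0.02  # 2% of white added uniformly; an assumed, illustrative value

for Y in (0.01, 0.90):  # a deep shadow vs. a highlight
    print(f"Y={Y:.2f}: L* {lstar(Y):5.1f} -> {lstar(Y + FLARE):5.1f}")
```

With these assumed numbers the shadow gains on the order of ten L* units while the highlight barely moves, which is exactly the washed-out-shadows effect described here.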

If I am looking at a reflection print and increase the ambient illumination, the highlight and shadow values are increased linearly according to the change in illumination.


Quote
The image samples below I took with my Pentax K100D DSLR in an attempt to get the gray and black patch readouts to measure as close as possible to the published Lab numbers using a curve adjust. My illuminant and light source is direct sunlight for all shots. You can measure yourself how close I got. The last image shows the profile applied along with the settings that gave exact Lab numbers to a real scene captured under the same light intensity. Note the lack of contrast of the overall appearance of the surrounding scene. This is the same effect that happens when the lights are turned on in a darkened theatre.

I can tell you for sure my eyes saw that scene as much brighter and full of contrast than what's depicted.

The ColorChecker is a low-contrast scene and its luminances are easily within the range of a monitor and even a print, so the scene can be rendered without any luminance compression. However, an outdoor scene has much greater contrast: a linear rendering looks flat, and a sigmoid tone curve can help fit the luminance of the scene to that of the output medium. For an HDR scene, a global tone curve fails and local adjustments are necessary. This is the difference between scene rendering and output rendering. If our monitors could reproduce the actual luminances in the scene, no luminance compression would be necessary and we could use the scene-referred image directly.
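A sigmoid of the kind described might look like the following; the shape and parameters are assumptions for illustration, since raw converters use their own proprietary curves.

```python
import math

def sigmoid_curve(x, slope=6.0, mid=0.5):
    # Logistic curve rescaled so that 0 -> 0 and 1 -> 1:
    # boosts midtone contrast while compressing shadows and highlights.
    lo = 1 / (1 + math.exp(slope * mid))
    hi = 1 / (1 + math.exp(-slope * (1 - mid)))
    y = 1 / (1 + math.exp(-slope * (x - mid)))
    return (y - lo) / (hi - lo)

for x in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(x, round(sigmoid_curve(x), 3))
```

Applied globally, such a curve would shift the ColorChecker patch values away from their measured Lab numbers, which is the trade-off between chart accuracy and pleasing scene rendering being discussed.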

Regards,

Bill

Logged
Tim Lookingbill
Sr. Member
Posts: 1053
« Reply #32 on: November 28, 2011, 12:32:07 PM »

Bill, thanks for going through the trouble of verifying the numbers. Nice to know my DigitalColor Meter and X-rite i1Display monitor profile isn't too far off.

Let me be clear: the profile didn't do all the work to get to the numbers you measured on the last image that includes the blue doll and surrounding scene. ACR settings were all zeroed out except for Color Temp=5000K/Tint=0, Brightness +60 and Black 1, plus the adjusted curve shown.

As for the "dark theatre with the lights turned on" reference I was loosely using that to illustrate the differences between human vision and colorimetric accuracy according to spectro measured Lab numbers in the CCchart. As another analogy from my understanding, if a spectro is used to measure and profile a display in a dimly lit studio without ambient light taken into account and that calibrated/profiled display is later placed outdoors in broad daylight displaying the same CCchart, the appearance according to human perception of the display would be greatly affected but the Lab numbers would still be the same. This is what I meant.

Just to add, the blue doll/CCchart image you measured to get the accurate Delta E numbers can't be edited to look the way the original scene did without changing the accuracy of the CCchart Lab numbers. I've tried it, and the only edit I could add was the shadow definition shown in the image below. So a simple sigmoid curve would not work; applying one would throw off the CCchart Lab numbers.

But really what I was trying to point out to the OP is why it's difficult to get exact CCchart numbers and still have the image look right according to human vision. Spectros aren't human.

But I can tell you the profile works quite well in my experience maintaining balance across a wide range of colors within a wide range of images of various dynamic range when trying to edit the image to get it to look according to human vision. That's really all that can be expected from a profile since we can't calibrate and profile reality which is what the digital camera is trying to capture.
« Last Edit: November 28, 2011, 12:35:25 PM by tlooknbill » Logged
Tim Lookingbill
Sr. Member
Posts: 1053
« Reply #33 on: November 28, 2011, 01:00:43 PM »

This is how the CCchart looked according to how I perceived the overall scene. Note the simple edits applied going from Brightness +60 to +90 and Contrast 0 to +25 with the same curve that includes the shadow tweak.

Note most of the a/b readouts remain the same while the Luminance channel increased. Not all color a/b channels stayed the same. Also note the skin tone luminance went from 66L to 79L, but the Purple patch went from 31L to only 37L.

Why didn't the luminance increase equally for both patches? Non-linear behavior? I'd like that one answered.
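One plausible answer, offered as a sketch rather than a definitive account of ACR's internals: the curve operates on (roughly gamma-encoded) RGB, while L* is close to a cube root of luminance, so the same adjustment moves patches of different lightness by different amounts of L*. The starting luminances and the gain below are assumed values chosen to sit near the two patches mentioned.

```python
# Toy model: the same multiplicative tweak in a gamma-encoded domain
# moves a light patch and a dark patch by different amounts of L*.

def lstar(Y):
    # CIE lightness from relative luminance Y in 0..1
    return 116 * Y ** (1 / 3) - 16 if Y > (6 / 29) ** 3 else 903.3 * Y

GAMMA, GAIN = 2.2, 1.15  # assumed editing-domain gamma and curve gain

def tweak(Y):
    encoded = Y ** (1 / GAMMA)                 # to the editing domain
    return min(1.0, encoded * GAIN) ** GAMMA   # adjust, then back to linear

for name, Y in (("light patch", 0.36), ("dark patch", 0.07)):
    print(f"{name}: L* {lstar(Y):.0f} -> {lstar(tweak(Y)):.0f}")
```

With these assumed numbers the lighter patch gains roughly nine L* units and the darker one roughly six, the same direction as the readouts described above: non-linear behavior, as suspected.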
Logged
bjanes
Sr. Member
Posts: 2714
« Reply #34 on: November 28, 2011, 02:29:50 PM »

Quote
As for the "dark theatre with the lights turned on" reference I was loosely using that to illustrate the differences between human vision and colorimetric accuracy according to spectro measured Lab numbers in the CCchart. As another analogy from my understanding, if a spectro is used to measure and profile a display in a dimly lit studio without ambient light taken into account and that calibrated/profiled display is later placed outdoors in broad daylight displaying the same CCchart, the appearance according to human perception of the display would be greatly affected but the Lab numbers would still be the same. This is what I meant.

I see your point now. The viewing conditions and observer adaptations are important. With regard to the accuracy of the capture, the essential task is to have the color values in the file match those of the subject. As Eric pointed out, what the camera sees will be affected by the light used to illuminate the target. The camera can produce a metameric match for that illumination, but results may be different for another illuminant, as shown by the MetaCow target. Scene brightness also affects perception, as shown here.

Even if the image produces a good match, the viewing conditions may affect perception; the CIECAM02 model attempts to account for these factors, and a CIECAM02 plugin is available for Photoshop. I don't know how to use it, but it is interesting to play around with.



Quote
But really what I was trying to point out to the OP is why it's difficult to get exact CCchart numbers and still have the image look according to human vision. Spectro's aren't human.

But I can tell you the profile works quite well in my experience maintaining balance across a wide range of colors within a wide range of images of various dynamic range when trying to edit the image to get it to look according to human vision. That's really all that can be expected from a profile since we can't calibrate and profile reality which is what the digital camera is trying to capture.

This link explains how spectrophotometer data can be translated to reproduce the appearance of the object under a given illuminant. This is presumably how X-Rite produced the L*a*b* values for the chart for D50 illumination. Even though you used daylight illumination in your test, the camera white balance and profile were able to produce values very close to the D50 reference values. I can download a virtual ColorChecker in CIE L*a*b* from Bruce Lindbloom's site and convert it to D65 Adobe RGB in Photoshop. Does this produce the same values that would be obtained by calculating the RGB values for D65 illumination? In other words, are the illuminant used for the photograph and the white point of the working space taken into account by this transformation? It appears that the profile is quite accurate for reproducing the ColorChecker, but its performance for an actual scene with more colors might not be so good.

Regards,

Bill
Logged
madmanchan
Sr. Member
Posts: 2100
« Reply #35 on: November 28, 2011, 02:59:53 PM »

Quote
Yes, I understand that, but what I'm still not getting is why creating a profile using a color chart image, then applying that profile to the same image doesn't result in the values the software thinks the chart patches have. Isn't the software essentially building a transform that converts the input image data to the ideal color patch data? If so, at least for the color components, I'd expect that if I apply that transform to the data used to build the transform I'd get the ideal values.

Perhaps what I'm seeing are the DNG editor's ideal values? Is there a published set of values for the patches that the DNG editor uses?

Which "ideal values" are you referring to?  The RGB/Lab/HSL readouts in the DNG PE don't report the ideal values.  They report the values of the input image after a color matrix transform has been applied, but before the color table transform has been applied. (The color table is what you create in DNG PE by adding control points to adjust color.) In other words, the readouts report the input (not the output) of the color transform.
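Schematically (with hypothetical names, not Adobe's actual code), the readouts sit between the two stages:

```python
# Two-stage model of a DNG-style color transform: a 3x3 matrix followed
# by a color table. The DNG PE readouts correspond to the intermediate
# value after the matrix but before the table. The table here is a
# stand-in per-channel gain; real DNG color tables are 3-D HSL lookups.

def matrix_transform(rgb, m):
    return tuple(sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3))

def table_transform(rgb, gains):
    return tuple(c * g for c, g in zip(rgb, gains))

def apply_profile(camera_rgb, matrix, table):
    after_matrix = matrix_transform(camera_rgb, matrix)  # what the readouts show
    final = table_transform(after_matrix, table)         # what you actually get
    return after_matrix, final

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
readout, output = apply_profile((0.5, 0.4, 0.3), IDENTITY, (1.1, 1.0, 0.9))
print(readout, output)
```

This is why comparing the readouts against published patch values can mislead: the table's contribution is not reflected in them.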
Logged

madmanchan
Sr. Member
Posts: 2100
« Reply #36 on: November 28, 2011, 03:03:14 PM »

Quote
Eric, quite true. But one can use chromatic adaption as discussed in Section 5 of Danny Pascale's RGB Coordinates of the Macbeth Color Checker. One can also use chromatic adaption to derive R'G'B' values for other illuminants. X-Rite publishes values for D50 CIE L*a*b and D65 sRGB. Danny and Bruce Lindbloom give values for multiple illuminants. The other factor is the actual illuminant used to take the picture. In my case, I use Solux 4700K bulbs which are not that far from D50 and rely on Camera Raw to perform a transform from the camera XYZ data using white balance and whatever else to ProPhotoRGB D50. In any case, one can compare the observed rendered values under the employed illuminant and compare them to the published or measured values of the chart and obtain a measure of the calibration of the system--illuminant, chart, camera, and raw converter. Your comments would be appreciated.

If one had the SPD of the Solux bulbs and the spectral reflectance properties of the chart, one could calculate the D50 L*a*b* values directly using the 2-degree standard observer data, but this seems to be overkill for most purposes.

Even with chromatic adaptation, the resulting numerical values are going to be quite different.  For example, illuminant A adapted to D50 is very different (both visually and numerically) from D50 directly.

It is true that visually there will be only small differences if you're using D50-based reference values, and the actual illuminant is similar spectrally to D50.

Logged

madmanchan
Sr. Member
Posts: 2100
« Reply #37 on: November 28, 2011, 03:08:07 PM »

Quote
My impression is that if your profile can reproduce these 24 colors to match published values then in theory other colors would fall more or less into place within an acceptable range. Isn't that the whole point of jumping through the hoop of making custom DNG profiles? If the only advantage of custom profiles is that you can accurately reproduce a macbeth chart (and even that seems questionable at this point) why would anyone do it? Why not just use a grey card and go from there?

Yes and no.  It depends a lot on which set of "other colors" you're talking about (for example, blue hydrangea flower petals are notoriously difficult to reproduce accurately without throwing off other colors).  In essence one has to pick what "colors" (really, spectral radiances) to optimize for.
Logged

RFPhotography
Guest
« Reply #38 on: November 28, 2011, 03:54:30 PM »

Quote
Which "ideal values" are you referring to?  The RGB/Lab/HSL readouts in the DNG PE don't report the ideal values.  They report the values of the input image after a color matrix transform has been applied, but before the color table transform has been applied. (The color table is what you create in DNG PE by adding control points to adjust color.) In other words, the readouts report the input (not the output) of the color transform.

How does the user know what control points to add and what adjustments to make?  The user can't refer to the values published on the XRite or Lindbloom sites because the values in the DNG PE are based on a linear gamma.  So where does the user start when trying to make adjustments?  Or is it just to be assumed that the initial step of the matrix transform does the job?
Logged
RDoc
Newbie
Posts: 18
« Reply #39 on: November 28, 2011, 10:44:57 PM »

Quote
Which "ideal values" are you referring to?  The RGB/Lab/HSL readouts in the DNG PE don't report the ideal values.  They report the values of the input image after a color matrix transform has been applied, but before the color table transform has been applied. (The color table is what you create in DNG PE by adding control points to adjust color.) In other words, the readouts report the input (not the output) of the color transform.

By "ideal values" I meant the values for the color checker patches programmed into the DNG editor and used by the color checker profile wizard.

My understanding of what the DNG wizard does is that it reads the input values from the color patches in the loaded image, then creates a transform intended to map the input values to the values programmed into it for the patches. I'm not doing any manual tweaking of the control points so whatever is created by the wizard is what's in the created profile.

Then in ACR I apply that profile to the same input image of the color checker. I expected that the output of the transform would be very close to the values for the patches programmed into the DNG editor. Is that not the case?

As I asked earlier, are the values for the patches that the DNG editor is trying to achieve by the profile transform available anywhere? The reason I ask is that when I compare the Lab values in the ACR application in ProPhoto mode to the published values for the color checker, they don't match numerically and are visibly different.
Logged
