Author Topic: Camera White Balance and Post Processing  (Read 17405 times)
bjanes
Sr. Member
Posts: 2756

« Reply #80 on: November 02, 2011, 12:31:43 PM »

It's much like the way ventriloquism tricks the human auditory system: the ventriloquist attenuates certain directional audio frequencies humans use to determine the source and direction of a particular sound. (Capitol Records' audio engineers put this knowledge to good use in coming up with their famous stereo-imaged "echo chamber".) Measuring these mimicked frequencies with a machine won't necessarily show identical wave patterns, but the actual sound was enough to fool the listener's ear, given their position relative to the ventriloquist.

A spectrophotometer only reads the energy of light waves and assigns a number, no matter what the light source. It makes no attempt to weight those spectral readings toward fooling the eye. This is what Andrew refers to as "scene referred": accurate according to a spectrophotometer, but taking no account of the human visual system's persistent adaptation to the scene's surround, which a spectrophotometer knows nothing about.

So when we capture "scene referred" color that may be "accurate" according to a machine, then view it in our dark, warmly lit studio, outside the surround our eyes had adapted to at the moment we tripped the shutter, that "accurate" color looks butt ugly on our calibrated display, and uglier still given the display's attempt to artificially mimic D65/6500K daylight.

Good points, but AFAIK, all color reproduction must take chromatic adaptation into account. The CIE standard observer assumes complete chromatic adaptation.
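For readers who want to see what "complete chromatic adaptation" means computationally: a von Kries-style transform such as Bradford scales a color's cone-like responses by the ratio of the destination and source whites. A minimal Python sketch (the matrix and the D50/D65 white points are the published standard values; the function name is my own):

```python
import numpy as np

# Bradford cone-response matrix (published standard values)
M = np.array([[ 0.8951,  0.2664, -0.1614],
              [-0.7502,  1.7135,  0.0367],
              [ 0.0389, -0.0685,  1.0296]])

def bradford_adapt(xyz, white_src, white_dst):
    """Adapt an XYZ color from a source white to a destination white,
    assuming complete chromatic adaptation (von Kries scaling in
    Bradford cone space)."""
    scale = (M @ white_dst) / (M @ white_src)   # per-cone gain
    return np.linalg.inv(M) @ (np.diag(scale) @ (M @ xyz))

# D50 and D65 white points (XYZ, Y normalized to 1)
d50 = np.array([0.96422, 1.0, 0.82521])
d65 = np.array([0.95047, 1.0, 1.08883])

print(bradford_adapt(d50, d50, d65))  # the D50 white lands exactly on D65
```

By construction the source white maps exactly onto the destination white, which is the sense in which adaptation is assumed to be complete.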

Regards,

Bill
digitaldog
Sr. Member
Posts: 8608

« Reply #81 on: November 02, 2011, 12:47:31 PM »

So when we capture "scene referred" color that may be "accurate" according to a machine, then view it in our dark, warmly lit studio, outside the surround our eyes had adapted to at the moment we tripped the shutter, that "accurate" color looks butt ugly on our calibrated display, and uglier still given the display's attempt to artificially mimic D65/6500K daylight.

Exactly. So the correct term here, accurate (colorimetrically correct), makes a butt-ugly reproduction, while inaccurate values produce a visual match. The latter has values from which one could produce some kind of deltaE report showing huge 'errors', yet the result is what we desire: a visual match. That is also subjective.

And when we use metrics like dE, we are examining a single solid color patch or a group of them. Even if we make a report with thousands of such colors, those are not the same as viewing an image in context. If you want to measure a spot color off the Ferrari, even a few dozen, fine. You could report an accurate or inaccurate colorimetric match (which probably doesn’t visually match on the output referred media).
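To make the point concrete, deltaE in its original CIE76 form is nothing more than the Euclidean distance between two Lab measurements of a single solid patch, which is exactly why it can score patches while saying nothing about an image viewed in context. A quick sketch with hypothetical patch values:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: plain Euclidean distance in CIELAB.
    Later formulas (CIE94, CIEDE2000) reweight the terms, but it is
    still one number per pair of solid patches."""
    return math.dist(lab1, lab2)

# Hypothetical measured vs. reference L*a*b* values for one patch
measured = (42.0, 62.0, 45.0)
reference = (40.0, 60.0, 48.0)
print(delta_e_76(measured, reference))  # ≈ 4.12
```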

All this talk of accuracy is silly with imagery, which is far too complex to analyze in such simplistic terms. Just look at Bill's evidence that his display is accurate: a few colors (selected by the software itself) that are supposed to define the accuracy of its entire color space, measured with the same instrument. Not worthless, but not very telling. This is all fine if you are asking about the accuracy of color patches measured and then output as solid patches (the deltaE of a profile, for example), or about the consistency of a device over time. But for an image, you can simply look at it and say "it matches", and that is enough; the analysis is subjective. So again, using the term accuracy here, when what we are expecting is a color match of a complex image, is counterproductive and disingenuous.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Tim Lookingbill
Sr. Member
Posts: 1147

« Reply #82 on: November 02, 2011, 03:05:50 PM »

Quote
Good points, but AFAIK, all color reproduction must take chromatic adaptation into account. The CIE standard observer assumes complete chromatic adaptation.

You left out one major factor that's not included in the CIE standard observer: emotion and memory, in relation to what was seen in the actual scene and how it's perceived in an artificial viewing environment, i.e. a darkened studio and a CCFL- or LED-backlit LCD display. These variables make nailing it by the numbers, and having it look as intended, about as impossible as predicting the weather.

Ballpark is the best that can be done. That's what I've found making a custom DNG camera profile with a ColorChecker chart, which doesn't have enough sample colors to compensate for all possible scenes. WB appearance is the least of the concerns the profile is forced to dovetail into, because of optical effects on human perception that aren't factored in by any machine, color description standard, or encoding.

That is what I was pointing out in Bob's sample image, where saturation levels play tricks with cool/warm relationships that have to be judged by the overall look of the image, not color by color by the numbers.

I figure this optical effect is the reason my X-Rite i1Display calibration software doesn't induce/correct for chromatic adaptation very well in the profile's matrix transforms that control hue/saturation when I choose a 5000K white point appearance on my display.

In color-managed apps, with this profile loaded in the system, colors affected by this change in WB appearance aren't corrected to appear as they would on a print under light of the same color temperature. An error in the technology? Or is the software written/tuned to build the profile's matrices expecting a 6500K white?

« Last Edit: November 02, 2011, 03:08:22 PM by tlooknbill »
Tim Lookingbill
Sr. Member
Posts: 1147

« Reply #83 on: November 02, 2011, 04:28:50 PM »

Below is an example from my experiments with the way different "flavors" (the spectral qualities each light source imparts to an image) of two types of light bring their own color palette to the overall look of an image, creating an optical effect that can't be corrected with WB adjustments either in front of or behind the camera. The "Tungsten" WB preset was used in-camera for the shot on the left, and "Fluorescent White Light" for the shot on the right, lit by the Alzo rated at 5500K.

Note the brown color of the hand lit by a hot light on the left compared with the greenish patina of the hand on the right. I can't make the one on the right match the hue of brown on the left no matter how I edit the WB of the raw images in ACR. Notice the one on the right has an R=G=B neutral background, yet it looks blue. This "undercolor" or "patina" effect was used by Dutch Master painters in paint on canvas long before Kodak's engineers got into the act of image reproduction. This optical effect also can't be factored into CIE standards for determining how accuracy should look according to WB appearance.

Even if you could get the color patches I sampled from the skin tones (included in the corners) to match exactly by the numbers, the overall appearance of the two images would be even farther apart, because machines can't override this optical effect.

Whether you light the scene with the sun, a flashlight, living room light, Alien Bees strobes, you name it, each light will imbue the image with its own undercolor, some bringing a color palette that's more or less noticeable in varying degrees depending on the subject and the light falling on it.


I exaggerated this effect by editing both images to bring out this undercolor optical effect influenced by various light sources.
RFPhotography
Guest

« Reply #84 on: November 02, 2011, 05:58:17 PM »

I agree that simply setting the white point will not ensure an accurate reproduction of the image. However, it is a first start.

That's not really what you've been intimating up till now.  You've been saying that getting the WB 'accurate' will result in 'accurate' colour reproduction in the scene.  If it's nothing more than a 'first start' then it's really no better than any other method. 

Quote
I use the DNG profile editor to create a custom profile for my camera which improves the color accuracy

Don't necessarily disagree with that.  I've got a Passport as well and use it for the same purpose.  But even then I may still have to tweak the colour to get it to be as close a match as possible.

Quote
Of course, the colors are not 100% accurate. Use of a custom profile helps, but since the CFA filters of any camera do not meet the Luther-Ives criteria, metameric error will occur, and it can only be minimized by a profile. I agree that the use of a cloud is not the best way to measure white balance. However, my point was that if the image contains a predominant saturated color and no neutrals, automatic color balance will be thrown off. This was demonstrated with a close-up of the red patch of the color checker and auto WB. In this situation one can take the WB from another shot under identical conditions or, even better, take a custom WB from a neutral target. With no accurate WB, you could adjust the image to produce a pleasing result, but without a reference point you would have to rely on memory to get the color right, and color memory is not very reproducible.

Well, at least you've admitted that your method isn't 100% 'accurate'.  That's a start.  And if I need to match colours as closely as possible then I'm not going to work from memory either.  Nor do I think Andrew or Tim or a lot of others would.  The difference is in the recognition that what you're calling 'accurate' may not end up being a 'match' in the final image.
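Bill's point about auto WB being thrown off is easy to reproduce: many simple auto WB implementations lean on the gray-world assumption (the scene averages to neutral), which collapses when one saturated color dominates the frame. A toy sketch in Python (the image values are hypothetical):

```python
import numpy as np

def gray_world_gains(img):
    """Gray-world auto WB: assume the scene averages to neutral, then
    scale each channel so all three channel means become equal."""
    means = img.reshape(-1, 3).mean(axis=0)
    return means.mean() / means  # per-channel R, G, B gains

# A hypothetical frame filled by one saturated red subject
red_frame = np.tile([0.8, 0.2, 0.2], (100, 100, 1))
print(gray_world_gains(red_frame))  # red halved, green and blue doubled
```

Applied to that frame, the gains drag a correctly exposed red subject toward neutral gray, which is the same failure seen with the close-up of the red ColorChecker patch.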

Quote
I haven't done museum reproduction work, but I understand that the best results are obtained with a multishot back using apochromatic lenses and a custom ICC profile, perhaps in Capture One. What is your experience?

Sorry, where did I say I was doing museum reproduction? As far as equipment goes, the Tate in London uses, I believe, Hasselblad with both single- and multi-shot backs. I don't know what RAW converter they use, but suggesting C1 is the best is like saying LR is the best, or DxO. What's best for one may not be for another. While something like the Hassy H4D-200MS is a terrific system, I can't justify the $45k cost (before lenses). So I make do with a less 'accurate', single-shot, full-frame DSLR and the best lenses I can afford. It's still a pretty darn good system, and far better than the compacts or low-end DSLR/kit-lens options many artists use when they do their own reproduction work.
madmanchan
Sr. Member
Posts: 2101

« Reply #85 on: November 03, 2011, 01:23:24 PM »

Random question:  blue skies are often more "purple-ish" when I view them directly with my eyes, and more "blue-ish" when I view them while wearing sunglasses.  Definitely a different hue.  Which is more accurate?

bjanes
Sr. Member
Posts: 2756

« Reply #86 on: November 03, 2011, 01:24:09 PM »

Nope. Not so. As you can see, I clicked on a white patch! In fact, the RGB values change based on which white I WB on. Some more than others. Now if you were looking closely, you’d see I WB on the first white patch of the Macbeth. The 2nd is a better move. The difference in the rendering is far from subtle. Yet WB I did. BUT, in your own examples here, you WB on a cloud to produce an “accurate” WB (again, this is nonsense description). Since you seem to feel one can WB on any white, neutral or not, you will see as I did in this example, WB is not ‘accurate’ and which white you WB plays a big role in getting a preferred (pleasing) rendering. The white patch as measured in that Passport IS neutral. And yet, WB on that one patch, the result is a warm rendering which is not ‘accurate’ but more important, not pleasing to me as the image creator looking at the scene and the capture on my display. Had I shot this under candlelight, I may have preferred this warm rendering. If I wished to express the image shot under Solux as candlelight, this image would be fine.

As I said before, there is something seriously wrong with your white balance in that shot, when you get a WB of 5100K with a tint of +10 and the correct WB would be around 4100K. You used the white patch, which is inadvisable since that patch is not spectrally neutral. It is difficult to make spectrally neutral paint with that high a reflectance. The average sRGB values for the neutral patches, as determined by Bruce Lindbloom, are shown below.



Another reason for not using the white patch is to avoid clipped channels, but ACR won't allow a WB with clipped channels. Patch 2 is usually recommended, since it is unlikely to be clipped and the signal-to-noise ratio is high in this patch. However, if your camera has a good S:N, any of patches 2-6 can be used with good results. An 18% spectrally neutral card is fine with the D3. Here is the WB I got for the D3 with neutral patches 1 through 7. The error induced by reading from the white patch should be minimal, so I suspect that your camera has serious nonlinearity. You could check the WB by determining whether the RGB values are equal in the mid patches. I requested this information, but you did not respond, and implied instead that there must be something seriously wrong with my display. The readings I posted show that the values were not equal in your screen shot when viewed in Photoshop with sRGB assigned to the download of your image. I presume that the color numbers in a screen shot would be in your monitor profile and that you then converted to sRGB for viewing on the internet. The actual screen colors would vary with the white point of the monitor and its color response.
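Bill's check (are R, G and B equal on the mid-gray patches?) is the inverse of what a click-white-balance tool does: it computes per-channel multipliers that force the sampled patch to R = G = B. A minimal sketch with hypothetical raw patch averages:

```python
def wb_gains_from_patch(r, g, b):
    """Per-channel multipliers that force a sampled neutral patch to
    R = G = B, normalized to green as raw converters typically do."""
    return (g / r, 1.0, g / b)

# Hypothetical mean raw values sampled from patch 2 of a ColorChecker
r_avg, g_avg, b_avg = 0.52, 0.40, 0.31
gains = wb_gains_from_patch(r_avg, g_avg, b_avg)

# Applying the gains makes the patch neutral by construction
balanced = (r_avg * gains[0], g_avg * gains[1], b_avg * gains[2])
print(balanced)  # all three channels now equal
```

Note that the gains depend entirely on which patch is sampled, which is why a clipped or non-neutral patch skews the whole rendering.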



WB is subjective. Why do you suppose a company like X-Rite put a range of whites, with warmth and coolness, into the Passport? Accuracy is a term that doesn't belong in the conversation. That's the bottom line. Your use of the term 'accurate' is misleading. As I said here, you may say you WB and it's accurate, and I can disagree. You can say it's close, I can say it's closer or not closer, and it's all subjective. You've as yet supplied no methodology for measuring scene colorimetry and no proof that it's accurate to the output-referred data you end up with. It matches what you believe is the original, great. Calling that accurate, instead of describing it as subjective with no way to back up an accuracy metric, should be dismissed.

I would submit that WB is not subjective. If your camera had a sensor fully compliant with the Luther-Ives criteria, you would get accurate colors only if you used the proper white balance. Of course, no camera sensor is Luther-Ives compliant, but you will likely get the most accurate color reproduction when using the true WB and a good camera profile. If you want pleasing colors, then feel free to tweak the image to your heart's content. The tinted patches on the Passport are not for obtaining an accurate white balance, but for changing the WB to produce an artistic rendering. If you wanted the image to be warmer, you could merely use a neutral patch and then alter the WB temperature to achieve the same effect.

With the flower, obviously I do not have colorimetry. However, I did present data using the Color Checker, with well-defined patches, and demonstrated that with my camera auto WB was reasonably accurate when determined from the entire color checker, but seriously deficient with a close-up of the red patch, and I presented quantitative data with DeltaE and DeltaC. Thus far, all that you have presented is a faulty WB and a screen capture showing undefined colors.

Regards,

Bill
bjanes
Sr. Member
Posts: 2756

« Reply #87 on: November 03, 2011, 01:44:03 PM »

That's not really what you've been intimating up till now.  You've been saying that getting the WB 'accurate' will result in 'accurate' colour reproduction in the scene.  If it's nothing more than a 'first start' then it's really no better than any other method. 

The point of my post was that an accurate WB will give more accurate colors than would be obtained with a poor WB. The colors may have to be tweaked, but I think it is a better start than an inaccurate WB would give. The flower in my shot had no neutral colors and the auto WB was off. I should have used my WhiBal, but I could have set the WB to daylight in ACR and obtained reasonable results. I merely chose the image from what I had available from a recent shoot to demonstrate a point: auto WB is not good when the scene contains no neutrals and a predominant, strongly saturated color. Adjusting the WB to produce a match was not possible, since I no longer had the flower in front of me and my recollection of its color would likely be faulty.

Sorry, where did I say I was doing museum reproduction? As far as equipment goes, the Tate in London uses, I believe, Hasselblad with both single- and multi-shot backs. I don't know what RAW converter they use, but suggesting C1 is the best is like saying LR is the best, or DxO. What's best for one may not be for another. While something like the Hassy H4D-200MS is a terrific system, I can't justify the $45k cost (before lenses). So I make do with a less 'accurate', single-shot, full-frame DSLR and the best lenses I can afford. It's still a pretty darn good system, and far better than the compacts or low-end DSLR/kit-lens options many artists use when they do their own reproduction work.

Where did I say you were doing museum reproduction? And where did I say that C1 would be the best raw converter? I don't do museum reproduction, but I understand that the best results might be obtained with a full ICC camera profile under standardized conditions and suggested that C1 would be a possible raw converter, since it allows use of ICC camera profiles. ACR with a DNG profile from a color checker would be a reasonable alternative, but more patches are needed for a more comprehensive profile.

Regards,

Bill
digitaldog
Sr. Member
Posts: 8608

« Reply #88 on: November 03, 2011, 01:46:48 PM »

As I said before, there is something seriously wrong with your white balance in that shot, when you get a WB of 5100K with a tint of +10 when the correct WB would be around 4100K.

You want the DNG to see for yourself? Ask yourself how this is 'wrong'. You see in the screen grab the WB tool after clicking on the white patch, and you see the resulting WB values. So what's seriously wrong? Well, for one, it's not 'accurate'? No need for such language: I can SEE it's too warm, which was the point of my first post correcting your language!

Quote
You used the white patch, which is inadvisable since that patch is not spectrally neutral.

Like your clouds? So now your take is, WB is ‘accurate’ but you have to click on something spectrally neutral? So you’re updating your story now?

Quote
Patch 2 is usually recommended since it is unlikely to be clipped and the signal to noise is high in this patch.

You're preaching to the choir. I said this already.

Quote
However, if your camera has a good S:N, any of patches 2-6 can be used with good results. An 18% spectrally neutral card is fine with the D3. Here is the WB I got for the D3 with neutral patches 1 through 7. The error induced by reading from the white patch should be minimal, so I suspect that your camera has serious nonlinearity. You could check the WB by determining whether the RGB values are equal in the mid patches. I requested this information, but you did not respond, and implied instead that there must be something seriously wrong with my display. The readings I posted show that the values were not equal in your screen shot when viewed in Photoshop with sRGB assigned to the download of your image. I presume that the color numbers in a screen shot would be in your monitor profile and that you then converted to sRGB for viewing on the internet. The actual screen colors would vary with the white point of the monitor and its color response.

Bla, bla, bla.... The point of the illustration was to poke holes in your "WB is accurate" nonsense; there was no reason to respond about your display or other readings. WB is subjective, and you still don't seem to get that. Even if I WB on the 2nd patch, I may or may not like the rendering. It may or may not better visually match the scene! And you seem to believe that all illuminants, when set for a neutral WB, are 'accurate', let alone produce a desired subjective rendering, which is just silly talk.

Quote
I would submit that WB is not subjective.

You clearly have, while no one as yet agrees with you, probably for all the reasons expressed in this long set of posts. Accurate how, based on colorimetry of the scene? If you can't provide such a metric, how can you say it's accurate? It seems like a totally subjective statement, since you have nothing else to back up the science.

Quote
If your camera had a sensor fully compliant with the Luther-Ives criteria, you would get accurate colors only if you used the proper white balance.

And if my aunt was a man, she'd be my uncle. Stay on track, unless you are going to ignore your own advice and give up. Tell us how the values are colorimetrically accurate.

Quote
With the flower, obviously I do not have colorimetry.


Yet you continue to believe and preach that it's accurate color. So is it accurate or not? Either way, what process are you using to define accuracy?

Quote
However, I did present data using the Color Checker, with well-defined patches, and demonstrated that with my camera auto WB was reasonably accurate when determined from the entire color checker, but seriously deficient with a close-up of the red patch, and I presented quantitative data with DeltaE and DeltaC.


What you illustrated is that, with your camera system, you can hose the WB assumption, nothing more. Reasonably close? You came up with that metric how? It's difficult to take you seriously with such language.

Quote
Thus far, all that you have presented is a faulty WB and a screen capture showing undefined colors.

Yes! Exactly doing what you yourself said would produce accurate color. That’s exactly why I posted it!

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
digitaldog
Sr. Member
Posts: 8608

« Reply #89 on: November 03, 2011, 01:49:29 PM »

The purpose of my post was that an accurate WB will give more accurate colors than would be obtained with a poor WB.

Ah, so if I shoot a spectrally neutral white patch while shooting a girl on the beach at sunset, then WB on it, it's more accurate?

Same question, this time the scene is shot under candlelight. Or in an early morning fog bank.


Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
RFPhotography
Guest

« Reply #90 on: November 03, 2011, 02:23:00 PM »

The point of my post was that an accurate WB will give more accurate colors than would be obtained with a poor WB. The colors may have to be tweaked, but I think it is a better start than an inaccurate WB would give. The flower in my shot had no neutral colors and the auto WB was off. I should have used my WhiBal, but I could have set the WB to daylight in ACR and obtained reasonable results. I merely chose the image from what I had available from a recent shoot to demonstrate a point: auto WB is not good when the scene contains no neutrals and a predominant, strongly saturated color. Adjusting the WB to produce a match was not possible, since I no longer had the flower in front of me and my recollection of its color would likely be faulty.

I don't know about anybody else, but I'm getting dizzy going around in these circles. The bottom line is that your method isn't 'accurate'; it's just a different starting point. Nothing more.

Quote
Where did I say you were doing museum reproduction? And where did I say that C1 would be the best raw converter? I don't do museum reproduction, but I understand that the best results might be obtained with a full ICC camera profile under standardized conditions and suggested that C1 would be a possible raw converter, since it allows use of ICC camera profiles. ACR with a DNG profile from a color checker would be a reasonable alternative, but more patches are needed for a more comprehensive profile.

You said your impression was that a multi-shot back with APO lenses was the best tool for the job in museum reproduction, that it might be even better using ICC profiles in C1, and then asked what my experience was. That, to me, seems to be asking what my experience is in doing museum reproduction. Do you even read what you write? I think other RAW converters can work with ICC profiles as well; I believe Hassy's own Phocus software can. Are ICC profiles necessary? Better than a DNG profile? Debatable. But let's get back to the issue. Do you know why multi-shot backs are suggested as the ultimate option? How many shots? The Hassy does, I think, 6. There are others that do 16. Why the extra 10? Oh, never mind. It was a red herring argument to begin with, and it didn't catch me out as you probably thought it might, so let's just drop it, shall we?
bjanes
Sr. Member
Posts: 2756

« Reply #91 on: November 03, 2011, 03:05:02 PM »

You want the DNG to see for yourself? Ask yourself how this is 'wrong'. You see in the screen grab the WB tool after clicking on the white patch, and you see the resulting WB values. So what's seriously wrong? Well, for one, it's not 'accurate'? No need for such language: I can SEE it's too warm, which was the point of my first post correcting your language!

Like your clouds? So now your take is, WB is ‘accurate’ but you have to click on something spectrally neutral? So you’re updating your story now?

Yes, I would like to have the DNG. Can you upload it to your web site and provide a link? You say the rendering is too warm, but are the neutrals neutral? Do the color patches have the proper RGB values? If so, I would say that your capture is accurate. If you think that it is too warm and make your correction to produce a pleasing result and the color values vary from what they should be, I would say that your reproduction is inaccurate, even though you may find it pleasing. From your screen shot I can't determine what any of the values in the capture are, since the meaning of the RGB values in that capture is not defined.

My use of clouds for WB is not optimal, but of the images in that shoot, the clouds were the best reference point that was available. If I had used a WhiBal target positioned next to the flower, you likely would still insist on some adjustment to produce color pleasing to you. A color match to the flower would rely on memory and would be entirely subjective.

Bla, bla, bla.... The point of the illustration was to poke holes in your "WB is accurate" nonsense; there was no reason to respond about your display or other readings. WB is subjective, and you still don't seem to get that. Even if I WB on the 2nd patch, I may or may not like the rendering. It may or may not better visually match the scene! And you seem to believe that all illuminants, when set for a neutral WB, are 'accurate', let alone produce a desired subjective rendering, which is just silly talk.

Yes! Exactly doing what you yourself said would produce accurate color. That’s exactly why I posted it!

You may or may not like the rendering, but do the colors of the patches in your image match those on the color checker? If so, I would say that your rendering is accurate. Your perception of the scene is irrelevant--for all I know you could have been on LSD and hallucinating. One can measure color stimuli, but not perception. Nowhere did I state that an accurate WB would produce the desired subjective rendering. I want my rendering into a defined color space with a defined white point, and I will go from there to produce the intended result. Your improper WB produced improper colors and proved nothing. Post the DNG and I will analyze the colors with Imatest and see how accurate they are.

An accurate capture of the color checker, accurately printed on a paper surface similar to that of the color checker, would reproduce the target, and it would not matter what the viewing conditions were (assuming that the paper didn't exhibit metameric failure like the first Epson pigment device).

As to the sunset scene or a candlelit scene, WB will not produce the perceived scene colors, since the eye does not chromatically adapt to such red colors. At more reasonable color temperatures of 3200-6000K, adaptation does occur and WB reproduces the perceived colors.

Regards,

Bill
« Last Edit: November 03, 2011, 03:27:04 PM by bjanes »
digitaldog
Sr. Member
Posts: 8608

« Reply #92 on: November 03, 2011, 03:25:41 PM »

Yes, I would like to have the DNG. Can you upload it to your web site and provide a link?

I’ll put it on my public iDisk**.

Quote
You say the rendering is too warm, but are the neutrals neutral?


Nope.

Quote
Do the color patches have the proper RGB values?

Now what would be proper? It's been handled in LR. Proper would be MelissaRGB using percentage values. And no, the values are probably not proper, considering the color appearance doesn't look anything like the scene (it's too warm, as stated).

Quote
If so, I would say that your capture is accurate.


You're kidding with that question, right?

Quote
If you think that it is too warm and make your correction to produce a pleasing result and the color values vary from what they should be, I would say that your reproduction is inaccurate, even though you may find it pleasing.
 

It's neither pleasing nor a good reproduction of the scene BECAUSE of the CURRENT WB. That's the point! I'm not sure you are reading and understanding what I and others have written (I'm not the first person to suggest this).

Quote
From your screen shot I can't determine what any of the values in the capture are, since the meaning of the RGB values in that capture is not defined.

It doesn’t matter what the current values are! They can be changed any number of ways.

Quote
My use of clouds for WB is not optimal, but of the images in that shoot, the clouds were the best reference point that was available.

And, according to you, accurate. That is what originally raised the hairs on my neck: your process and, more critically, your language!

Quote
If I had used a WhiBal target positioned next to the flower, you likely would still insist on some adjustment to produce color pleasing to you. A color match to the flower would rely on memory and would be entirely subjective.

Yet WB isn’t subjective? You are a difficult fellow to understand.

Quote
You may or may not like the rendering, but do the colors of the patches in your image match those on the color checker?


No, I’d need to gather scene-referred colorimetry, and the data in LR is output-referred. Apples and oranges. I’d need data about the specific illuminant (just measuring the colors with the i1Pro, the resulting RGB values don’t take this into account; it has its own light source).

Quote
If so, I would say that your rendering is accurate.


I bet you would. But that would be a really big and unproven assumption, the root of our disagreement.

Quote
Your perception of the scene is irrelevant--for all I know you could have been on LSD and hallucinating.


After all these posts, I wish. But since my sense of the scene is my perception, maybe you’ll tell us how you’d get past this to label it accurate otherwise? I’ve been asking for an accuracy metric and technique for days now.

Quote
Nowhere did I state that an accurate WB would produce the desired subjective rendering.


It's accurate colorimetrically? You go about proving this how?

Quote
I want my rendering into a defined color space with a defined white point, and I will go from there to produce the intended result.


And this relates to the actual scene colorimetry how? How do you correlate and calculate the accuracy from scene to this color space? That would go a long way toward understanding your use of the term "accurate". And on the same topic, after you've told us how you come to this calculation, what's the metric for "less accurate" and "inaccurate"?

Quote
Post the DNG and I will analyze the colors with Imatest and see how accurate they are.

Which will tell us what about the scene colorimetry? If you alter any of the settings in a raw converter, then what? Does Imatest somehow know about the scene? Does it disregard the RGB rendering?

Quote
As to the sunset scene or a candlelit scene, WB will not produce the perceived scene colors, since the eye does not chromatically adapt to such red colors.


So, using your as-yet-undefined methods and terminology, this is an inaccurate use of WB?


**My public iDisk:

thedigitaldog

Name (lower case) public
Password (lower case) public

Public folder Password is “public” (note the first letter is NOT capitalized).

To go there via a web browser, use this URL:

http://idisk.mac.com/thedigitaldog-Public
« Last Edit: November 03, 2011, 03:28:02 PM by digitaldog »

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
bjanes
Sr. Member
Posts: 2756

« Reply #93 on: November 03, 2011, 04:53:31 PM »
I’ll put it on my public iDisk**.  

Thanks. I downloaded the image and looked at it in Camera Raw, and noted that it was overexposed and required a large negative exposure correction. ACR wouldn't let me WB on the white patch. I looked at the image in Rawnalize and the green channel is heavily clipped. That is not a good square for white balance. The other neutral patches are intact and give reasonable WB values of about 4150K. As I suspected, your WB was screwed up. Why didn't you use a non-clipped area for the WB?
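The neutral-patch white balance Bill is describing reduces to computing per-channel gains so the patch reads equal in R, G, and B. A minimal sketch — the patch values and the normalize-to-green convention are assumptions for illustration, not LR's actual pipeline, which operates on camera-linear data before rendering:

```python
import numpy as np

def wb_gains_from_patch(patch_rgb):
    """Per-channel white-balance multipliers from the mean camera-linear
    RGB of a neutral patch, normalized so green is left untouched."""
    r, g, b = np.asarray(patch_rgb, dtype=float)
    return np.array([g / r, 1.0, g / b])

def apply_wb(image, gains):
    """Scale each channel; a truly neutral patch comes out with R = G = B."""
    return image * gains  # broadcasts over the trailing channel axis

# Hypothetical mean of a gray patch under a warm (red-heavy) illuminant:
patch = [0.61, 0.50, 0.38]
gains = wb_gains_from_patch(patch)
balanced = apply_wb(np.array([patch]), gains)
print(gains, balanced)  # balanced patch reads ~0.50 in all three channels
```

This also shows why a clipped patch ruins the estimate: if the green sites are pinned at the sensor's white level, `g` is wrong and both gains are skewed.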



Now what would be proper? It's been handled in LR. Proper would be MelissaRGB using percentage values. And no, the values are probably not proper, considering the color appearance doesn't look anything like the scene (it's too warm, as stated).

As I understand Melissa, it uses ProPhotoRGB primaries and linear encoding, but reports the RGB values according to an sRGB tone curve. The percentage values can easily be converted to 8 bit notation. However, I would simply export as a TIFF in ProPhotoRGB. Bruce Lindbloom gives the values for the color checker for ProPhotoRGB and I would compare the values of the patches in your image with what Bruce says they should be. If they match, I would say your capture is accurate. When I get the time I might do that using the faulty WB from patch 1 and a better WB from patch 2.
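The comparison Bill proposes can be sketched as follows. The patch readouts and the reference triplet below are hypothetical placeholders — the real reference figures would come from Bruce Lindbloom's ColorChecker tables — and a rigorous accuracy comparison would use delta E in CIELAB rather than raw RGB differences:

```python
def percent_to_8bit(p):
    """Convert a Lightroom percentage readout (0-100) to 8-bit (0-255)."""
    return round(p / 100.0 * 255.0)

# Hypothetical Melissa readouts for one neutral patch, R/G/B in percent:
measured_pct = (78.4, 78.0, 77.6)
measured_8bit = tuple(percent_to_8bit(p) for p in measured_pct)

# Placeholder reference value -- substitute the published ProPhoto RGB
# figure for this patch from Lindbloom's ColorChecker page.
reference_8bit = (200, 200, 200)

# Per-channel error in 8-bit counts: a crude first-pass check only.
errors = [m - r for m, r in zip(measured_8bit, reference_8bit)]
print(measured_8bit, errors)
```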



You're kidding with that question, right?

No, I am not. See above. With this approach all this confusion about scene colorimetry is circumvented. If I have an accurate capture of the color checker and print it out using an accurate profile, I should have a good match to the target. If I use a bad white balance, I suspect the match would be poor.
  
It's neither pleasing nor a good reproduction of the scene BECAUSE of the CURRENT WB. That's the point! I'm not sure you are reading and understanding what I and others have written (I'm not the first person to suggest this).

Why don't you try using a proper white balance? The others comprise one or two at most. In these forums, I have found that when a lesser-known person challenges an authority who behaves in an arrogant and demeaning fashion, not many others wish to join in support of the lesser-known person and share in the denigration. Sometimes I receive private messages of support, as I did with the camera color space exchange with you.

Regards,

Bill


digitaldog
Sr. Member
Posts: 8608

« Reply #94 on: November 03, 2011, 06:37:43 PM »
I downloaded the image and looked at it in Camera Raw, and noted that it was overexposed and required a large negative exposure correction.

The raw data is absolutely not overexposed. Are you familiar with ETTR? "Negative exposure" (which is what I rendered in the DNG) is exactly the correct way to normalize ETTR data. It looks overexposed with your default ACR settings, but like the WB, that's simply a starting point. It is not necessarily correct or incorrect, accurate or inaccurate, any more than you can look at a color neg and tell us it's accurate for an undefined filter pack in an enlarger. The warm condition of the WB, much like the appearance you saw initially, is not correct for a proper visual appearance of this image. And that can be changed, which is the reason we shoot and deal with raw data!

Quote
ACR wouldn't let me WB on the white patch.


Well, LR did, so take that up with Adobe. And you can't alter Tint/Temp either? Because I can.

Quote
I looked at the image in Rawnalize and the green channel is heavily clipped. That is not a good square for white balance. The other neutral patches are intact and give reasonable WB values of about 4150K. As I suspected, your WB was screwed up. Why didn't you use a non-clipped area for the WB?

Yes, WB is screwed up in that it produces the wrong color appearance. I got a warm appearance because, like the initial exposure, those are not the correct render settings. And it illustrates again that WB doesn't ensure "accurate" color. It doesn't ensure pleasing or desired color (or tone). So much for the simplistic notion that WB fixes all and provides "accurate" color, a point many of us have tried to illustrate to you for a few days now!

Quote
As I understand Melissa, it uses ProPhotoRGB primaries and linear encoding, but reports the RGB values according to an sRGB tone curve. The percentage values can easily be converted to 8 bit notation.


Yes, so what? That's output-referred. You continue to ignore that this has nothing to do with scene colorimetry. Suppose we open this in another converter that doesn't use ProPhoto with a 1.0 TRC for processing and reports a totally different scale of numbers? How does that in any way tell you about the accuracy of the scene data, and thus the capture? It doesn't.

Quote
However, I would simply export as a TIFF in ProPhotoRGB.


Until you can answer the question above about the scene colorimetry, it doesn't matter what space you use. The export provides a set of numbers. Great. Now, are they accurate? Without providing numbers for the scene, you can't answer that, and haven't from day 1.

This continued discussion is really simple; we don't need to go into color geekdom. You say WB produces accurate color. You have neither described what that means nor how you can prove what is accurate. Example: I can take what you say are accurate neutral RGB values (40/40/40) and convert them to an output color space for an Epson, and the numbers will be far different and not identical RGB values. So if I provide you just those numbers, can you tell us whether they're neutral or not without knowing the original color space? No, you can't. If I send 40/40/40 directly to the Epson, is that a neutral color on the print? Nope. But wait, 40/40/40 is an accurate neutral value. Not with the limited data provided, looking solely at this one color space. So tell us how the values you see in your raw converter are accurate to the scene, or how to transform the original scene colors into this color space to prove they are accurate.

If we want to talk about the accuracy of a measurement, say the time at this very second, you can use a sundial, your wrist watch, or an atomic clock. We can argue about the better accuracy of the atomic clock versus your wrist watch, but we'll be pretty close without splitting hairs; I'm OK being accurate to within a tenth of a second here. The point is, we have values we can use to discuss this time accuracy. We can agree that the sundial may be off by X number of minutes and agree on what level of accuracy we will accept.

We can't do this with your WB belief system, because we don't even have a unit like days, let alone hours or seconds, to define accuracy. And we don't have a method to even gauge the process (the sundial, your wrist watch). You just want us to believe in some level of accuracy and disbelieve there is any level of subjectivity, but you refuse to even define the beginnings of a process. If you don't like me giving you shit about using the term accuracy, then define the accuracy, much as we could discuss the accuracy of gauging the time of day to the second! If you can't do that, then fine, say so. We'll move on and ignore your mangling of the term accuracy.

Quote
With this approach all this confusion about scene colorimetry is circumvented. If I have an accurate capture of the color checker and print it out using an accurate profile, I should have a good match to the target. If I use a bad white balance, I suspect the match would be poor.
 
Accurate how, compared to what? What instrument, with which illuminant, should I use to measure the Macbeth that then perfectly matches the scene illuminant using any specified transformation? If I had the actual spectral sensitivities of the chip AND the scene illuminant, I might be able to do this. But neither you nor I have those. So we're back to you saying WB produces accurate color with absolutely no way to back it up.
Can and will you describe how you came up with this accuracy theory?

Quote
Why don't you try using a proper white balance?


What do you mean? You told us earlier to WB on white (you used clouds as an example), and you also said quite clearly: "For accurate colors, it is usually best to take a WB reading from a neutral card such as a WhiBal." Now you're bringing in exposure (which isn't overexposed) and suggesting there is something wrong with the white on the Macbeth. But that's not really worth going over; it's your continuing language that WB produces an accurate rendering which has yet to be explained, let alone proven. If you care to enlighten us on how this is produced, we can dig into white cards, exposure, and the like. We're far from that point yet.

Quote
The others comprise one or two at most.

I see; then they are as wrong as I am. That's your take. Look, I'm sure I speak for the others: if you can come up with a step-by-step, scientific process and a metric for accuracy, we're all ears. So far, you continue to insist that WB isn't subjective but have provided no methodology to prove that is the case. My DNG illustrates that even what I admit is not the proper white to WB on disproves your simplistic idea, and I can say this from a totally subjective POV (because I saw the scene and the rendering, and it's way off). You've got a raw file that can have an almost unlimited degree of alteration to the final numbers and rendering based on the sliders in your raw converter. So prove to us how we move 'em about to get accurate color, once you define what the accuracy metric is and how you got there. Otherwise, you're wasting everyone's time here.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
bjanes
Sr. Member
Posts: 2756

« Reply #95 on: November 03, 2011, 08:28:22 PM »

The raw data is [sic] absolutely not overexposed. Are you familiar with ETTR? "Negative exposure" (which is what I rendered in the DNG) is exactly the correct way to normalize ETTR data. It looks overexposed with your default ACR settings, but like the WB, that's simply a starting point. It is not necessarily correct or incorrect, accurate or inaccurate, any more than you can look at a color neg and tell us it's accurate for an undefined filter pack in an enlarger. The warm condition of the WB, much like the appearance you saw initially, is not correct for a proper visual appearance of this image. And that can be changed, which is the reason we shoot and deal with raw data!

The raw data are absolutely overexposed, since the green channel is blown in the raw file, as shown by Rawnalize, which looks directly at the raw data without demosaicing. Didn't you look at the raw histogram I posted? This has nothing to do with my ACR settings. I am quite familiar with ETTR, to the extent that I know its rationale is not the number of levels in the brightest f/stop of a raw file, but the signal-to-noise improvement resulting from increased exposure.
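A Rawnalize-style clipping check — counting blown photosites per CFA channel before any demosaicing — can be sketched in a few lines. The tiny Bayer tile below is synthetic; with a real file, a library such as rawpy exposes the undemosaiced data and per-site channel map (as `raw_image` and `raw_colors`), though that toolchain choice is an assumption:

```python
import numpy as np

def channel_clipping(cfa_values, cfa_colors, white_level):
    """Fraction of clipped photosites per CFA channel, pre-demosaic.

    cfa_values: 2-D array of raw sensor counts.
    cfa_colors: same-shape array of channel indices (0=R, 1=G, 2=B, 3=G2).
    """
    fractions = {}
    for ch in np.unique(cfa_colors):
        sites = cfa_values[cfa_colors == ch]
        fractions[int(ch)] = float(np.mean(sites >= white_level))
    return fractions

# Synthetic 2x2 Bayer tile: both green sites pinned at the white level,
# red and blue intact -- the pattern Bill reports for the white patch.
values = np.array([[2000, 4095],
                   [4095, 1000]])
colors = np.array([[0, 1],      # R  G
                   [3, 2]])     # G2 B
result = channel_clipping(values, colors, white_level=4095)
print(result)  # greens (channels 1 and 3) report 1.0; red and blue 0.0
```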

I did look further at the raw file and saw that only the white square of the color checker was blown. You could try to save face by stating that the white square contains no important data and the exposure was thus proper ETTR.


Well LR did so take that up with Adobe. And you can’t alter Tint/Temp either? Cause I can.

Why would I want to take that up with Thomas Knoll? Does he want to enable inexperienced users to white balance on blown channels?

Yes, WB is screwed up in that it produces the wrong color appearance. I got a warm appearance because, like the initial exposure, those are not the correct render settings. And it illustrates again that WB doesn't ensure "accurate" color. It doesn't ensure pleasing or desired color (or tone). So much for the simplistic notion that WB fixes all and provides "accurate" color, a point many of us have tried to illustrate to you for a few days now!

Yes, so what? That's output-referred. You continue to ignore that this has nothing to do with scene colorimetry. Suppose we open this in another converter that doesn't use ProPhoto with a 1.0 TRC for processing and reports a totally different scale of numbers? How does that in any way tell you about the accuracy of the scene data, and thus the capture? It doesn't.

You can make a table of the color numbers of the color checker in any output space, as Bruce Lindbloom has posted on his web site.
 
Until you can answer the question above about the scene colorimetry, it doesn't matter what space you use. The export provides a set of numbers. Great. Now, are they accurate? Without providing numbers for the scene, you can't answer that, and haven't from day 1.

But I do know the colors of the color checker, and I can compare them to the values in the rendered file in a known color space. It is indeed simple.


This continued discussion is really simple; we don't need to go into color geekdom. You say WB produces accurate color. You have neither described what that means nor how you can prove what is accurate. Example: I can take what you say are accurate neutral RGB values (40/40/40) and convert them to an output color space for an Epson, and the numbers will be far different and not identical RGB values. So if I provide you just those numbers, can you tell us whether they're neutral or not without knowing the original color space? No, you can't. If I send 40/40/40 directly to the Epson, is that a neutral color on the print? Nope. But wait, 40/40/40 is an accurate neutral value. Not with the limited data provided, looking solely at this one color space. So tell us how the values you see in your raw converter are accurate to the scene, or how to transform the original scene colors into this color space to prove they are accurate.

No, I did not say that WB alone will give accurate colors, but proper WB along with a good camera profile will help in producing accurate colors. All current CFA sensors have metameric failure. If RGB values are equal in ProPhotoRGB, the color will be neutral. The Epson is non-linear, and a profile using a lookup table is necessary to take the working-space values of 40/40/40 and convert them to numbers that will produce the same neutral color in the print.

What do you mean? You told us earlier to WB on white,...

Well, white patch 1 of the color checker is not neutral, so you didn't white balance on white. Furthermore, no one who knows what he is doing would attempt white balance with a blown channel.

I see; then they are as wrong as I am. That's your take. Look, I'm sure I speak for the others: if you can come up with a step-by-step, scientific process and a metric for accuracy, we're all ears. So far, you continue to insist that WB isn't subjective but have provided no methodology to prove that is the case. My DNG illustrates that even what I admit is not the proper white to WB on disproves your simplistic idea, and I can say this from a totally subjective POV (because I saw the scene and the rendering, and it's way off). You've got a raw file that can have an almost unlimited degree of alteration to the final numbers and rendering based on the sliders in your raw converter. So prove to us how we move 'em about to get accurate color, once you define what the accuracy metric is and how you got there. Otherwise, you're wasting everyone's time here.

Look here at the Colorcheck documentation. You can take a picture of your color checker and determine if the rendered image has color values appropriate for the color space used. If your white balance is off, your colors will be off. What more can I say? This is similar to the camera color space thread where you chose to ignore the opinion of Thomas Knoll and Chris Murphy.
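The Colorcheck-style comparison described above comes down to a delta E computation between each rendered patch and its reference value. A minimal sketch using the simple CIE 1976 metric, with hypothetical Lab numbers illustrating how a warm WB error inflates delta E on a neutral patch:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE 1976 delta E: Euclidean distance in CIELAB. (Imatest also
    reports later metrics such as delta E 94/2000, which weight the
    lightness and chroma components differently.)"""
    return math.dist(lab1, lab2)

# Hypothetical values: a near-neutral reference patch versus the same
# patch measured in renderings made with two different white balances.
reference   = (81.3,  0.3,  0.3)
good_wb     = (80.9,  0.8, -0.2)
too_warm_wb = (80.1,  6.5, 12.4)   # warm cast pushes +a*/+b*

for name, lab in [("good WB", good_wb), ("warm WB", too_warm_wb)]:
    print(f"{name}: dE76 = {delta_e_76(reference, lab):.1f}")
```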

Regards,

Bill
digitaldog
Sr. Member
Posts: 8608

« Reply #96 on: November 03, 2011, 08:43:12 PM »

The raw data are absolutely overexposed, since the green channel is blown in the raw file, as shown by Rawnalize, which looks directly at the raw data without demosaicing.

And yet ACR/LR shows zero clipping on the Macbeth.

Quote
You could try to save face by stating that the white square contains no important data and the exposure was thus proper ETTR.

Well, it's not clipped, so it's moot.

Quote
Why would I want to take that up with Thomas Knoll? Does he want to enable inexperienced users to white balance on blown channels?

I suspect you'd get nowhere with Thomas.

Quote
You can make a table of the color numbers of the color checker in any output space, as Bruce Lindbloom has posted on his web site.


Yes, you can. Which has no relationship to the scene, so how can you say it's accurate (for the umpteenth time)?

Quote
No I did not say that WB alone will give accurate colors, but proper WB along with a good camera profile will help in producing accurate colors.


Again, you provide no means to back that up: either what that means or how to evaluate it from the scene. It's accurate only because you say so.

You continue to ignore the questions asked of you. You continue to talk about clipping, which isn't an issue and plays no role in explaining or providing the accuracy metric or methodology. I'm going to continue to believe you can't.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
bjanes
Sr. Member
Posts: 2756

« Reply #97 on: November 03, 2011, 10:02:06 PM »

And yet ACR/LR shows zero clipping on the Macbeth.

Well, it's not clipped, so it's moot.

I suspect you'd get nowhere with Thomas.

You continue to ignore the questions asked of you. You continue to talk about clipping, which isn't an issue and plays no role in explaining or providing the accuracy metric or methodology. I'm going to continue to believe you can't.

Give it up!  The white square is clipped and you attempted to white balance on a clipped area. ACR/LR is not designed to look at raw files directly. And I suspect that Thomas would ignore your request to enable white balance on clipped channels, should you be so foolish as to make such a request. I have repeatedly told you how I would check the rendered colors of the color checker and compare them to the actual colors of the target. What more can I do? You won't listen. As far as I am concerned, I have proven that you overexposed the target and attempted to white balance on a clipped patch.

I white balanced on patch two and got 4150K, tint +8, which is very close to the auto WB that you felt gave you good results. However, auto WB will not work on scenes having a predominant strong color and no neutrals. It does work when looking at the whole target, with its many colors and neutral areas.

I rest my case. It is fruitless to attempt a discussion with you, as I have seen before, but you have again been proven wrong in some areas. You do not know everything.

Regards,

Bill

stamper
Sr. Member
Posts: 2524

« Reply #98 on: November 04, 2011, 03:41:24 AM »
Bill, the next knock on the door will be from the men in white coats. Please go quietly. You will also find the Digital Dog in the van that will take both of you away for help. BTW, I don't think there is room in the van for your stirring stick. Wink Grin
digitaldog
Sr. Member
Posts: 8608

« Reply #99 on: November 04, 2011, 10:02:29 AM »

The white square is clipped and you attempted to white balance on a clipped area.

The numbers say otherwise. At least in this example, actual numbers, not the made-up ones you've yet to define as accurate.

Agreed, done here. I'm taking that spectrally white van off into the sunset (which I would never WB upon).
« Last Edit: November 04, 2011, 10:05:26 AM by digitaldog »

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/