Author Topic: Colour of light matters  (Read 13418 times)
Iliah
Sr. Member
Posts: 415
« on: July 08, 2013, 10:00:33 AM »

The theory is that, as soon as the shot is white-balanced, colour should be right.

Here is a simple experiment to check it. A ColorChecker SG card was shot with a D4 under halogen lights, first with no filter on the lens, and a second time with a B+W KB15 (80A) colour-conversion filter on the lens. Matrix profiles were calculated from both shots; for the delta E 2000 reports, see the .csv files in http://cl.ly/3x3F0D2p0e3K/D4_16-35_SG_Halogen_dE2000.zip

Sorting by the error value, one can see that in both cases the deep blue-green patch B7 is the worst offender, but with the filter on the lens the error is quite tolerable, while without it the error is extremely significant. Comparing the error values on the other saturated "cold" patches suggests that the camera will not reproduce cold colours faithfully under warm light without a compensating filter.
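For anyone repeating the experiment, the sorting step is easy to script. A minimal sketch in Python, using hypothetical column names `patch` and `dE2000` (the actual headers in the linked .csv files may differ), with a tiny made-up sample standing in for the real report:

```python
import csv
import io

def worst_patches(csv_text, n=3):
    """Return the n patches with the largest dE2000 error, worst first."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: float(r["dE2000"]), reverse=True)
    return [(r["patch"], float(r["dE2000"])) for r in rows[:n]]

# Made-up sample data in place of the real report files:
sample = """patch,dE2000
A1,1.2
B7,9.8
C3,2.5
"""
print(worst_patches(sample, 2))  # B7 should come out on top
```

Point the same function at the real files to reproduce the B7 ranking described above.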
xpatUSA
Sr. Member
Posts: 305
« Reply #1 on: July 08, 2013, 11:22:19 AM »

Interesting. I once did a comparison between some LED and halogen floods using a D50. In that case, patch #10 was the worst for halogen, according to Adobe's DNG Profile Editor.

See the full test method here:

http://kronometric.org/article/lampComp/mas.html

Here are the DNG Profile Editor results, halogen at right:



best regards,

Ted

Jim Kasson
Sr. Member
Posts: 1039
« Reply #2 on: July 08, 2013, 12:13:53 PM »

The theory is that, as soon as the shot is white-balanced, colour should be right.

True under all illumination if and only if the camera meets the Luther-Ives condition, which no commercial camera does. Changing the illumination is equivalent to changing the spectral responses of the patches. Using a filter is likewise equivalent to changing the spectral responses of the patches.

The wavelength-by-wavelength product of the filter and the illumination can be thought of as an illuminating spectrum on a camera with no filter which should produce the same results.
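That equivalence can be stated numerically. A sketch with numpy, using invented spectra sampled at four wavelengths (all arrays here are illustrative placeholders, not measured data):

```python
import numpy as np

# Illustrative spectra sampled at 400, 500, 600, 700 nm (made-up numbers).
illuminant = np.array([0.4, 0.8, 1.0, 1.1])   # warm source: rises toward red
filter_T   = np.array([1.0, 0.9, 0.6, 0.4])   # bluish conversion filter: cuts red
patch_R    = np.array([0.7, 0.5, 0.2, 0.1])   # a "cold" patch reflectance
sensor_S   = np.array([[0.9, 0.2, 0.0, 0.0],  # camera B, G, R channel sensitivities
                       [0.1, 0.8, 0.3, 0.0],
                       [0.0, 0.1, 0.8, 0.5]])

# Camera raw response: sum over wavelength of illuminant x filter x reflectance x sensitivity.
resp_filtered = sensor_S @ (illuminant * filter_T * patch_R)

# Same result if we treat (illuminant * filter) as a new effective illuminant
# on an unfiltered camera: the wavelength-by-wavelength product described above.
effective_illuminant = illuminant * filter_T
resp_equivalent = sensor_S @ (effective_illuminant * patch_R)

assert np.allclose(resp_filtered, resp_equivalent)
```

The two computations are identical term by term, which is exactly why a lens filter and a change of illuminant are interchangeable as far as the sensor is concerned.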

So, what you have demonstrated so precisely and effectively is illuminant metamerism.

Right?

Jim
digitaldog
Sr. Member
Posts: 9225
« Reply #3 on: July 08, 2013, 12:56:06 PM »

Might I suggest this fine piece by Doug Kerr: http://dougkerr.net/pumpkin/articles/Metameric_Error.pdf
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/

Jim Kasson
Sr. Member
Posts: 1039
« Reply #4 on: July 08, 2013, 01:32:33 PM »

Might I suggest this fine piece by Doug Kerr: http://dougkerr.net/pumpkin/articles/Metameric_Error.pdf

Good pick, Andrew. Iliah, I know you're quite technical, so you might appreciate this more quantitative summary of digital capture and output device characterization, including a discussion of illuminant metamerism:

http://wiki.epfl.ch/edicpublic/documents/Candidacy%20exam/dcihchap5devchar.pdf

It's by Raja Balasubramanian of Xerox.

Jim
Tim Lookingbill
Sr. Member
Posts: 1231
« Reply #5 on: July 08, 2013, 09:57:51 PM »

Can someone show a real-world digital camera capture, besides a CC chart, that shows the effects of this illuminant metamerism?

Theory makes my brain hurt because I can never make a connection and/or find a use for the data to aid me in creating better-looking images.
« Last Edit: July 08, 2013, 09:59:34 PM by Tim Lookingbill »

stamper
Sr. Member
Posts: 2876
« Reply #6 on: July 09, 2013, 02:48:09 AM »

Can someone show a real-world digital camera capture, besides a CC chart, that shows the effects of this illuminant metamerism?

Theory makes my brain hurt because I can never make a connection and/or find a use for the data to aid me in creating better-looking images.

Absolutely spot on! However some are into the theory side of things. Personally I prefer the practical side. Smiley
BartvanderWolf
Sr. Member
Posts: 3874
« Reply #7 on: July 09, 2013, 02:51:10 AM »

Can someone show a real world digital camera capture besides a CC chart that shows the effects of this illuminant metamerism?

Hi Tim,

One example that's easy to grasp is the difference between a yellow-wavelength light source and a mix of Red and Green wavelength light sources, which together also produce yellow. Depending on the sensor's sensitivity to Red and Green, different sensors can produce a difference in the yellows that are perceived.

Because the human eye and a Bayer CFA use differently filtered sensors, a given spectral Red/Green mix and a pure yellow will almost certainly be perceived as different yellow colors.
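The yellow example above can be put in numbers. A toy numpy sketch with two invented 2-channel observers: the first one's sensitivities are chosen so that the red+green mix and the pure yellow produce identical responses (a metameric match), while the second, differently shaped set tells them apart:

```python
import numpy as np

# Spectral power at three sample wavelengths: 550 (green), 580 (yellow), 640 (red) nm.
mix_red_green = np.array([1.0, 0.0, 1.0])  # red + green light, no spectral yellow
pure_yellow   = np.array([0.0, 2.0, 0.0])  # a single yellow-wavelength source

# Two invented observers (rows = channels, columns = wavelengths).
observer_eye    = np.array([[0.5, 0.75, 1.0],
                            [1.0, 0.75, 0.5]])
observer_camera = np.array([[0.2, 0.9, 1.0],
                            [1.0, 0.9, 0.2]])

eye_a, eye_b = observer_eye @ mix_red_green, observer_eye @ pure_yellow
cam_a, cam_b = observer_camera @ mix_red_green, observer_camera @ pure_yellow

print(np.allclose(eye_a, eye_b))  # the "eye" sees the same yellow for both stimuli
print(np.allclose(cam_a, cam_b))  # the "camera" separates them
```

The numbers are fabricated for illustration; the point is only that a metameric match under one set of spectral sensitivities need not survive under another.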

Cheers,
Bart
stamper
Sr. Member
Posts: 2876
« Reply #8 on: July 09, 2013, 03:09:10 AM »

Is this information a hindrance when you take an image? Should it stop you taking one? If the eye sees yellow differently from the camera, then imo you should carry on and take the image, because when I print or post to the internet, the viewer won't know the difference.
BartvanderWolf
Sr. Member
Posts: 3874
« Reply #9 on: July 09, 2013, 03:19:50 AM »

Is this information a hindrance when you take an image?

Hi,

Well, that depends on whether, e.g., you are into accurate reproduction of artwork ... A mix of Red and Green pigments may produce the same yellow as a purer yellow pigment, depending on the spectrum of the illuminant and the sensitivity of the Bayer CFA-filtered sensels.

In other circumstances, e.g. photojournalism, it may be less of a show-stopper. It only becomes a hindrance if it's important and cannot be rectified by accurate profiling. After all, this is a forum about Color Management, so feel free to ignore it if you do not care about that.

Cheers,
Bart
Simon Garrett
Sr. Member
Posts: 395
« Reply #10 on: July 09, 2013, 03:50:21 AM »

It's an issue for things like product photography. You might, for example, have a product with parts made of different materials (plastic and metal, say) that are intended to be the same colour. Quite possibly the parts are the same colour to the eye but have different reflectance spectra.

The manufacturer may have gone to some trouble to ensure that the two parts look the same colour (or acceptably close) to the eye under various common illuminants, but they may not look the same in a photographic image. No amount of global fiddling (e.g. white balance) can fix that - only local touching up.
opgr
Sr. Member
Posts: 1125
« Reply #11 on: July 09, 2013, 04:11:38 AM »

It's an issue for things like product photography. You might, for example, have a product with parts made of different materials (plastic and metal, say) that are intended to be the same colour. Quite possibly the parts are the same colour to the eye but have different reflectance spectra.

The manufacturer may have gone to some trouble to ensure that the two parts look the same colour (or acceptably close) to the eye under various common illuminants, but they may not look the same in a photographic image. No amount of global fiddling (e.g. white balance) can fix that - only local touching up.

+1,

And another real world example that can not be shown for obvious reasons:

Late-afternoon light has a tendency to give a very deep and rich green color to tree leaves. A digital camera will not capture that color. It will capture a green color, but it will not have the same relative intensity and depth, and it is likely off in hue. I suspect this has something to do with the infrared spectrum and how it interacts with the leaves and the processes inside a leaf, but I am not sure. I do know the effect is very visible and pronounced.

Another very distinct example: butterflies. Besides the fact that some butterfly wings are marvels of color and are absolutely fantastically bio-engineered from a color-reflection point of view, they also have a tendency to screw up your camera's color response. Circles that show a lot of color to the eye appear completely black on camera. Again, I suspect this is partly due to infrared and ultraviolet reflectivity.





Regards,
Oscar Rysdyk
theimagingfactory

elied
Sr. Member
Posts: 272
« Reply #12 on: July 09, 2013, 06:03:30 AM »

It seems to me that the bottom-line practical lesson here is that, when shooting in light that deviates significantly from D50, the camera may need some help, and the ex-film shooter might do well to get that old shoe box down from the top shelf of the closet and dig out the old CC filters.
Roll over Ed Weston,
Tell Ansel Adams the news

BartvanderWolf
Sr. Member
Posts: 3874
« Reply #13 on: July 09, 2013, 07:00:08 AM »

It seems to me that the bottom-line practical lesson here is that, when shooting in light that deviates significantly from D50, the camera may need some help, and the ex-film shooter might do well to get that old shoe box down from the top shelf of the closet and dig out the old CC filters.

Hi,

For a 'daylight' profile, that will work just fine. It's usually a good idea to get good-quality light as input rather than trying to 'fix' things afterwards (although sometimes that's all we can do, and the result doesn't have to be that bad either).

It all starts with good-quality, broad-spectrum light (an incandescent source), with spectral peaks subdued if possible. That provides a good basis for removing ambient reflected-light casts. But with a trichromatic capture device, instead of a multiple (many more than three) spectral-band sampling device, metamerism will always be an issue (even if we tried to mimic the three basic eye colour-matching function curves, although that would reduce the errors).

Cheers,
Bart
Iliah
Sr. Member
Posts: 415
« Reply #14 on: July 09, 2013, 07:52:22 AM »

Hi Andrew,

It is not exactly the usual metameric error. The catch is that the sensor is linear with respect to the power of the illumination only while the light stays the same. If you have a series of patches with the same L and different a and b values, changing the illumination and building proper profiles will not result in the same L values for different light spectra.
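One half of the distinction being drawn here, the linearity, is easy to demonstrate: raw response scales linearly with the power of a fixed spectrum, but a change of spectral shape alters the channel ratios in a way no global scale (exposure- or white-balance-style correction) can undo. A toy numpy sketch with invented sensitivities (it illustrates the linearity point only, not the full profiling argument about L values):

```python
import numpy as np

sensor_S = np.array([[0.9, 0.2, 0.0],   # invented 3-channel sensitivities
                     [0.1, 0.8, 0.3],
                     [0.0, 0.2, 0.9]])
patch_R = np.array([0.6, 0.6, 0.6])     # a neutral-ish patch reflectance

light = np.array([0.8, 1.0, 1.2])       # warm-ish spectrum

# Doubling the power of the SAME spectrum doubles every channel: linear.
r1 = sensor_S @ (light * patch_R)
r2 = sensor_S @ (2.0 * light * patch_R)
assert np.allclose(r2, 2.0 * r1)

# A DIFFERENT spectrum with the same total power changes the channel ratios,
# so no single global scale factor maps one response onto the other.
light2 = np.array([1.2, 1.0, 0.8])      # cool-ish spectrum, same total power
r3 = sensor_S @ (light2 * patch_R)
assert not np.allclose(r3 / r3.sum(), r1 / r1.sum())
```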
digitaldog
Sr. Member
Posts: 9225
« Reply #15 on: July 09, 2013, 08:23:13 AM »

Theory makes my brain hurt because I can never make a connection and/or find a use for the data to aid me in creating better-looking images.

Going back to the '80s, I did an annual report where each section was separated by a colored page curl, which I had to shoot on 4x5. All the colors except this one particular blue Pantone paper reproduced on film close enough that the art director was happy. But the blue shifted radically. I had to go out and find a lot of colors near that blue, shoot a test, and find the processed film that matched what the AD wanted. It hurt my brain too, but in the end I had to create a better-looking image for the client, and at the time that was the best solution.

See if you can find some fluorescent plastics of similar colors on some product, shoot them under differing illuminants, and see if you find a shift between the two differently manufactured pieces that should look the same. That might help with the connections.
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/

opgr
Sr. Member
Posts: 1125
« Reply #16 on: July 09, 2013, 08:53:19 AM »

Now that you mention it, OBAs in (printer) paper will eff up my camera response quite effectively. That would be relevant if you try to use the paper for white balancing. Bad idea...

(I haven't tried whether a UV filter will help, but that effs up the camera response in its own magical way.)
Regards,
Oscar Rysdyk
theimagingfactory

Jim Kasson
Sr. Member
Posts: 1039
« Reply #17 on: July 09, 2013, 09:53:49 AM »

Theory makes my brain hurt because I can never make a connection and/or find a use for the data to aid me in creating better-looking images.

I posted this before, in this thread, which has some example images.

Ed Giorgianni once told me that, in the film days, a wedding photographer sent him a picture of the husband-to-be, the best man, and all the ushers. The image had been made on color negative film.  All were wearing black tuxedos. None of the tuxedos was black in the picture. None of the tuxedos was the same color as any one of the other tuxedos. Ed said that the black aniline dyes used to dye clothing are strongly reflective in the infrared, and that the three layers of the film had some sensitivity there. The result was a disaster for the photographer.

The solution to IR problems is the same in digital capture as in the film days: UV- or IR-cut filters over the lens or the chip, or both. The IR-cut filters that come on most cameras are inadequate for some combinations of subject matter and illumination.

The IR case can be thought of as an example of the larger problem.

Jim
« Last Edit: July 09, 2013, 09:55:50 AM by Jim Kasson »

Tim Lookingbill
Sr. Member
Posts: 1231
« Reply #18 on: July 09, 2013, 10:58:15 AM »

I posted this before, in this thread, which has some example images.

Ed Giorgianni once told me that, in the film days, a wedding photographer sent him a picture of the husband-to-be, the best man, and all the ushers. The image had been made on color negative film.  All were wearing black tuxedos. None of the tuxedos was black in the picture. None of the tuxedos was the same color as any one of the other tuxedos. Ed said that the black aniline dyes used to dye clothing are strongly reflective in the infrared, and that the three layers of the film had some sensitivity there. The result was a disaster for the photographer.

The solution to IR problems is the same in digital capture as in the film days: UV- or IR-cut filters over the lens or the chip, or both. The IR-cut filters that come on most cameras are inadequate for some combinations of subject matter and illumination.

The IR case can be thought of as an example of the larger problem.

Jim

The more variables you set up to build your anecdote, the less it provides consistently predictable, accurate, and usable data to mitigate against when creating better-looking pictures. Film does not come very close to reacting to and recording light the way a digital sensor does. Editing 3000 Raws vs. about 100 scanned negatives told me that.

One of the major variables not mentioned here, or in Andrew's linked Doug Kerr PDF (which offered plastic-vs-paint illuminant-reflectance variances in kitchen devices as its anecdote), is the fact that sensors only record/measure voltage charges in grayscale, which get redefined by software as color on an RGB display after those grayscale measurements go through the A/D converter. The display's RGB filtering of those grayscale pixels would have to be known and compared against the sensor's RGB spectral-transmission filtering to know exactly where the errors occur.

All in all, we're nowhere close to mitigating this issue, and we just resort to selective color editing as we've always done. Of the 3000 Raws I've shot under lights of extremely varying spectra, I've really never seen this as a big problem to overcome in the post-processing stage. The blue/purple flower example Jim Kasson linked to doesn't really prove or point to the source of the causality with any consistency, because I've shot similar flowers and sometimes they're purple and sometimes they're blue under the same light.

I do notice this blue/purple issue with flowers whenever I've been out too long on a hot day and my batteries are running low, which points to a heat issue, but there's still no proof or consistency in coming up with ways to avoid or correct it except in post-processing.

I just think there are WAY too many variables that haven't been considered to know for sure that this illuminant metamerism is the cause, let alone to consistently mitigate against it.
« Last Edit: July 09, 2013, 11:01:50 AM by Tim Lookingbill »

Schewe
Sr. Member
Posts: 5532
« Reply #19 on: July 09, 2013, 12:24:45 PM »

I just think there are WAY too many variables that haven't been considered to know for sure that this illuminant metamerism is the cause, let alone to consistently mitigate against it.

A camera sensor captures the light passing through the RGGB filters on its photosites. The specific bands of color those filters pass or cut make up the spectral response of a sensor. As far as I know, most camera sensors are currently designed to capture "daylight", or about D50-D55. The SPD of daylight is pretty specific and pretty even across the visible spectrum. The moment a light source with a different SPD hits the sensor, the sensor's response will be different. So that's the basic reason for the lack of a metameric match. Throw in a spiky SPD (such as with fluorescent lights) and the sensor response will be different again.

The whole reason that Thomas Knoll designed Camera Raw to have two separate DNG profiles (one for Standard Illuminant A, 2856 K, and a different profile for D65) was to account for the differences in the sensor's response under those two SPDs. Note that this isn't about correcting for white balance... DNG profiles are designed to correct for different spectral responses AFTER white balance has been corrected.
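The dual-illuminant scheme can be sketched in a few lines. The DNG specification blends the two calibration matrices linearly in inverse correlated colour temperature (i.e. in mired); the 3x3 matrices below are made-up placeholders, not real camera calibration data:

```python
import numpy as np

CCT_A, CCT_D65 = 2856.0, 6504.0  # calibration illuminants, in kelvin

# Made-up colour matrices for the two calibration illuminants.
M_A   = np.array([[0.90, 0.10, 0.00],
                  [0.05, 0.90, 0.05],
                  [0.00, 0.20, 0.80]])
M_D65 = np.array([[0.80, 0.15, 0.05],
                  [0.10, 0.85, 0.05],
                  [0.05, 0.15, 0.80]])

def interpolated_matrix(cct):
    """Blend the calibration matrices linearly in 1/CCT, clamped to the calibration span."""
    w = (1.0 / cct - 1.0 / CCT_D65) / (1.0 / CCT_A - 1.0 / CCT_D65)
    w = min(max(w, 0.0), 1.0)  # outside the span, fall back to the nearer matrix
    return w * M_A + (1.0 - w) * M_D65

# At the calibration endpoints we recover the calibration matrices exactly.
assert np.allclose(interpolated_matrix(2856.0), M_A)
assert np.allclose(interpolated_matrix(6504.0), M_D65)
```

For a white-balance setting between the two illuminants, the raw converter applies a matrix that is a weighted mix of the two calibrations, which is how one profile pair covers the range in between.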