Author Topic: 2.2 or L* ?  (Read 16363 times)
jljonathan
Full Member
Posts: 143

« on: October 18, 2009, 11:57:46 AM »

I have used ColorEyes to calibrate an older-model iMac 24" to 90 cd/m2. I have run it using 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer a recommendation as to which would be preferable? I work in PS CS4 using ProPhoto, 16-bit.
Thanks
Jonathan
Logged
Czornyj
Sr. Member
Posts: 1424

« Reply #1 on: October 18, 2009, 12:47:35 PM »

Quote from: jljonathan
I have used Coloreyes to calibrate an older model Imac 24" to 90 cd/m2. I have run it using 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer some recommendation as to which would be preferable. I work in PS CS4 using ProPhoto 16 bit.
Thanks
Jonathan

It's not a good idea to calibrate an iMac - which is natively calibrated to gamma 2.2 - to an L* TRC. It's also not a good idea to calibrate to L* when your working space (ProPhoto) is gamma 1.8 encoded. In your situation, gamma 2.2 is the optimal choice.
« Last Edit: October 18, 2009, 12:48:01 PM by Czornyj »

Mark D Segal
Contributor
Sr. Member
Posts: 6983

« Reply #2 on: October 18, 2009, 02:24:08 PM »

You didn't explain the nature of the differences you are seeing. In the final analysis, one purpose of colour management (though not the only one) is to achieve a good match between what you see on your display and what comes out of your printer. Use whichever gamma setting systematically does a better job of this with your equipment and images.

Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml
Czornyj
Sr. Member
Posts: 1424

« Reply #3 on: October 18, 2009, 03:09:13 PM »

Quote from: MarkDS
You didn't explain the nature of the differences you are seeing. In the final analysis, one purpose of colour management (though not the only one) is to achieve a good match between what you see on your display and what comes out of your printer. Use whichever gamma setting systematically does a better job of this with your equipment and images.

It's actually a good idea to match the gamma of the monitor to the TRC of your working space to minimize banding, but it pays off only if you have a display with a high-bit panel, a high-bit LUT, and hardware calibration. In the case of an L*-calibrated iMac and ProPhoto, the banding will only get worse.
« Last Edit: October 18, 2009, 03:29:15 PM by Czornyj »

Mark D Segal
Contributor
Sr. Member
Posts: 6983

« Reply #4 on: October 18, 2009, 06:46:37 PM »

I calibrated and profiled my LaCie 321 (1st edition) with ColorEyes Display, used L* as the gamma parameter, and I have no problem with banding.

Czornyj
Sr. Member
Posts: 1424

« Reply #5 on: October 19, 2009, 02:57:53 AM »

Quote from: MarkDS
I calibrated and profiled my LaCie 321 (1st edition) with ColorEyes Display, used L* as the gamma parameter and I have no problem with banding.

But the LaCie is a 10-bit panel with a 12-bit LUT and internal calibration. On an iMac you can only calibrate the 8-bit LUT on the graphics card. The iMac is precalibrated to gamma 2.2, so when you calibrate it to L* you lose at least 22 levels per channel, and then you lose another 20-30 levels per channel displaying a gamma 1.8 encoded image on an L*-calibrated display.
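Czornyj's level-loss figure can be roughly sanity-checked with a short Python sketch (an illustration, not code from the thread): it builds the 8-bit video-card LUT that retargets a natively gamma 2.2 panel to the L* tone curve, quantizes it as a card LUT must, and counts the distinct output levels that survive. The 2.2 native gamma and the 8-bit LUT depth are the assumptions stated above.

```python
import numpy as np

def lstar_to_linear(v):
    # Inverse of the CIE L* lightness curve; v is L*/100, returns linear Y in [0,1]
    L = 100.0 * v
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

# The 256 entries of an 8-bit video-card LUT, normalized to [0,1]
x = np.arange(256) / 255.0

# Entry needed so that (lut entry)^2.2 on the panel reproduces the L* target curve
lut = lstar_to_linear(x) ** (1.0 / 2.2)

# The card LUT itself is 8-bit, so quantize
lut8 = np.round(lut * 255.0).astype(int)

lost = 256 - len(np.unique(lut8))
print(f"levels lost per channel: {lost}")
```

The count lands in the same ballpark as the "at least 22 levels" quoted above; the exact number depends on the rounding model.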

Mark D Segal
Contributor
Sr. Member
Posts: 6983

« Reply #6 on: October 19, 2009, 07:37:45 AM »

OK, I missed the iMac part of it, and the gamma recommendation depends, inter alia, on the hardware one is using.


digitaldog
Sr. Member
Posts: 9192

« Reply #7 on: October 19, 2009, 09:36:39 AM »

L* calibration for displays is all the rage in Europe for some reason. It’s yet to be proven useful in any peer-reviewed, scientific piece I’m aware of. This was debated on the ColorSync list a while ago. One of the most readable posts was from Lars Borg of Adobe (their head color scientist):


Quote
L* is great if you're making copies. However, in most other
scenarios, L* out is vastly different from L* in.  And when L* out is
different from L* in, an L* encoding is very inappropriate as
illustrated below.

Let me provide an example for video. Let's say you have a Macbeth
chart. On set, the six gray patches would measure around  L* 96, 81,
66, 51, 36, 21.

Assuming the camera is Rec.709 compliant, using a 16-235 digital
encoding, and the camera is set for the exposure of the Macbeth
chart, the video RGB values would be 224,183,145,109,76,46.

On a reference HD TV monitor they should reproduce at L* 95.5, 78.7,
62.2, 45.8, 29.6, 13.6.
If say 2% flare is present on the monitor (for example at home), the
projected values would be different again, here: 96.3, 79.9, 63.8,
48.4, 34.1, 22.5.

As you can see, L* out is clearly not the same as L* in.
Except for copiers, a system gamma greater than 1 is a required
feature for image reproduction systems aiming to please human eyes.
For example, film still photography has a much higher system gamma
than video.

Now, if you want an L* encoding for the video, which set of values
would you use:
96, 81, 66, 51, 36, 21 or
95.5, 78.7, 62.2, 45.8, 29.6, 13.6?
Either is wrong, when used in the wrong context.
If I need to restore the scene colorimetry for visual effects work, I
need 96, 81, 66, 51, 36, 21.
If I need to re-encode the HD TV monitor image for another device,
say a DVD, I need 95.5, 78.7, 62.2, 45.8, 29.6, 13.6.

In this context, using an L* encoding would be utterly confusing due
to the lack of common values for the same patches.  (Like using US
Dollars in Canada.)
Video solves this by not encoding in L*. (Admittedly, video encoding
is still somewhat confusing. Ask Charles Poynton.)

When cameras, video encoders, DVDs, computer displays, TV monitors,
DLPs, printers, etc., are not used for making exact copies, but
rather for the more common purpose of pleasing rendering, the L*
encoding is inappropriate as it will be a main source of confusion.

Are you planning to encode CMYK in L*, too?

And
Quote
This discussion seems to use the wrong premises,
focusing on one very narrow point,  the "optimal"
TRC for non-rerendered grays. This is a rat hole.
Grays make up only 0.00152588 percent of your
device colors. Unrendered colors are
uninteresting, unless you're making copiers. So
look at the other issues.

First, why does the L* gray TRC matter, when all
devices, even the eye, have to rerender for
optimal contrast (unless you're making copiers).
The digital system has to recode any (L* or not)
encoded data into other (L*) encoded data. To
make this more clear, optimally reproduce the
same image on newsprint, SWOP and transparency.
Clearly, the in-gamut L* values will differ. Show
me how an L* TRC would reduce quantization errors
when re-encoding from one dynamic range to
another.

Second, so far this discussion has completely
ignored colors. Even with L* TRCs, you have to
encode non-neutral colors. I know of no 8-bit
encoding (L* or not) that reduces quantization
errors when you convert say all saturated greens
from eci RGB to Adobe RGB. Show me the
quantization errors with different TRCs.

Third, where is the scientific foundation? Where
is the science that shows that the eye has a
natural L* TRC for any arbitrary color, not only
for neutrals? Where is the science that shows
that the eye has a natural L* TRC for neutrals at
arbitrary luminance levels and arbitrary flare
levels?
As far as I can tell, CIE LAB doesn't show any such thing.

I'm not picking on anyone in particular, but maybe you, Karl, could answer?

Lars
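Lars's video example can be approximated numerically. The sketch below (illustrative, not from the thread) assumes a plain Rec.709 OETF on the camera side and a pure gamma 2.2 reference display - assumptions, since his exact monitor model isn't given, so the reproduced L* values only approximate his figures (most visibly near black) - but the qualitative point survives: L* out is not L* in.

```python
import numpy as np

def lstar_to_Y(L):
    # CIE L* lightness -> linear luminance Y
    L = np.asarray(L, dtype=float)
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

def Y_to_lstar(Y):
    # Linear luminance Y -> CIE L*
    Y = np.asarray(Y, dtype=float)
    return np.where(Y > 216.0 / 24389.0, 116.0 * Y ** (1.0 / 3.0) - 16.0, 903.3 * Y)

def rec709_oetf(Y):
    # Rec.709 camera transfer function
    return np.where(Y < 0.018, 4.5 * Y, 1.099 * Y ** 0.45 - 0.099)

# Macbeth gray patches measured on set
L_in = np.array([96.0, 81.0, 66.0, 51.0, 36.0, 21.0])

# Camera: scene luminance -> 16-235 code values
code = np.round(16.0 + 219.0 * rec709_oetf(lstar_to_Y(L_in)))

# Reference monitor: code values -> displayed luminance (assumed pure gamma 2.2)
Y_out = ((code - 16.0) / 219.0) ** 2.2
L_out = Y_to_lstar(Y_out)

print(code)   # 224, 183, 145, 109, 76, 46 - matching Lars's code values
print(L_out)  # reproduced L*: consistently below L_in, most of all in the shadows
```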


Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
digitaldog
Sr. Member
Posts: 9192

« Reply #8 on: October 19, 2009, 09:41:13 AM »

Quote from: Czornyj
iMac is gamma 2.2 precalibrated...


I don’t know that’s true for all iMacs. I recall a session at PPE with Karl Lang and Chris Murphy reporting that their data showed that the iMacs they tested actually had a native TRC gamma of 1.8. This was a good two years ago, and it wouldn’t surprise me based on Apple’s ideas about gamma, which they thankfully fixed in Snow Leopard.

jljonathan
Full Member
Posts: 143

« Reply #9 on: October 19, 2009, 10:40:11 AM »

Thanks for the replies. 2.2 seems to be what's suggested and what I will try. Apple's site also recommends 2.2 and D65.
Jonathan
Czornyj
Sr. Member
Posts: 1424

« Reply #10 on: October 19, 2009, 10:49:21 AM »

Quote from: digitaldog
I don’t know that’s true for all iMacs. I recall a session at PPE with Karl Lang and Chris Murphy reporting that their data showed that the iMacs they tested actually had a native TRC gamma of 1.8. This was a good two years ago, and it wouldn’t surprise me based on Apple’s ideas about gamma, which they thankfully fixed in Snow Leopard.

Quote from: jljonathan
Thanks for the replies. 2.2 seems to be what's suggested and what I will try. Apple's site also recommends 2.2 and D65.
Jonathan

If my memory serves me well, the two samples of the white 24" iMac I profiled were ~2.2 rather than 1.8, but that could change over time. The native white point was at ~6700-6800K, so yes, D65 seems a good recommendation.
« Last Edit: October 19, 2009, 10:52:15 AM by Czornyj »

JeffKohn
Sr. Member
Posts: 1671

« Reply #11 on: October 19, 2009, 01:54:54 PM »

I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.

Czornyj
Sr. Member
Posts: 1424

« Reply #12 on: October 19, 2009, 02:43:16 PM »

Quote from: JeffKohn
I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.

On an L*-calibrated display the shadows will only look different in a non-color-managed environment; otherwise it doesn't make a difference.
« Last Edit: October 19, 2009, 02:44:27 PM by Czornyj »

Brian Gilkes
Sr. Member
Posts: 431

« Reply #13 on: October 19, 2009, 07:02:30 PM »

The opening up of shadows can be very useful for some images, especially if perceptual edits for profiles or RIPs ensure that deep saturated shadows maximize colour rather than black ink. You can get deep, old-tapestry colours. I would still stick to 2.2, though. A superior approach is to duplicate the file, convert the duplicate's mode to L*, and duplicate its background. Move this layer onto the top of the original file and blend on Luminosity. This will open shadows; if the effect is too great, adjust opacity. This gives control and a printable result. Assigning the screen response to L* will not, as screen steps and printer steps will not match. I see no reason to adjust Mac screens to 1.8. In addition, if an accurate perceptual response is required, a file space using Joe Holmes' spaces will give a better result than Adobe RGB etc. With screen, file and output spaces all to be considered, the whole thing gets a bit complicated. If all else fails, follow the instructions.
Hope this helps
Brian
www.pharoseditions.com.au
MPatek
Newbie
Posts: 26

« Reply #14 on: October 20, 2009, 12:25:47 AM »

Recently, I did a comparison of gamma 2.2 and L-star calibration using theoretical models as well as a "real" calibrated display. Graphs are included in the rather technical discussion at this page.

As others have already pointed out, calibration to L* gamma indeed results in opened shadows, clearly apparent in non-color-managed viewers with untagged images. I found this distracting when viewing images, web sites and pictures on the web. This shadow opening is noticeable against both gamma 1.8 and 2.2. I remember seeing quite posterized web graphics - compressed JPEGs optimized for gamma 2.2. So unless you work with ICC-profile-tagged images in a completely color-managed environment, I also recommend using a gamma of 2.2 for general work.
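The shadow-opening effect is easy to put numbers on. A small sketch (illustrative; the deep-shadow code value 30 is an arbitrary choice) compares the relative luminance the same untagged 8-bit value produces on an L*-calibrated display versus a gamma 2.2 display:

```python
def lstar_eotf(v):
    # L*-calibrated display: normalized code value v -> relative luminance
    L = 100.0 * v
    return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3

def gamma_eotf(v, g=2.2):
    # Simple power-law display
    return v ** g

v = 30 / 255.0                 # a deep shadow code value in an untagged image
y_lstar = lstar_eotf(v)
y_g22 = gamma_eotf(v)
print(f"L* display: {y_lstar:.4f}   gamma 2.2 display: {y_g22:.4f}   "
      f"ratio: {y_lstar / y_g22:.2f}x")
```

The L* display emits roughly half again as much light for that shadow value, which is exactly the "opened shadows" described above.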

Marcel

_____________________________

Digital Photography Marcel Patek
http://www.marcelpatek.com and http://photo.marcelpatek.com
Arkady
Newbie
Posts: 19

« Reply #15 on: October 21, 2009, 04:06:29 PM »

Lars is talking about encoding. I don't think it is directly applicable to monitor rendering (though indirectly it is, and the conclusion could be different).

Quote from: jljonathan
I have used Coloreyes to calibrate an older model Imac 24" to 90 cd/m2. I have run it using 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer some recommendation as to which would be preferable. I work in PS CS4 using ProPhoto 16 bit.

The idea behind gamma 2.2 is to have a non-color-managed solution for an sRGB workflow. There are no other benefits, except that manufacturers may hardwire their monitors to be close to 2.2.

The idea behind L* is to have as linear a color transform as possible while rendering to a monitor. That is to say, since all color transforms happen in a single 3D LUT, commonly with tetrahedral interpolation, L*-ing your monitor allows a more linear dependency between the PCS (CIE Lab) and the device (your monitor). If L* were the only nonlinear component in the transformation, then matching your monitor to L* would have allowed "lossless" color rendering to the monitor - namely, no bit resolution sacrificed during rendering. That means it would reduce banding! Sounds exciting?! Well, the problem is that your monitor has its own LUT, so if the native gamma is 2.7 (quite a common case), any deflection from the native gamma results in losing resolution bits.

Is it beneficial? Theoretically, it might be - if you have a higher-bit monitor that allows some room for TRC adjustment without losing color resolution, or if the monitor is a CRT driven through a VGA cable.

However, in general the answer will depend on many components - the color transform LUT dimensions, the interpolation used by the CMM, the monitor's native gamma, and the LUT bit depth and size.

I would think that in the real world an optimal TRC lies somewhere between the native gamma and L*.
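The deflection-from-native-gamma cost can be sketched with a levels-counting exercise (illustrative; the 2.7-native panel is the example figure from the post, and the 8-bit card LUT is assumed): build the LUT that retargets the panel to various TRCs and count the surviving output levels.

```python
import numpy as np

def target_curve(x, target):
    # Desired luminance response for each normalized LUT input;
    # target is either a gamma exponent or the string "L*"
    if target == "L*":
        L = 100.0 * x
        return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)
    return x ** target

def levels_kept(native_gamma, target):
    # 8-bit card LUT entry needed so (entry)^native_gamma reproduces the target TRC
    x = np.arange(256) / 255.0
    lut = target_curve(x, target) ** (1.0 / native_gamma)
    return len(np.unique(np.round(lut * 255.0).astype(int)))

for tgt in (2.7, 2.2, 1.8, "L*"):
    print(tgt, levels_kept(2.7, tgt))
```

Leaving a 2.7-native panel at its native response keeps all 256 levels; every other target - 2.2, 1.8, or L* - throws some away, which is the point about deflection above.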


BTW, I don't think ProPhoto is a good choice, at least not in an ICC-based workflow.
« Last Edit: October 21, 2009, 04:07:57 PM by Arkady »

digitaldog
Sr. Member
Posts: 9192

« Reply #16 on: October 21, 2009, 06:14:08 PM »

Quote from: Arkady
The idea behind gamma 2.2 is to have a non-color-managed solution for an sRGB workflow. There are no other benefits, except that manufacturers may hardwire their monitors to be close to 2.2.

No, there ARE other benefits, mainly that the TRC gamma of displays is generally pretty darn close to 2.2!

Of course Lars is talking about encoding, that’s the entire topic here!

As for ProPhoto, let’s not go there; there’s too much else we’d disagree about...

tho_mas
Sr. Member
Posts: 1696

« Reply #17 on: October 22, 2009, 10:01:10 AM »

Quote from: JeffKohn
I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.
Color managed, and based on 16-bit files, you should not actually notice any difference. The problem is Eizo's calibration software: L* is simply inaccurate in ColorNavigator, so the entire distribution of luminance will be off when you use L* as the target in ColorNavigator, whereas gamma 2.2 or 1.8 (or any gamma) works fine.



Arkady
Newbie
Posts: 19

« Reply #18 on: October 22, 2009, 12:40:55 PM »

Quote from: digitaldog
No, there ARE other benefits, mainly that the TRC gamma of displays is generally pretty darn close to 2.2!

Well, that's what I said, didn't I?


Though I respectfully disagree about the "darn close" part - it may, however, be a dated disagreement. A few years back, even on mid-to-high-range LCD monitors that set gamma on a per-channel basis, the resulting gamma on device neutrals was usually far off from 2.2 - usually higher; as I mentioned, 2.7 could be quite a common number.

Quote from: digitaldog
Of course Lars is talking about encoding, that’s the entire topic here!

No. Losing bit resolution due to interpolation is not an encoding problem, it is a round-off problem - and that is what I was talking about in the previous post when explaining the rationale behind L* calibration. They have different causes, different math behind them and, correspondingly, different optimal solutions.

Though again, I don't advocate L* calibration, and I completely agree with you that the advantages of L* are yet to be determined.


tho_mas
Sr. Member
Posts: 1696

« Reply #19 on: October 22, 2009, 02:39:10 PM »

Quote from: Arkady
I completely agree with you that the advantages of L* are yet to be determined.
In 8-bit workflows the advantages are quite obvious, e.g. if you are printing gamma 2.2 files. In 16-bit workflows, TRC translations and quantization errors are negligible IMO.
The advantages - if you'd like to consider them advantages - are e.g.:
- an equidistant, perceptually uniform TRC
- mid gray = L* 50|0|0 = RGB 127/127/127
Both might be helpful for editing with gradation curves and other global adjustments.
Gamma 1.8 differentiates better in bright tonal values and gamma 2.2 in dark ones; L* differentiates equally all along the gray axis.
This is why L* color spaces can be considered "media-" or "device-independent" (gamma 2.2 actually refers to TV, and gamma 1.8 to print media).
So... the concept is very good - at least much better than the mixture of gamma 2.2 for the display (only for historical reasons) and ~gamma 1.8 for printers (the advantages of this mismatch are yet to be determined ;-)... ).
Still, in 16-bit workflows this is negligible.
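The mid-gray claim can be checked numerically (an illustrative sketch using the CIE L* formula): on an L*-calibrated display, code 127 lands at mid lightness by construction, while the same code on a gamma 2.2 display measures noticeably lighter.

```python
def Y_to_lstar(Y):
    # Linear luminance -> CIE L* lightness
    return 116.0 * Y ** (1.0 / 3.0) - 16.0 if Y > 216.0 / 24389.0 else 903.3 * Y

def lstar_display(v):
    # L*-calibrated display: luminance chosen so L*(Y) is linear in code value v
    L = 100.0 * v
    return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3

v = 127 / 255.0
L_on_lstar = Y_to_lstar(lstar_display(v))   # ~49.8: code 127 sits at mid lightness
L_on_g22 = Y_to_lstar(v ** 2.2)             # noticeably above 50 on a gamma 2.2 display
print(L_on_lstar, L_on_g22)
```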
« Last Edit: October 22, 2009, 02:40:04 PM by tho_mas »