Author Topic: (8 bit internal LUTs) LCD whitepoint adjustment  (Read 33798 times)
Serge Cashman
Full Member
« on: May 30, 2006, 11:35:51 PM »

I would like to adjust the white point of an LCD monitor (one without 10-bit or higher internal LUTs) to a certain target.

From what I understand, the monitor's RGB buttons change the internal monitor LUTs.

Video card LUTs work the same way gamma adjustments do: the video card LUTs are loaded during startup.

I understand there are TWO sets of 8-bit LUTs in this case, both of which can be adjusted to achieve the target white point, and both of which have the same 8-bit color adjustment problem.
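A rough sketch of why stacking two 8-bit stages costs precision (the curve shapes below are arbitrary assumptions, not measurements of any real monitor or card):

```python
import numpy as np

# Hypothetical sketch: feed all 256 input levels through two cascaded
# 8-in/8-out LUT stages (video card, then monitor) and count how many
# distinct output levels survive. Both curve shapes are made up.
levels = np.arange(256)
x = levels / 255.0

# Stage 1 (video card LUT): scale one channel down for a warmer white.
stage1 = np.clip(np.round(x * 0.85 * 255), 0, 255).astype(int)
# Stage 2 (monitor LUT): a small gamma tweak.
stage2 = np.clip(np.round((levels / 255.0) ** 1.1 * 255), 0, 255).astype(int)

out = stage2[stage1]
print("distinct output levels:", np.unique(out).size)  # noticeably fewer than 256
```

Each stage rounds to 8 bits, so levels merge at every step and can never be recovered downstream.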

Is this correct or way off?
« Last Edit: June 05, 2006, 06:04:11 PM by Serge Cashman » Logged
Stephen Best
Guest
« Reply #1 on: May 31, 2006, 12:31:59 AM »

If the display only has an 8-bit LUT, leave both the white point and gamma at their native settings (assuming the monitor enables you to do so) and make the changes in the card ... or just profile it as is. My understanding is that most modern video cards have greater-than-8-bit LUTs. The Radeon 9600 in my Mac is 10-bit.
digitaldog
Sr. Member
« Reply #2 on: May 31, 2006, 08:15:30 AM »

Quote
If the display only has an 8-bit LUT, leave both the white point and gamma at their native settings (assuming the monitor enables you to do so) and make the changes in the card ... or just profile it as is. My understanding is that most modern video cards have greater-than-8-bit LUTs. The Radeon 9600 in my Mac is 10-bit.

Problem is, the OS and most applications can't use that extra data. Note this post to the ColorSync list by Will Hollingworth of NEC:

Quote
While the Matrox board has 10-bit DACs (as do ATI's), this is of course only useful for older analog monitors. Modern digital DVI monitors are not able to take advantage of this.

Additionally, the data going into the 10-bit DACs is still 8-bit data from the frame buffer, going through an 8-bit-in by 10-bit-out LUT. While having a 10-bit LUT on the video card is a great step forward, it is useless on today's digital monitors since single-link DVI is an 8-bit bottleneck.

The ATI Avivo chipset mentioned, which features 10- and 16-bit output, is currently only really useful for motion video, since the video stream data can be processed (gamma, de-interlacing, color conversion, scaling, etc.) within the card at higher than 8-bit depth.

However, until the core OSes and applications are updated to support frame buffers of >8-bit depth, this is not of much use to those of us using Photoshop etc.

FYI - There is a tech paper on the LCD2180WG-LED that explains some of the issues with color bit depths as related to the increased gamut size of the LED display, and why it is not quite such an issue as one individual has suggested it to be. Certainly none of the color professionals worldwide who have been involved with the display throughout its development have raised it as being a concern.

http://www.necdisplay.com/products/LCD2180WGLED_Techpaper.htm

Will Hollingworth
Manager of OEM Product Design & Development Engineering
NEC Display Solutions of America, Inc.
http://www.necdisplay.com

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Stephen Best
Guest
« Reply #3 on: May 31, 2006, 08:48:45 AM »

Quote
Problem is the OS and most applications can't use that extra data.
[a href=\"index.php?act=findpost&pid=67000\"][{POST_SNAPBACK}][/a]

Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?
bjanes
Sr. Member
« Reply #4 on: May 31, 2006, 09:32:21 AM »

Quote
Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?
[a href=\"index.php?act=findpost&pid=67002\"][{POST_SNAPBACK}][/a]

If you read the NEC white paper, the 8-bit DVI bottleneck is not as great as it would first seem, since the 8-by-8-bit LUT on the host computer can be set to linear. Gamma correction and white balance can then be done in a 10-bit LUT in the display device.
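The white paper's argument can be sketched numerically: apply the same correction curve once in an 8-in/8-out video card LUT and once in an 8-in/10-out display LUT fed a linear 8-bit signal, then count the surviving levels. The correction curve here is an arbitrary assumption, not taken from the paper:

```python
import numpy as np

# Same (assumed) correction curve applied at two different LUT depths.
x = np.arange(256) / 255.0
corrected = x ** (2.2 / 2.35)  # illustrative gamma tweak, not measured

out_8bit = np.round(corrected * 255).astype(int)    # 8-in/8-out video card LUT
out_10bit = np.round(corrected * 1023).astype(int)  # 8-in/10-out display LUT

print(np.unique(out_8bit).size, "of 256 levels survive at 8 bits")
print(np.unique(out_10bit).size, "of 256 levels survive at 10 bits")
```

At 10 bits of output precision every one of the 256 input levels stays distinct; at 8 bits some collapse together.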

It sounds like a very nice display, but at around US $6,000 it will be used only in high end applications.
« Last Edit: May 31, 2006, 09:34:54 AM by bjanes »
61Dynamic
Sr. Member
« Reply #5 on: May 31, 2006, 12:47:07 PM »

Quote
Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?
[a href=\"index.php?act=findpost&pid=67002\"][{POST_SNAPBACK}][/a]
Bingo. Don't screw with the video card, don't screw with the monitor OSD. The best calibrations come from adjusting only the brightness (the only analog adjustment available on an LCD).

If you need to change the color temp of the display simply set the calibration software to the desired temp and let it build the appropriate profile.

This will net you the best results. If your needs demand better results, then you should be buying displays that operate at the desired temp to begin with.
opgr
Sr. Member
« Reply #6 on: May 31, 2006, 03:04:04 PM »

Quote
Problem is the OS and most applications can't use that extra data. Note the post to the ColorSync list by Bill of NEC:

While having a
10 bit LUT on the video card is a great step forward, it is useless on
todays digital monitors since single-link DVI is an 8 bit bottleneck.

Which is why they invented temporal dithering!

ATI Radeon X overview

Given that the Mac supports 16-bit video LUTs at the OS level, and the GMB Match software seems to adjust the LUT for some of its measurements, you may even be taking advantage of the extra information as things currently stand...

Note that it is better to have a temporally dithered 6-bit panel than an unstable 8-bit panel.
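The temporal-dithering idea can be sketched like this (a toy simulation, not how any particular card or panel actually implements it):

```python
import numpy as np

# Toy simulation of temporal dithering: flip between two adjacent 8-bit
# levels over many frames so the time-average lands on a fractional value
# that neither level can represent on its own.
rng = np.random.default_rng(0)

def temporal_dither(target, frames=10000):
    """Show floor(target) or floor(target)+1 each frame, biased by the fraction."""
    lo = int(np.floor(target))
    frac = target - lo
    shown = lo + (rng.random(frames) < frac).astype(int)
    return shown.mean()

avg = temporal_dither(127.3)
print(f"average displayed level: {avg:.2f}")
```

The eye integrates over time, so the perceived level sits between the two quantized steps.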

Regards,
Oscar Rysdyk
theimagingfactory
Serge Cashman
Full Member
« Reply #7 on: May 31, 2006, 09:44:19 PM »

Quote
If you need to change the color temp of the display simply set the calibration software to the desired temp and let it build the appropriate profile.
[a href=\"index.php?act=findpost&pid=67028\"][{POST_SNAPBACK}][/a]

...Achieving the desired temperature using WHAT, exactly? That's kind of the main point of my question. I suppose it would be via the video card's LUTs, but I don't know.

I do know that "the best" is not to mess with 8-bit LUTs, and I understand why it is so.

The question is what is involved in changing the color temperature on 8-bit LCDs if one has to do it, and what is the best way to go about it.

The reasons for changing the color temperature on an LCD could be adjusting it to your organization's standard target, matching another LCD's white point, or using software that does not include a Native target option, to name a few.

So far my understanding is that both the monitor and the videocard LUTs can be used for this purpose.
61Dynamic
Sr. Member
« Reply #8 on: May 31, 2006, 11:02:23 PM »

Set the color temperature you want in the calibration software. The software will build a profile that compensates for the monitor's white point and gives you the desired WP. In other words, it is taken care of in the ICC profile for the display.

This will degrade the image quality as well, but not as badly as adjusting the video card LUTs or fiddling with the monitor controls.
Serge Cashman
Full Member
« Reply #9 on: May 31, 2006, 11:37:56 PM »

If you work on a Mac, setting a white point target other than Native while "profiling" may seem like a natural thing to do. However, "profiling" does not adjust monitor colors by itself. By assigning a profile to a monitor on a Mac you automatically load the corresponding video card LUT corrections at startup (via the vcgt tag). Correct me if I'm wrong. PC users need a standalone LUT loader, so it's more obvious to them...

I'm obviously not referring to going into the video card settings and pulling sliders over there, God forbid...
« Last Edit: May 31, 2006, 11:57:07 PM by Serge Cashman »
Serge Cashman
Full Member
« Reply #10 on: May 31, 2006, 11:44:42 PM »

Quote
...This will degrade the image quality as well but not as badly as adjusting the video card LUTs or fiddling with the monitor controls.
[a href=\"index.php?act=findpost&pid=67067\"][{POST_SNAPBACK}][/a]

And this is the kind of statement I'm looking for, only I would like to know more about the rationale behind it.
opgr
Sr. Member
« Reply #11 on: June 01, 2006, 02:47:27 AM »

ICC profiles store RGB gamma values under the corresponding gamma curve tags. This can either be a single value indicating the gamma (1.8, 2.2, etc.) or a complete curve.

So this effectively gives you two places to store gamma correction: in these gamma curve tags, and in the video LUT tag.

The gamma curve tags are processed by the host application (Photoshop) when preparing the image for display. It can do this in arbitrary precision and apply spatial dithering to work around bit-depth limits. (In Photoshop, if you keep an image in 16-bit, it actually displays differently from the same image converted to 8-bit with dithering.)

The video LUTs are set in the video card (possibly involving a special utility to actually load the video LUT data from the profile into the card). The video card can render with table precision and apply temporal dithering for intermediate values. (Video cards used to simply map 8-bit to 8-bit for video LUT corrections.)

The question now becomes: what is the optimal method of storing the gamma information in a profile?

In general (native gamma method):
1. measure R, G, and B response,
2. calculate closest gamma values for R, G, and B,
3. store these gamma values in the normal gamma curve tags.
4. store deviations in the video lut.

This method ensures that the least amount of correction is applied in the video LUT, where you would lose the most information because of the 8-to-8-bit mapping.
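The steps above can be sketched as follows (the "measured" response here is a made-up example, not real instrument data):

```python
import numpy as np

# Sketch of the native-gamma method: fit a single gamma value to a
# hypothetical measured channel response, and see how small the residual
# left for the 8-bit video LUT becomes.
x = np.linspace(0.05, 1.0, 16)                     # patch stimulus levels
measured = x ** 2.25 * (1 + 0.02 * np.sin(6 * x))  # assumed measurement

# Step 2: closest single gamma, via least squares in log-log space.
gamma = np.sum(np.log(measured) * np.log(x)) / np.sum(np.log(x) ** 2)

# Step 3 stores `gamma` in the curve tag; step 4 stores only this residual:
residual = measured - x ** gamma
print(f"fitted gamma: {gamma:.2f}, max residual: {np.abs(residual).max():.4f}")
```

Because the fitted gamma soaks up nearly all of the response, the video LUT only has to encode a small deviation, which is exactly where an 8-to-8-bit table loses the least.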
 
A different method (gamma 2.2):
1. store 2.2 in the curve tags,
2. adjust the video LUT to convert the real response to a gamma 2.2 equivalent.

This can obviously involve severe deviations in the video LUT, but for untagged, direct display of RGB images that are supposed to be sRGB, you will at least see normal contrast behavior, and if your display remotely conforms to sRGB, it should show relatively correct color. It also means that system elements are drawn properly on systems that do not apply color management continuously.

But what about white point?

Well, it is better to do it in the video LUT, for two reasons:
1. If it were in the gamma curve tag, you would only see the correction if the system used the display profile for ALL of its drawing.
2. For subtle, intermediate values you need some kind of dithering. Since the system can only apply spatial dithering (and only if it knows there is spatial room), while a video card can ALWAYS apply per-pixel temporal dithering, the latter will produce better and more consistent results.
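A minimal sketch of a white-point shift done in the video LUT — the per-channel scale factors are arbitrary assumptions for a warmer target, not derived from any measurement:

```python
import numpy as np

# White-point shift in the video LUT: scale each channel's curve so the
# channel maxima mix to the target white. Factors below are illustrative only.
levels = np.arange(256)
scale = {"R": 1.00, "G": 0.97, "B": 0.90}  # assumed factors, warmer white

vlut = {ch: np.round(levels * s).astype(int) for ch, s in scale.items()}

# Blue now tops out below 255 and has fewer distinct steps to work with:
print("blue max:", vlut["B"].max(), "| distinct blue levels:", np.unique(vlut["B"]).size)
```

This shows the trade-off directly: the scaled channels lose both headroom and distinct levels, which is why dithering matters for the intermediate values.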

Conclusion:
You are best served by a method that stores a single gamma value in the curve tags and has the video LUT correct for the differences. Since video cards now support better-than-8-bit LUTs combined with temporal dithering, you can safely choose gamma 2.2 as the target, and whatever white point you need. Whether your software actually knows how to find a decent curve is another issue, but I would strongly suggest looking at some of the third-party offerings such as ColorEyes.

Regards,
Oscar Rysdyk
theimagingfactory
digitaldog
Sr. Member
« Reply #12 on: June 01, 2006, 07:32:08 AM »

If you look at the curves provided in, say, i1 Match after calibration, you'll see the adjustments made based on the target calibration values you asked for. In a perfect world they would be totally straight (linear), but that's never going to happen. Anyway, in theory you can play with different settings (Native/Native) and compare the curves to adjustments elsewhere in the process (LUTs).

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Tim Lookingbill
Sr. Member
« Reply #13 on: June 01, 2006, 05:40:31 PM »

Just started reading through here. Interesting discussion. Hi, Andrew and Serge.

opgr,

Your last post indicates a very deep understanding of video systems I wasn't aware of. That mention of spatial and temporal dithering put me at another level of understanding; I'd never heard of it.

However, Googling those terms brought up a truckload of hits on the subject. I guess this IS rocket science, going by the NASA PDF that comes up. This page I found is interesting as well:

http://www.extremetech.com/print_article2/...a=141162,00.asp

At least I've come to the right place where this level of discussion exists.

Are there any visual diagrams that illustrate how video systems operate? I've been looking all over for pictures that show the what, where and how of vLUTs, the differences between color table and matrix profiles, a simple gamma curve versus separate RGB-adjusted curves for gamma correction, and how all of this affects the final visual.

Look forward to reading many more interesting discussions.
Serge Cashman
Full Member
« Reply #14 on: June 01, 2006, 07:29:03 PM »

Hi Tim. Nice to see you here.

Oscar, thanks a lot for your reply. From what I understand from your post, one tag describes the actual output curves (for use by color-managed applications), and the other tag corrects the video output to match those described curves. I don't see where I get to control them, though (aside from setting targets); I think it's done automatically...

My goal is to correct the non-color-managed (as well as color-managed, obviously) output to the target white point, and I would like to know what's involved exactly. Daniel says this is better done without touching the LCD RGB buttons, for instance.

And I still don't know what the LCD RGB buttons affect. (I also have difficulty understanding the differences between LCD DVI and VGA controls, but I will probably post another topic on this.)

I use Spyder2 Pro and it displays all the curves - before, after, corrections, target - and I find it a very helpful learning tool. I currently don't have a configuration that works with ColorEyes, but I'll probably get a Display2 to better understand the subject.
« Last Edit: June 01, 2006, 08:20:03 PM by Serge Cashman »
opgr
Sr. Member
« Reply #15 on: June 04, 2006, 05:24:36 AM »

Quote
From what I understand from your post one tag describes the actual output curves (for use by colormanaged applications), the other tag corrects the video output to match those described curves. I don't see where I get to control them though (aside from setting targets), I think it's done automatically...

My goal is  to correct the non-colormanaged (as well as colormanaged obviously) output to the target white point. 

VGA is an analog signal, and because of this an LCD must convert it to a digital signal before it can display the information. Therefore, you can theoretically adjust the analog signal prior to sampling. If the RGB buttons actually adjust the pre-amps, then they work akin to setting the correct white balance prior to shooting JPEG.

A good indication that the RGB buttons actually adjust the analog signal is when the RGB buttons are not available in DVI mode.

DVI is a digital signal, and the LCD could theoretically dump it straight to the panel. But a TFT panel doesn't have a normal gamma response, so the LCD has an internal lut to compensate for the difference. That way you can provide it with a relatively normal RGB signal and it will respond predictably.

Obviously, this then results in two places where the video signal is corrected. First in the video card (loaded from the special profile tag) and second in the LCD itself to make it "behave" normally.

This also means you can "calibrate" the device behavior in two places, provided of course that you can load the correction luts into the relevant locations.

So, if you have an LCD with a 14-bit internal LUT, a DDC connection, and compatible software, you can calibrate the internal LUT so the display behaves perfectly, and then simply leave the video card LUT as a straight curve and not bother with the additional video card profile tags, etc...

Displays and software that actually allow you to do this, won't leave you in the dark about the RGB buttons.

If you have an LCD with a 10- or 12-bit internal LUT, and you have the RGB controls available in DVI mode, you could certainly use them. GMB Match allows you to adjust the controls with measurement feedback. However, it usually doesn't do well with anything other than "native" white, because there is usually such a violent difference between pure white and shades of gray on LCD panels. This is where software like ColorEyes shines.

If you have an LCD with an 8-to-8-bit LUT, then it becomes a complete mystery what to do with the controls. It may be that the LCD uses temporal dithering internally (because it is driving a 6-bit panel, for example), and it may therefore actually do something useful with the RGB controls. That's a lot of "mays" though, and in this case it is best to select the option with the least amount of tampering with the internal LUT (probably R=G=B=100 in user mode).

Given your goal of also having non-color-managed RGB displayed with a decent WB, you would want to calibrate to a single gamma curve for all three primaries. So even if you choose "native" gamma, it should still be one and the same gamma value for all three primaries. I don't know whether your software allows you to control these settings, but software is the only place where you could influence this.

Regards,
Oscar Rysdyk
theimagingfactory
61Dynamic
Sr. Member
« Reply #16 on: June 04, 2006, 10:19:22 AM »

Quote
VGA is an analog signal and because of this, an LCD must convert to digital signal before it can display the information. Therefore, you can theoretically adjust the analog signal prior to sampling for the digital signal. If the RGB buttons actually allow you to adjust the pre-amps, then they work akin to setting the correct white-balance prior to shooting JPG.
You really can't equate adjusting an analog signal to setting WB prior to shooting a JPEG. When shooting JPEG, you start off with quite a lot of raw information before the JPEG is made. With the analog signal, you start off with a 24-bit image, it gets converted to analog, and then you end up with another 24-bit image after it's converted back in the display.

With the analog signal you actually have less information than with a pure digital signal, since there is always a loss during conversion. So even if the RGB controls did adjust the analog signal, it would be even less desirable than adjusting a fully digital signal over a DVI connection.
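The round-trip loss can be modeled crudely; the noise level below is an arbitrary assumption, purely to show that values do not survive the trip exactly:

```python
import numpy as np

# Toy DAC -> analog cable -> ADC round trip: 8-bit levels go out, a little
# noise is picked up on the analog leg, then the display requantizes to 8 bits.
rng = np.random.default_rng(1)
levels = np.arange(256)

analog = levels / 255.0 + rng.normal(0.0, 0.002, size=256)  # assumed noise floor
redigitized = np.clip(np.round(analog * 255), 0, 255).astype(int)

changed = int(np.count_nonzero(redigitized != levels))
print(f"{changed} of 256 levels did not survive the round trip")
```

Even with a small noise floor, a noticeable fraction of levels land on a neighboring code after requantization.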
opgr
Sr. Member
« Reply #17 on: June 04, 2006, 11:22:51 AM »

Quote
You really can't equate adjusting a analog signal to setting WB prior to shooting a jpeg.

The analogy was meant like this:

Does it help to adjust the RGB controls prior to calibration?

If you're using a VGA signal and the RGB controls adjust the analog signal, then it most certainly does help. It is then equivalent to setting the WB prior to shooting JPG as opposed to shooting at a fixed WB and adjusting it afterwards.

And I do agree with your point: DVI is to be preferred over any analogue connection. Be careful, however, about "loss of information": the conversion may actually improve the perceptual rendering of the data. For example, sampling the relatively unstable analogue signal automatically dithers each pixel, producing noise over the entire image and some misalignment, all of which are sometimes perceived as a more desirable rendition.

I would like to add: I am NOT an expert. These are merely my findings when looking for a replacement of my old Hitachi LCD about 2 years ago. I still haven't replaced it, since the changes back then were interesting enough to wait a little while longer.

I do believe it is a good time to make a purchase now. There are good offerings, and current developments are not particularly interesting for the digital darkroom, or won't materialize efficiently in the write-off time of a current purchase.

Regards,
Oscar Rysdyk
theimagingfactory
Serge Cashman
Full Member
« Reply #18 on: June 05, 2006, 01:09:21 PM »

That was very informative.

However, I understand that even Oscar is unclear on the subject of 8-bit LUT monitors.

My understanding so far is that even though there are two places to make adjustments, there are no known benefits to using the monitor buttons as opposed to the video card LUTs for white point adjustment on 8-bit monitors. And there's a solid consensus that the best thing would be not to adjust the white point at all (I knew that).

In practice this means that if software asks whether your (8-bit) LCD monitor has RGB buttons, it's best to say that it doesn't, so it does everything through the video card LUTs, both the white point and the gamma.

The information on higher-bit LUTs was extremely helpful. From what I understood, adjusting the native white point with the buttons on a monitor with 10-bit or higher internal LUTs (in case you can't use DDC) does not lead to the loss of color values. So in that case you'd want to avoid using the video card LUTs for adjustments as much as possible.
« Last Edit: June 05, 2006, 01:11:33 PM by Serge Cashman »
digitaldog
Sr. Member
« Reply #19 on: June 05, 2006, 01:39:07 PM »

Just had to ping Dr Karl on all this. His response:
---

As to the issue in this post:

What is optimal, front panel controls or the video card LUT, for adjusting white?

The simple answer is you can't really know. Manufacturers don't give you enough information in most cases.

The internal data paths of most LCDs are not published. Modern mid to high end LCDs do not have simple 1D LUTs. Most chipsets today have 3D lookups, which are altered by all of the front panel adjustments, followed by a factory-set final grey balance 1D curve. Display features like NEC's "ColorComp" are also using up data to make spatial corrections (this is never mentioned in the data sheets).

Using an analog input signal never fixes anything! That input signal is not a smooth curve; it is a stair-step signal created by the video card DAC. The only thing this will do is introduce noise and additional aliasing into the system as it is redigitised. Even if a front panel control did adjust the analog input side (I have never seen this), it would only create worse aliasing due to the misalignment of the stair-step signal and the ADC thresholds.

These things said, here is what you can do.

It is always preferable to keep the white point as close to native as possible. If you only have one display and native is 6800K, use it, don't correct it; your eyes will adapt. If you need to have two displays side by side, then you need to correct the white point.

If you are using a modern high-end LCD, use the front panel to adjust the white point and load a neutral LUT in the video card. Look at a grayscale: is it neutral all the way to black, or did it look much better when set to native? This will give you some idea whether your internal path has the two-stage 3D-1D system I described above. If it does, use the front panel to correct the white; if not, use the calibration software and the LUT.

Do not adjust "contrast", "brightness" or "gamma" on the front panel. Do adjust "backlight"; on some models the backlight is incorrectly labeled "brightness."

When you calibrate, do not change the TRC of the system ("use native gamma"); changing the TRC will only cause more aliasing.

Do not use a Matrix/TRC-based profile. Use a full lookup. This will provide the best soft proof. LCDs are not CRTs; they do not use phosphors that mix in a linear fashion. There is a lot of crosstalk between channels, especially in the darker tones. This can only be properly modeled with the full-lookup ICC profile.

Hope this helps,

Karl Lang
----

Yup, it helped me.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/