Author Topic: What's "Native" on an LCD?  (Read 8246 times)
Serge Cashman
Full Member
Posts: 200
« on: September 20, 2006, 01:36:40 AM »

I'm unclear about 8-bit LCD image quality (prior to calibration and/or profiling). "Native" settings are supposed to minimize artifacts, but what is "Native"?

If I remember correctly, when the topic of LCD image quality was discussed once before, one of the posters wrote that it may be 100% on all channels.

No matter how long I look at non-color-managed greyscale gradients, I can't figure out what's "Native". I don't see a substantial difference between the "Normal" or "Factory Default" preset or any other preset, and none of them are that great, at least on the monitors I have access to. I'm talking specifically about the neutrality of a 16-bit greyscale gradient. It's viewed in PS proofed to Monitor and without VC LUTs loaded - not sure that's the correct way to view it, BTW.

100% on all channels does look somewhat more neutral all over, but it's useless obviously.
« Last Edit: September 20, 2006, 01:40:57 AM by Serge Cashman »
opgr
Sr. Member
Posts: 1125
« Reply #1 on: September 20, 2006, 02:40:40 AM »

Quote
100% on all channels does look somewhat more neutral all over, but it's useless obviously.

But that's generally what is understood to be "native". That is, native whitepoint. In addition, you have native gamma, which is less precisely defined, but generally means the closest-fitting gamma curve according to a least-squares method. (But whether this is before or after the VCG correction, or whether this is on a per-channel basis...?)

Because of the usual disruptions in the highlights of most LCDs, it would be useful to have the software measure several tints of gray for "native" whitepoint determination, but I don't think any of the current software options allows this.

So, generally "native" white point means using the white measured for R=G=B=255.

In Eye-one Match you want to select "Laptop" for calibration, otherwise you will most likely end up with a desktop and system GUI that is a different "gray" than your images in Photoshop.

Regards,
Oscar Rysdyk
theimagingfactory
Serge Cashman
Full Member
Posts: 200
« Reply #2 on: September 20, 2006, 12:05:25 PM »

Maybe I haven't asked the right question.

The question is - what is the "no internal adjustments" state of an LCD?

From the point of view of the calibration software "Native" means "no videocard LUTs adjustments".

I'm having a hard time determining whether there are OSD presets or RGB slider positions that give better image quality than others. I don't think there are any substantial differences in visible artifacts on a greyscale gradient no matter what I choose. Videocard LUTs are not loaded at all.
61Dynamic
Sr. Member
Posts: 1442
« Reply #3 on: September 20, 2006, 02:01:12 PM »

Quote
I'm having a hard time determining whether there are OSD presets or RGB slider positions that give better image quality than others.
There are none. The very act of altering the on-screen controls degrades the image. How much depends on the display and the changes made, of course. Set it to factory defaults and leave it. That is the "no internal adjustments" setting.

Calibrating for the native settings will profile the display at whatever settings the display is currently operating at.
digitaldog
Sr. Member
Posts: 9186
« Reply #4 on: September 20, 2006, 07:28:25 PM »

Quote
The question is - what is the "no internal adjustments" state of an LCD?

From the point of view of the calibration software "Native" means "no videocard LUTs adjustments".


So Native in the calibration software is clear, the question is, what setting on the LCD is Native? Good question. I'm not sure anyone can know for sure. Factory default? Maybe yes, maybe no.

Those damn LCDs!

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Serge Cashman
Full Member
Posts: 200
« Reply #5 on: September 24, 2006, 02:09:13 AM »

Quote
I'm not sure anyone can know for sure. Factory default? Maybe yes, maybe no.

Thanks for your input, Andrew.

Hmm... I would say I figured as much.

Daniel, I used to do exactly what you're suggesting (factory default) but recently I decided to run some tests to see if what I was doing was justified.
Mark D Segal
Contributor
Sr. Member
Posts: 6970
« Reply #6 on: September 24, 2006, 08:53:36 AM »

Quote
So Native in the calibration software is clear, the question is, what setting on the LCD is Native? Good question. I'm not sure anyone can know for sure. Factory default? Maybe yes, maybe no.

Those damn LCDs!

No point swearing at LCDs, much as we'd like to - they're obviously here to stay (for a while, anyhow), so we may as well grin and bear it.

I'm calibrating and profiling with ColorEyes Display using DDC (to use DDC, the monitor needs to be DDC-compliant and the video card needs to be one the software recognizes for DDC). The only instructions one needs are to set the monitor itself to factory defaults, ignore the video card (because the software looks after it), and set three parameters judgmentally: white point, gamma and luminance. I have the white point set to 6500, gamma on L* (as recommended) and luminance at 110 cd/m2. This is for a LaCie 321, printing to an Epson 4800, for now using only matte paper. Once those three parameters are set, the colorimeter is placed on the white circle the software provides on the display and the software does the rest, with about 75 iterations plus another 15 to verify the Delta-Es. With these settings I'm generally getting very satisfactory predictability between monitor image and print. It does require some experimentation to get the luminance setting right for one's own conditions, but it can and does work.

Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml
TimothyFarrar
Newbie
Posts: 49
« Reply #7 on: September 24, 2006, 11:59:14 AM »

There is, however, a (somewhat) objective method to find settings that do not degrade the total number of shades the LCD produces.

Take an image that shows all 256 tones (in the form of a gradient) and load it into ImageReady (so the image is not processed using color management). Then zoom in to 1600%.

Change the settings on the LCD until you can see all 256 individual tones. You'll probably have to do this at night or in a room with no lighting. Then look at the display at extreme angles to check the darks and lights (the distortion of brightness at odd angles helps).

Check gray, red, green, and blue gradients.

Then, once the display settings are optimized, calibrate the LCD.

I've done this and the results have been great.

Since neither Photoshop nor ImageReady produces correct gradients, I manually created gradients (256x256 in size); you can right-click and save the 256-color GIFs directly from here:

http://www.farrarfocus.com/ffdd/blog20060921.htm
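Timothy's hand-built ramp is easy to reproduce. The sketch below is a hedged example, not his actual file (the filename is made up): it writes a strictly linear 256x256 grayscale ramp, one column per level 0-255, as a binary PGM, sidestepping whatever dithering Photoshop and ImageReady apply.

```python
# Write a strictly linear 256x256 grayscale ramp as a binary PGM.
# Each row is the byte sequence 0, 1, 2, ..., 255, so column x always
# holds exactly level x -- no dithering, no color management.
ramp = bytes(range(256)) * 256          # 256 identical rows

with open("gray_ramp.pgm", "wb") as f:
    f.write(b"P5\n256 256\n255\n")      # PGM header: binary, 8-bit gray
    f.write(ramp)
```

Opened at high zoom in a non-color-managed viewer, all 256 columns should be distinguishable if the display settings aren't crushing shades.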

Timothy Farrar
Farrar Focus Digital Darkroom
www.farrarfocus.com/ffdd
61Dynamic
Sr. Member
Posts: 1442
« Reply #8 on: September 24, 2006, 12:09:46 PM »

Quote
So Native in the calibration software is clear, the question is, what setting on the LCD is Native? Good question. I'm not sure anyone can know for sure. Factory default? Maybe yes, maybe no.

Those damn LCDs!
While definitely true - LCD makers are mum on what they are specifically doing - it's been my experience that the vast majority of LCDs suitable for photography work calibrate best when first set to factory defaults. YMMV.

When you get into the low end of the spectrum, well, there's no consistency or concern for color accuracy. LCD TVs are the same way. Displays (even some higher-end models) are intentionally set wrong to wow buyers at stores like Best Buy. Sometimes the off-kilter color settings are designed into the hardware rather than being just a simple "Customer Bamboozle" preset in the software menu that can be disabled.

Just wait until SED hits the market in the next year or two, and then OLED. As if things weren't complex enough today...
Serge Cashman
Full Member
Posts: 200
« Reply #9 on: September 24, 2006, 08:11:03 PM »

Quote
Since neither Photoshop nor ImageReady produces correct gradients, I manually created gradients (256x256 in size)...


That's quite interesting. Why is that?

I was really frustrated by the difference between 16-bit and 8-bit gradients in PS, so I suppose you've got a point there.

Right now I'm looking at 16 bit gradients in PS, proofed to monitor space for display evaluation purposes.

[edit] I think that your advice to try a VGA cable for an LCD monitor as an alternative to DVI is highly controversial. Your rationale for it does make sense, but this practice is usually discouraged by posters who have an engineering background.

MarkDS is in the position all of us want to be in - when everything is done via DDC. I know, I know...
« Last Edit: September 24, 2006, 08:25:47 PM by Serge Cashman »
TimothyFarrar
Newbie
Posts: 49
« Reply #10 on: September 24, 2006, 10:16:19 PM »

As for the gradients: draw a white-to-black gradient on an image that is 256 pixels wide and check the values. I was looking for the sequence {0,1,2,3,4,5 .... 253,254,255} and just could not get either program to produce a linear sequence. You might have better luck than I did.
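The check Timothy describes can be automated. A minimal sketch (the helper name is mine; it assumes you have already extracted one row of pixel values from the gradient image):

```python
# Verify that one row of an 8-bit grayscale ramp is the exact sequence
# {0, 1, 2, ..., 255}: every level present once, in order, no dithering
# and no repeated steps.
def is_linear_ramp(row):
    return list(row) == list(range(256))

print(is_linear_ramp(range(256)))                          # True
print(is_linear_ramp([0, 0, 1, 2] + list(range(4, 256))))  # False: dithered
```

A Photoshop-style dithered ramp fails the test; a hand-built linear one passes.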

You are right, using the VGA cable instead of DVI is definitely a controversial topic.

Don't get me wrong, I would definitely use the DVI cable on a quality LCD that has built-in DDC control of its LUTs (lookup tables). Then the calibration is stored in the LCD itself rather than being provided by the LUT on the video card. Using a VGA cable on this type of display is just a BAD idea.

However, for an LCD with no built-in LUTs, the VGA cable has worked better for me. I would only go as far as to suggest that you test both and do what works best for you.

Quote
by posters who have engineering background

That gave me a chuckle. If it makes you feel any better - not that this should make you trust any of my advice or comments - I do have a BS in engineering, worked as a manufacturing engineer (before becoming a photographer), and have over 10 years of programming experience.

So it is a sure bet that I probably at least know the difference between a VGA and a DVI cable  

Quote
I think that your advice ... is highly controversial

All joking aside, I really do take that as a compliment. Thanks!

Timothy Farrar
Farrar Focus Digital Darkroom
www.farrarfocus.com/ffdd
Mark D Segal
Contributor
Sr. Member
Posts: 6970
« Reply #11 on: September 24, 2006, 10:23:34 PM »

Timothy, maybe you can explain the whys and wherefores for us, but I do know for sure that the video card plays a role in whether DDC works or not. Using ColorEyes Display software with my LaCie 321 (which is DDC-aware), until Integrated Color recently provided a program update, there were numerous video cards, including mine, which it did not recognize for DDC and therefore could not operate in DDC mode. That was fixed with their latest software update. New video cards keep appearing all the time, and I was informed from the source that unless they've been tested for DDC compliance with this software, there is no assurance the monitor can be profiled and calibrated using DDC. I imagine the same may apply to other calibration/profiling software on the market too. I should also mention I am on Windows XP. That may matter.

Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml
TimothyFarrar
Newbie
Posts: 49
« Reply #12 on: September 25, 2006, 01:09:35 AM »

I also use ColorEyes Display, but on a Mac.

Quote
explain the whys and wherefores

First some background,

DDC = display data channel

LUT = lookup table; maps an 8-bit level (0-255) to a brightness level (0-1023 or more)

In the CRT past, the LUT on the video card would take your 8-bit RGB color data and transform those 256 values into brightness levels. This analog signal gets sent to the CRT, so you calibrate the CRT by adjusting the video card's RGB LUTs.
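That remapping can be sketched numerically. The example below is illustrative only (the gamma values are made up, and no vendor's actual LUT format is implied): it builds a video-card-style LUT that corrects a display measured at gamma 2.4 toward a 2.2 target.

```python
# Build a 256-entry LUT mapping each 8-bit input to a 10-bit output,
# compensating a native gamma of 2.4 so the viewer sees gamma 2.2.
MEASURED_GAMMA = 2.4   # assumed measurement of the display's response
TARGET_GAMMA = 2.2     # the response we want the user to see

lut = []
for level in range(256):
    signal = level / 255.0
    light = signal ** TARGET_GAMMA               # light output we want
    corrected = light ** (1 / MEASURED_GAMMA)    # invert native response
    lut.append(round(corrected * 1023))          # 10-bit entry, 0-1023

# Endpoints stay pinned at 0 and 1023; midtone entries are lifted
# slightly, because the native 2.4 response is darker than the target.
```

The same arithmetic applies per channel (R, G, B), which is also how gray-balance errors get corrected.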

There are a few problems with this: first, you need voltage-stable video card output, and a CRT monitor that produces consistent brightness levels from the analog video signal. For example, simply swapping out a video card on a PC means you would need to recalibrate, because the output voltages, even between two of the same cards, could differ. This becomes a mess when you need to administer hundreds of color-calibrated workstations (say, at a movie special-effects business).

For high-quality LCDs, this is different. The DVI cable sends the 8-bit RGB data digitally to the LCD, and the transformation of those 256 values into a brightness level is done in the LCD (via RGB LUTs in the LCD). No analog problems. This is why DVI is the future, and why it is recommended.

However, if you cannot set the LUTs in the LCD, then DVI is not a good option for calibration; IMHO, using a VGA cable is better (because you can set the LUTs in the video card).

About DDC,

DDC1: DDC started out as a simple way for monitors to tell the graphics card which video modes they supported. This is the DDC1 standard. It was a one-way transfer from monitor to graphics card; there was no way for the graphics card to talk back to the monitor to set LUTs or anything else. The monitor just kept sending its display settings all the time.

Then the display and graphics card industries got wise and added two-way communication so the graphics card could talk to the monitor (over the I2C protocol).

DDC2B: The graphics card can ask the monitor for display settings. This does not support setting display LUTs.

DDC2AB: The graphics card has full control over the monitor. This is what's needed to set the LUTs on a high-quality LCD display. This is also referred to as DDC/CI.

So you can probably guess where I am going with this,

You will need at least DDC2AB (also referred to as DDC/CI) support in the LCD display, the graphics card, the graphics card driver, the operating system, and the calibration software for this to work properly.

However, there is another problem.

While the protocol used to set the common settings (i.e. brightness and contrast) is standard, as far as I can tell there is no standard for setting the LUTs on high-quality LCDs. So adjusting the display LUT seems to use a proprietary extension or proprietary protocols. Some displays even use a separate USB connection for this purpose.

So now your calibration software has to be aware of all the proprietary methods for adjusting each display's built-in lookup tables. And of course the operating system and graphics cards (and drivers) have to support this as well.

In essence, the lack of a common standard for setting LCD display LUTs has become a nightmare for software developers, and thus also a nightmare for us users.

With any luck, VESA will establish a common standard for setting LCD displays' LUTs, and then this feature will appear even in inexpensive LCD displays. Then Linux will get color calibration support, Adobe will port Photoshop to Linux, ... ok, I'll stop dreaming.

I hope this has made sense to those interested in the topic (I'm definitely not a good technical writer)...

Please feel free to add corrections if I got anything wrong.

Timothy Farrar
Farrar Focus Digital Darkroom
www.farrarfocus.com/ffdd
opgr
Sr. Member
Posts: 1125
« Reply #13 on: September 25, 2006, 04:04:33 AM »

Quote
For high-quality LCDs, this is different. The DVI cable sends the 8-bit RGB data digitally to the LCD, and the transformation of those 256 values into a brightness level is done in the LCD (via RGB LUTs in the LCD). No analog problems. This is why DVI is the future, and why it is recommended.

However, if you cannot set the LUTs in the LCD, then DVI is not a good option for calibration; IMHO, using a VGA cable is better (because you can set the LUTs in the video card).

Please feel free to add corrections if I got anything wrong.

Future? Some people believe DVI is already marginally obsolete. Video card LUTs can always be adjusted, regardless of the connection; there is no reason to use VGA for that. VGA may simply obscure some artifacts that some people find objectionable but that really are present in the original file or signal. In other words, the rest of the world sees a problem where you don't (you in general, not you personally).

From previous discussions on these boards I understand the following. Because of the nature of LCD technology, the response is far from the usual gamma paradigm. In addition, crosstalk between channels is present and non-linear. So, to have an LCD behave remotely like a gamma device, it needs internal adjustments, which may include 3D LUTs and gamma-to-LCD-response translations. It may also involve bit translations, because panels, especially the cheaper ones, are usually less than 8-bit.

So, if the panel menu is set to factory default, there will still be a lot of processing in the panel, and it is a fair question to ask which setting may give the best results. Due to the complete lack of information on the manufacturers' side, eyeballing may be the only proper answer...

Having said that, I still don't understand why people rave about L* calibration. The black point of your LCD is not a black hole, far from it, so why would you want to use L* as opposed to gamma 2.2, which is about equal to the perceptual uniformity of L* (where it counts) but has the additional benefit of displaying all elements and images properly (as far as contrast goes), as opposed to just your color-managed images...
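For anyone wanting to see the difference Oscar is pointing at, here is a quick comparison of the two target curves. It uses the standard CIE L*-to-luminance relation; nothing here reflects any particular calibrator's internals.

```python
# Compare the L* target tone curve with a plain 2.2 gamma curve for a
# few normalized input levels (0..1).
def lstar_to_luminance(lstar):          # lstar in 0..100, CIE formula
    return ((lstar + 16) / 116) ** 3 if lstar > 8 else lstar / 903.3

def gamma_22(x):                        # x in 0..1
    return x ** 2.2

for x in (0.25, 0.50, 0.75):
    print(f"input {x:.2f}: L* target {lstar_to_luminance(100 * x):.3f}, "
          f"gamma 2.2 target {gamma_22(x):.3f}")
```

In the shadows the L* curve asks for noticeably more luminance than gamma 2.2, which is exactly the region where an LCD's elevated black point makes the choice of target debatable.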

And in addition, I would like to again point out that NATIVE does not necessarily mean "linear video LUT"; it also means "ignore video LUT" (which may still contain adjustments), and technically it means: find the closest-fitting gamma curve, which is what you may (or may not) find stored in the gamma entry of your profile.
And if it is in the profile, then the device had better behave like that gamma, and not like some completely irrelevant LCD response curve that bears no relation whatsoever to the gamma curve used to prepare your images for display...
« Last Edit: September 25, 2006, 04:05:34 AM by opgr »

Regards,
Oscar Rysdyk
theimagingfactory
Mark D Segal
Contributor
Sr. Member
Posts: 6970
« Reply #14 on: September 25, 2006, 11:16:59 AM »

Timothy, thanks for that extensive analysis of what is going on, and what needs to go on, to get proper calibrations with DDC on an LCD. Of course, the more one knows the more complicated it gets, and the happier one is hitting on some combination of hardware, software and parameter settings that seems to work well!

Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml