Author Topic: 10 bit display  (Read 4394 times)
Czornyj
Sr. Member
Posts: 1400
« on: October 29, 2009, 12:56:28 PM »
A few years ago, Karl Lang wrote:

Quote
A wide gamut LCD display is not a good thing for most (95%) of high
end users. The data that leaves your graphic card and travels over the
DVI cable is 8 bit per component. You can't change this. The OS, ICC
CMMs, the graphic card, the DVI spec, and Photoshop will all have to be
upgraded before this will change and that's going to take a while. What
does this mean to you? It means that when you send RGB data to a wide
gamut display the colorimetric distance between any two colors is much
larger. As an example, let's say you have two adjacent color patches: one
is 230,240,200 and the patch next to it is 230,241,200. On a standard
LCD or CRT those two colors may be around 0.8 Delta E apart. On an Adobe
RGB display those colors might be 2 Delta E apart; on an ECI RGB display
this could be as high as 4 Delta E.

Quote
The 2180WG has an actual 10 bit DVI interface
with a 10-10-10 path but nothing supports it so you can't use it yet -
but for $6500 you're ready when it does ;-)

Seems like the NEC 2180WG (aka Reference 21) is soon to be discontinued, and the new wide-gamut Eizo displays have a 10-bit DisplayPort, but apart from that, nothing happens. Out of curiosity: does anyone know if there's any plan to support a 10-bit display path in applications like Photoshop? Are any of the new operating systems, graphics cards, drivers, etc. ready for a 10-bit signal path? The gamut of modern LCD displays keeps growing, but will we ever have a chance to make good use of it? Is anyone working on it, or at least planning to do something about it?
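A rough way to sanity-check Lang's numbers (my own sketch, not from the thread): convert the two 8-bit patches to Lab through each working space and compare the ΔE*76 distances. The matrices and D65 white point below are the standard published sRGB and Adobe RGB (1998) values.

```python
import math

# D65 white point and the published RGB->XYZ matrices for the two spaces
WHITE = (0.95047, 1.0, 1.08883)
SRGB_M = [(0.4124, 0.3576, 0.1805),
          (0.2126, 0.7152, 0.0722),
          (0.0193, 0.1192, 0.9505)]
ADOBE_M = [(0.5767, 0.1856, 0.1882),
           (0.2974, 0.6273, 0.0753),
           (0.0270, 0.0707, 0.9911)]

def srgb_lin(v):
    v /= 255.0
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def adobe_lin(v):
    return (v / 255.0) ** (563 / 256)  # Adobe RGB (1998) gamma, ~2.2

def to_lab(rgb, linearize, matrix):
    lin = [linearize(c) for c in rgb]
    xyz = [sum(m * c for m, c in zip(row, lin)) for row in matrix]
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, WHITE))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

p1, p2 = (230, 240, 200), (230, 241, 200)
de_srgb = math.dist(to_lab(p1, srgb_lin, SRGB_M), to_lab(p2, srgb_lin, SRGB_M))
de_adobe = math.dist(to_lab(p1, adobe_lin, ADOBE_M), to_lab(p2, adobe_lin, ADOBE_M))
print(f"sRGB: {de_srgb:.2f}  AdobeRGB: {de_adobe:.2f}")
```

With these textbook values the one-code step comes out well under 1 ΔE in both spaces, with the Adobe RGB step only modestly larger than the sRGB one, so the direction of Lang's argument holds even if his magnitudes look high (as tho_mas notes later in the thread).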
« Last Edit: October 29, 2009, 05:58:36 PM by Czornyj »

JeffKohn
Sr. Member
Posts: 1671
« Reply #1 on: October 29, 2009, 02:03:13 PM »

I believe Windows 7 also has support for 10-bit color now, but we're going to have to wait for the graphics card manufacturers to get on board and support it.

Czornyj
« Reply #2 on: October 29, 2009, 03:03:32 PM »

Quote from: JeffKohn
I believe Windows 7 also has support for 10-bit color now, but we're going to have to wait for the graphics card manufacturers to get on board and support it.

Graphics card manufacturers also seem to support 10-bit output:
http://ati.amd.com/products/radeonx1900/specs.html

The question is: what about the OS, drivers, calibration software, ICC profiles, CMMs, Photoshop and the rest...

JBM
Newbie
Posts: 17
« Reply #3 on: October 29, 2009, 03:56:24 PM »

http://forums.adobe.com/message/2346311 - Chris Cox - Adobe employee, Oct 27, 2009
Quote
Photoshop can't support greater than 8 bit/channel output yet.
We tried, and the APIs still had problems.  We're working with
the vendors to resolve those problems. (and trying to find good
displays to test with that *really* output 10 bits or more per channel).


Czornyj
« Reply #4 on: October 29, 2009, 04:07:53 PM »

Quote from: JBM
http://forums.adobe.com/message/2346311 - Chris Cox - Adobe employee, Oct 27, 2009

Thanks! That sounds promising - let's hope CS5 will support it!
« Last Edit: October 29, 2009, 04:13:58 PM by Czornyj »

Schewe
Sr. Member
Posts: 5469
« Reply #5 on: October 29, 2009, 05:52:22 PM »

Quote from: Czornyj
Few years ago dr Karl Lang has written:

Just to be clear, while Karl is a really, really smart guy, he doesn't have a PhD. So I'm pretty sure that if he were to notice this, he would be uncomfortable with the "dr" in front of his name...
Czornyj
« Reply #6 on: October 29, 2009, 06:05:02 PM »

I beg your pardon!
« Last Edit: October 29, 2009, 06:05:23 PM by Czornyj »

JeffKohn
« Reply #7 on: October 29, 2009, 06:06:17 PM »

Quote from: Czornyj
Graphics card manufacturers also seem to support 10-bit output:
http://ati.amd.com/products/radeonx1900/specs.html

The question is: what about the OS, drivers, calibration software, ICC profiles, CMMs, Photoshop and the rest...
I don't see anything in those specs that claims to actually output 10-bit-per-channel color to a digital display under Windows. The DACs are 10 bits per channel for converting to analog, and the internal processing pipeline is 10-bit; but that's not at all the same as outputting 10-bit color. The internal 10-bit color gets dithered down to 8 bits for output.
« Last Edit: October 29, 2009, 06:07:02 PM by JeffKohn »

Czornyj
« Reply #8 on: October 29, 2009, 06:22:27 PM »

Quote
Avivo™ Video and Display Platform
(...)Dual integrated 10 bit per channel 400 MHz DACs
16 bit per channel floating point HDR and 10 bit per channel DVI output
Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color) (...)

tho_mas
Sr. Member
Posts: 1696
« Reply #9 on: October 29, 2009, 06:25:27 PM »

Quote from: Czornyj
will we ever have a chance to make good use of it?
Have you ever noticed that a 16-bit gradient that doesn't match the TRC of your monitor looks smooth, but when you convert the file to 8-bit in Photoshop you suddenly see banding?
AFAIK the monitor redraws the image with high-bit internal processing, so even if DVI only passes 8 bits, you can make good use of a wide-gamut monitor even today. At least that's my reading.
I read the quoted article by Karl Lang carefully before I bought a wide-gamut monitor back in the day, but I never experienced the problem he's talking about. (And by the way, the stated Delta E values are not accurate.)
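The banding tho_mas describes comes from chained quantization, and a toy sketch makes it visible (my own illustration; the 1/1.1 "calibration" curve is an arbitrary stand-in): apply the same tone curve to a high-precision ramp and to its 8-bit version, then count the distinct 8-bit output codes.

```python
ramp = [i / 4095 for i in range(4096)]   # a "high-bit" gradient, 0..1
curve = lambda v: v ** (1 / 1.1)         # arbitrary calibration-style tone curve

# "16-bit file": curve applied at full precision, quantized once at the end
high_bit = {round(curve(v) * 255) for v in ramp}
# "8-bit file": data quantized to 8 bits first, then the curve is applied
low_bit = {round(curve(round(v * 255) / 255) * 255) for v in ramp}

print(len(high_bit), len(low_bit))
```

The quantize-first path merges some input codes and skips some output codes, so it produces fewer distinct levels than the full-precision path: those missing levels are the visible bands.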
JeffKohn
« Reply #10 on: October 29, 2009, 10:24:30 PM »

Quote
(...)Dual integrated 10 bit per channel 400 MHz DACs
DAC means digital-to-analog converter. This is totally irrelevant unless you're using an analog VGA output.

Quote
16 bit per channel floating point HDR and 10 bit per channel DVI output
I'm not exactly sure what they mean here (probably just marketing BS), since DVI is limited to 8 bits per channel; only HDMI 1.3 and DisplayPort can handle 10-bit color. Maybe their chipset is capable of outputting 10-bit color in some hypothetical scenario, but this feature is not enabled on any actual shipping cards, certainly not under Windows Vista or earlier. And as far as I know not under Windows 7 either, unless they've released new drivers to enable it (which they haven't, to my knowledge).

Quote
Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color) (...)
In other words, internal processing pipeline, just like I said.

If you think you're getting 10-bit color with an X1900 or any other Radeon card under Windows, show me a screenshot of the display settings dialog where it lets you select that option.
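For what it's worth, the single-link ceiling falls straight out of the TMDS arithmetic (a back-of-the-envelope sketch of my own; the ~154 MHz reduced-blanking pixel clock for 1920x1200 @ 60 Hz is an assumption on my part). Each TMDS link carries 8 data bits per channel per pixel clock, so one link tops out at 24 bpp, while a dual-link connection has the raw capacity for 30 bpp:

```python
SINGLE_LINK_CLOCK = 165e6             # Hz, single-link DVI maximum pixel clock
link_bps = SINGLE_LINK_CLOCK * 3 * 8  # payload bits/s of one TMDS link

# 1920x1200 @ 60 Hz with reduced blanking needs roughly a 154 MHz pixel clock
pixel_clock = 154e6
need_8bpc = pixel_clock * 24          # 8 bits per channel
need_10bpc = pixel_clock * 30         # 10 bits per channel

print(need_8bpc <= link_bps)          # 8 bpc fits on a single link
print(need_10bpc <= link_bps)         # 10 bpc exceeds a single link
print(need_10bpc <= 2 * link_bps)     # but fits within dual link capacity
```

So a 10-bpc stream doesn't fit on a single link, but a dual-link connection has the headroom, which is presumably how the 2180WG's "10-bit DVI interface" worked.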

Czornyj
« Reply #11 on: October 30, 2009, 04:07:25 AM »

Quote from: tho_mas
Have you ever noticed that a 16-bit gradient that doesn't match the TRC of your monitor looks smooth, but when you convert the file to 8-bit in Photoshop you suddenly see banding?
AFAIK the monitor redraws the image with high-bit internal processing, so even if DVI only passes 8 bits, you can make good use of a wide-gamut monitor even today. At least that's my reading.
I read the quoted article by Karl Lang carefully before I bought a wide-gamut monitor back in the day, but I never experienced the problem he's talking about. (And by the way, the stated Delta E values are not accurate.)
Yeah, I've noticed that phenomenon before, and frankly speaking, it was a bit beyond my humanistic education. Rounding errors? Dithering?
But either way, it only makes good use of an 8-bit palette, so we don't really know what visible difference there would actually be on a 10-bit display working in a 10-bit environment.
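To the "rounding errors? dithering?" question: dithering is the likely answer, since it trades banding for fine noise that averages out to the unrepresentable value. A toy sketch of my own (the 200.6 target level is arbitrary):

```python
import random
random.seed(42)

target = 200.6                   # a level that 8 bits can't represent exactly
n = 10000
rounded = [round(target)] * n    # plain rounding: every pixel becomes 201
# random dither: each pixel becomes 200 or 201, weighted by the fraction
dithered = [int(target + random.random()) for _ in range(n)]

print(sum(rounded) / n, sum(dithered) / n)
```

Viewed from a distance, the dithered field averages to 200.6 even though no individual pixel can hold that value, which is why high-bit data can look smooth through an 8-bit link.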

Quote from: JeffKohn
DAC means digital-to-analog converter. This is totally irrelevant unless you're using an analog VGA output.

I'm not exactly sure what they mean here (probably just marketing BS), since DVI is limited to 8 bits per channel; only HDMI 1.3 and DisplayPort can handle 10-bit color. Maybe their chipset is capable of outputting 10-bit color in some hypothetical scenario, but this feature is not enabled on any actual shipping cards, certainly not under Windows Vista or earlier. And as far as I know not under Windows 7 either, unless they've released new drivers to enable it (which they haven't, to my knowledge).

In other words, internal processing pipeline, just like I said.

If you think you're getting 10-bit color with an X1900 or any other Radeon card under Windows, show me a screenshot of the display settings dialog where it lets you select that option.

I agree; I'm also sceptical, and won't care until I see a working 10-bit workflow. On the other hand, I'm not so sure that DVI can't handle a 10-bit signal, since Karl Lang said "The 2180WG has an actual 10 bit DVI interface". So hypothetically, maybe some cards do have 10-bit output. Medical LCD displays also seem to have been working with a 10-bit signal for a long time.
« Last Edit: October 30, 2009, 04:13:16 AM by Czornyj »

tho_mas
« Reply #12 on: October 30, 2009, 04:52:33 AM »

Quote from: Czornyj
Yeah, I've noticed that phenomenon before, and frankly speaking, it was a bit beyond my humanistic education. Rounding errors? Dithering?
But either way, it only makes good use of an 8-bit palette, so we don't really know what visible difference there would actually be on a 10-bit display working in a 10-bit environment.
I don't know what's going on under the surface technically, but my reading is that the information about whether it's 16-bit or 8-bit is passed along with the data, so the monitor redraws with 8-bit or 16-bit accuracy.
The other way around, from a practical standpoint: as long as the monitor's internal 16-bit processing works this well, I don't see any advantage to 10-bit channels in the DVI connection (except, maybe, if you work without color management). But if there are any advantages, I'd certainly be glad of the improvement.