Author Topic: Wide Gamut Monitors and Untagged Images  (Read 2745 times)
David Eichler
« on: March 19, 2014, 08:23:54 PM »

A lot of websites seem to strip the tagged color profiles from images. This didn't matter quite as much in the past, because the average viewer was very unlikely to have a wide-gamut monitor (not that some color and contrast distortion can't be seen on an sRGB-only monitor). However, the current Apple Cinema monitor has a wide gamut, and while probably not mainstream in most areas at this point, it may be getting more common in more well-heeled areas. I am wondering how common wide-gamut monitors are at this point. Other than the Cinema display, are there any other wide-gamut monitors currently being marketed to mainstream consumers? At least some of the current Dell UltraSharp monitors are now wide gamut, but are average consumers even considering these?
digitaldog
« Reply #1 on: March 19, 2014, 08:25:52 PM »

Cinema displays are wide gamut?
NEC and Eizo, among a few others, have a number of wide-gamut displays.
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
David Eichler
« Reply #2 on: March 19, 2014, 08:40:07 PM »

Cinema displays are wide gamut?
NEC and Eizo, among a few others, have a number of wide-gamut displays.

Oops. No, the Apple Cinema display is not wide gamut. However, some of the new Dell UltraSharps are. I wonder how many people are buying these just for general use.
hjulenissen
« Reply #3 on: March 20, 2014, 02:08:46 AM »

Oops. No, the Apple Cinema display is not wide gamut. However, some of the new Dell Ultrasharps
are. Wonder how many people are buying these just for general use.
My personal Dell 2711 is wide gamut. As it was one of the least expensive 27" monitors at the time, I am guessing that a fair number of general users got it (those with $900 to spare, some room on their desk, and an interest in large "2.5K").

I am using the sRGB Dell 2713HM for work. I'd say for non-photographic work it is just as good. I am guessing that casual photography customers will buy it instead of the wide-gamut 2713H.

I think that you are raising an interesting question. One cannot assume (and the industry should not assume) that ordinary users have colorimeters, or that they care to press display buttons to select "sRGB emulation" when needed. So the current solution seems to be to ship these displays with sRGB emulation selected by default, relying on the sRGB assumption that is deeply embedded in all of our software and hardware. The problem is: why should the user then purchase a more expensive (less energy-efficient?) wide-gamut display?

My family prefers to watch videos on the ten-year-old 20" sRGB LCD that I have flipped into portrait. The reason is simple: colors look horrible on my wide-gamut display unless you do something active about it.

The solution seems to be:
1. Dell & friends should work to automatically document the native color response of their displays (via EDID, a CD in the box, or a downloadable driver). I am not talking about the (hopefully) moderate changes as the backlight wears, or across batches of displays; I am talking about the bulk difference that makes watching sRGB content painful.
2. Microsoft/Apple/friends should work to make their OSes look good for both color-aware and color-unaware software/hardware/users. If a video player application cannot interpret color information, then the OS should try to switch the monitor into sRGB emulation mode, or inject a color-management layer between the application rendering and the display.
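A minimal sketch of what the injected layer in point 2 would do per window, using Pillow's ImageCms bindings to LittleCMS. The function name and the profile handling are illustrative assumptions for this sketch, not any OS's actual API:

```python
from PIL import Image, ImageCms

# Built-in sRGB profile; the display profile would come from the OS
# (loaded from the ICC file the monitor vendor ships).
SRGB = ImageCms.createProfile("sRGB")

def compensate_for_display(img, display_profile):
    """Re-map an untagged (assumed-sRGB) frame into the display's native
    space, the way an OS compositing layer might do it per window."""
    transform = ImageCms.buildTransform(
        SRGB, display_profile, "RGB", "RGB",
        renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
    return ImageCms.applyTransform(img, transform)

# Demo: with an sRGB "display", the mapping is (near-)identity.
frame = Image.new("RGB", (4, 4), (200, 30, 30))
out = compensate_for_display(frame, ImageCms.createProfile("sRGB"))
```

With a genuine wide-gamut display profile the same call would desaturate the frame so sRGB content stops looking painful, which is exactly the "bulk difference" described above.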

-h
« Last Edit: March 20, 2014, 02:11:17 AM by hjulenissen »
D Fosse
« Reply #4 on: March 21, 2014, 04:19:17 AM »

If they just put a sticker on the screen it would go a long way:

Warning: This unit must be used with fully color managed software to operate as intended. Full calibration and profiling is required.

But then they would have to remove all the "High Definition" and "10 000 : 1 Dynamic Contrast Ratio" stickers, and they might not want to do that...

For web the solution is very simple: Firefox with color management set to mode 1. This works pretty much like the working space in Photoshop, so that the color management chain is always on. There's nothing stopping Microsoft and Apple from doing something similar across the board, but for some strange reason they just won't.
Simon Garrett
« Reply #5 on: March 21, 2014, 04:58:37 AM »

For web the solution is very simple: Firefox with color management set to mode 1.

Agreed. But even in Firefox it's not the default: you have to explicitly set gfx.color_management.mode to 1.
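For reference, the pref can be set in about:config, or persisted in a user.js file in the Firefox profile folder (the second pref is optional and assumes you have a monitor profile to point at; the path is an example only):

```js
// user.js in the Firefox profile directory
// mode 1 = colour-manage everything, treating untagged elements as sRGB
user_pref("gfx.color_management.mode", 1);
// optional: point Firefox at a calibrated monitor profile instead of the
// OS default (example path)
// user_pref("gfx.color_management.display_profile", "C:\\profiles\\monitor.icc");
```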

On the web, most graphic elements do not have embedded profiles. That's reasonable: apart from photos, most graphic elements are tiny, and an embedded profile of a few KB would be a significant size overhead. But probably at least 99.99% of web graphics are sRGB (or intended to be displayed as sRGB). So why don't browsers all take a wild guess and assume untagged, unprofiled graphic elements are sRGB?
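The assume-sRGB fallback is a one-liner to express. A sketch using Pillow (the function name is mine; a browser would apply the same check to each decoded image):

```python
import io
from PIL import Image, ImageCms

SRGB = ImageCms.createProfile("sRGB")

def source_profile(img):
    """Return the image's embedded ICC profile if present, else assume
    sRGB -- the fallback the browsers discussed here refuse to apply."""
    icc_bytes = img.info.get("icc_profile")
    if icc_bytes:
        return ImageCms.getOpenProfile(io.BytesIO(icc_bytes))
    return SRGB

untagged = Image.new("RGB", (8, 8), "white")   # no icc_profile in .info
profile = source_profile(untagged)             # falls back to sRGB
```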

But all browsers (even Firefox by default) refuse even to guess that the colour space might be sRGB (which would virtually always be right), and simply turn off colour management for images or graphic elements with no profiles. It's completely insane.

Why do people smart enough to implement colour management make such a bizarrely stupid decision? The only credible reason I've heard is that older Flash is not colour managed, and there might be a colour discontinuity between some Flash elements and surrounding HTML elements. That's like saying "because some small proportion of colour is wrong, I want it all to be wrong."

All browsers should assume graphic elements without profiles are sRGB, with perhaps an option to turn that behaviour off for people with a particular fascination for seeing the wrong colour (e.g. those that like all colours to be as bad as Flash colours).  
« Last Edit: March 21, 2014, 08:25:10 AM by Simon Garrett »
D Fosse
« Reply #6 on: March 21, 2014, 05:36:11 AM »

2. Microsoft/Apple/friends should work to make their OS look good for both color-aware and color-unaware software/hardware/users. If a video player application cannot interpret color information, then the OS should try to switch the monitor into sRGB emulation mode, or inject a color-management layer between the application rendering and the display.

While that's possible, it goes against the whole Microsoft color management policy, which is to leave CM strictly to the application. Windows never gets in the way of application CM.

Personally I think that's a very healthy policy in terms of overall reliability, low bug risk, easy troubleshooting and so on. But it obviously prevents OS-level management.

Apple could probably do it through ColorSync; I'm not really familiar with its under-the-hood workings.
Simon Garrett
« Reply #7 on: March 21, 2014, 08:24:37 AM »

While that's possible [OS-level colour management], it goes against the whole Microsoft color management policy, which is to leave CM strictly to the application. Windows never gets in the way of application CM.

Personally I think that's a very healthy policy in terms of overall reliability, low bug risk, easy troubleshooting and so on. But it obviously prevents OS-level management.

I agree.  I think it would be difficult for the OS to work out how to do colour management on existing screen-writing APIs.  For example, the OS can find out the profile of the monitor (if there is one) but usually doesn't know the colour space of the data being written to the monitor.  It could guess sRGB, but that would screw colour-managed applications (or those that don't use WCS, anyway).  There would be all sorts of circumstances where the OS might wrongly guess what to do.  I think it would probably be impossible to do without causing all sorts of problems. 

Apple could probably do it through ColorSync, I'm not really familiar with its under-the-hood workings.

Nor me!
hjulenissen
« Reply #8 on: March 21, 2014, 09:14:55 AM »

I agree.  I think it would be difficult for the OS to work out how to do colour management on existing screen-writing APIs.  For example, the OS can find out the profile of the monitor (if there is one) but usually doesn't know the colour space of the data being written to the monitor.  It could guess sRGB, but that would screw colour-managed applications (or those that don't use WCS, anyway).  There would be all sorts of circumstances where the OS might wrongly guess what to do.  I think it would probably be impossible to do without causing all sorts of problems.  
Is it any harder than Apple's approach to Retina displays? Legacy APIs are hard-wired to a virtual low-resolution display, and the OS silently upscales content to the physical resolution. A new API is offered to those interested in accessing the full capabilities of the monitor.

Adobe & Co. would have to make some adjustments (best case: flip a bit "declaring that they know what they are doing", then recompile). All legacy applications would be sandboxed in the sRGB assumption (which is pretty much the standard outside of pro/enthusiast photography anyway).

There might be some problems with applications that access display hardware at a really low level (below what the OS is willing to mess with). So applications that are color-unaware and write directly to GPU buffers might be rendered (erroneously) at full native display gamut.


There is a question of what the OS ought to do if there is a color-unaware video player showing e.g. YouTube content in one window, a color-aware photo-editing application in another window, and the display is reporting two calibrated presets: 1) sRGB, 2) wide gamut. Should it inject an sRGB-to-wide-gamut conversion for the video window, use the accurate sRGB mode of the display, or what? The easiest way out of such issues may be to change behaviour only for fullscreen applications.


I myself would probably be happy if display presets could be selected from the OS (using USB, EDID, whatever), and the OS allowed me to select which display preset should be used depending on which application was in focus. Then I could ensure that native wide gamut was always used when I was in Lightroom, and sRGB emulation otherwise. My family members would probably rejoice.
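On Linux, something close to this can already be scripted with the ddcutil command-line tool, which speaks DDC/CI to the monitor; MCCS feature 0x14 is "Select Color Preset". The preset values below are examples only and vary per monitor (check `ddcutil capabilities`), so treat this as a sketch:

```python
import subprocess

# MCCS VCP feature 0x14 ("Select Color Preset"). These preset values are
# assumptions for the sketch -- real codes differ per monitor.
PRESETS = {"sRGB": "0x01", "native": "0x0b"}

def build_setvcp_command(preset):
    """Build the ddcutil invocation that switches the monitor preset."""
    return ["ddcutil", "setvcp", "0x14", PRESETS[preset]]

def switch_preset(preset):
    """Actually send the command (requires ddcutil and i2c access)."""
    subprocess.run(build_setvcp_command(preset), check=True)

# A window manager hook could call switch_preset("native") when Lightroom
# gains focus and switch_preset("sRGB") otherwise.
```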

-h
« Last Edit: March 21, 2014, 09:23:07 AM by hjulenissen »
digitaldog
« Reply #9 on: March 21, 2014, 09:48:07 AM »

One would hope a consumer looking for a wide-gamut display and willing to pay for it would have a clue about color management. The same is true for people buying non-wide-gamut displays. And these better units have an sRGB emulation that works pretty darn well. As pointed out, the issue isn't the wide-gamut displays; it's the lack of color management.

Suppose that indeed 90% of the world switched to wide-gamut displays today: we would simply remove the silly "sRGB is the answer" for non-color-managed work and substitute Adobe RGB (1998). Not a great solution, but it does illustrate how sRGB became what it is today and how it may go the way of the dodo in the future. My PA272 isn't producing Adobe RGB (1998), but an untagged image in that color space in a non-ICC-aware app looks OK, just as an sRGB image would look OK on an sRGB-like device. It isn't a match to an ICC-aware app; this kind of viewing has always been a bit iffy at best.

Hopefully the growth of wide-gamut displays will encourage the right fix here: produce products that handle images with an understanding of ICC color management, and allow the user to select what the system does when it finds untagged data (we had that back in OS 9; why Apple removed it I'll never understand). If not at the OS level, then at the application level (mainly browsers).
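The "looks OK but isn't a match" effect is easy to quantify: the same untagged pixel values land on different chromaticities depending on which primaries the display has. A small sketch using the published sRGB and Adobe RGB (1998) RGB-to-XYZ matrices (D65), looking at a fully saturated green:

```python
# RGB->XYZ matrices (D65). A fully saturated linear green (0, 1, 0) maps
# to the green-primary column; its CIE xy chromaticity shows how far
# apart the two interpretations of the same untagged pixel are.
SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]
ADOBE_TO_XYZ = [(0.5767, 0.1856, 0.1882),
                (0.2973, 0.6274, 0.0753),
                (0.0270, 0.0707, 0.9911)]

def green_chromaticity(m):
    X, Y, Z = (row[1] for row in m)   # column for linear RGB (0, 1, 0)
    s = X + Y + Z
    return round(X / s, 3), round(Y / s, 3)

print(green_chromaticity(SRGB_TO_XYZ))   # (0.3, 0.6)
print(green_chromaticity(ADOBE_TO_XYZ))  # (0.21, 0.71)
```

Same bytes, visibly different green: that gap is exactly what sRGB emulation (or proper color management) is there to remove.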
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
David Eichler
« Reply #10 on: March 21, 2014, 02:41:12 PM »

"...the issue isn't the wide gamut displays, it's the lack of color management."  The issue isn't going to go away, it seems. Many websites strip the tagged profiles from photos. Also, Adobe Flash is still very prevalent for displaying photos, and no one activates color management with that, if it is even really possible.
digitaldog
« Reply #11 on: March 21, 2014, 02:50:24 PM »

"...the issue isn't the wide gamut displays, it's the lack of color management."  The issue isn't going to go away it seems. Many websites strip the tagged profiles from photos. Also, Adobe Flash is still very prevalent for displaying photos, and no one activates color management with that, if it is really even possible.
That's why there is an sRGB emulation mode. Flash IS a mess. Some versions apparently do support some color management, but it's best to just stay away from Flash if color is important. Even if a site strips a profile, there is still EXIF data that defines the color space; why can't that be detected and used?
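For what it's worth, the EXIF ColorSpace tag (0xA001) is readable even when the ICC profile is gone. A hedged sketch with Pillow; note the tag is only a partial signal, since Adobe RGB files are typically written as "uncalibrated" (0xFFFF) rather than named explicitly:

```python
from PIL import Image

COLORSPACE_TAG = 0xA001   # EXIF "ColorSpace"
EXIF_IFD = 0x8769         # pointer to the Exif sub-IFD, where the tag lives

def exif_colorspace(exif):
    """Best-effort colour-space hint from an EXIF block (e.g. from
    img.getexif()) when the ICC profile has been stripped.
    1 = sRGB; 0xFFFF = 'uncalibrated', often Adobe RGB in practice."""
    value = exif.get(COLORSPACE_TAG)
    if value is None:
        value = exif.get_ifd(EXIF_IFD).get(COLORSPACE_TAG)
    return {1: "sRGB", 0xFFFF: "uncalibrated"}.get(value)

# Usage: hint = exif_colorspace(Image.open("photo.jpg").getexif())
```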
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Alan Klein
« Reply #12 on: March 21, 2014, 03:23:28 PM »

One would hope, a consumer looking for a wide gamut display and willing to pay for it would have a clue about color management. Same is true for people buying non wide gamut displays. And these better units have an sRGB emulation that works pretty darn well. And as pointed out, the issue isn't the wide gamut displays, it's the lack of color management.  

Well, I'm one of those who spent the money for a wide-gamut, photo-editing-centered monitor (PA242W) along with SpectraView II calibration. You've helped explain some things to me about color management. I'm still very confused but slowly learning.

Sometimes you buy a better product, whatever it is, hoping to get the most from it. But that takes time. How many people actually read the 400 pages of instructions that come with some digital cameras today? Don't give up on us. :) Then again, some people never figured out how to get rid of the flashing 12:00 on the front of their VHS recorders.
David Eichler
« Reply #13 on: March 21, 2014, 05:30:53 PM »

That's why there is an sRGB emulation mode. Flash IS a mess. Some versions apparently do support some color management but best to just stay away from Flash if color is important. Even if a site strips a profile, there is still EXIF data that defines the color space, why can't that be detected and used?
Actually, it is my understanding that many sites strip all EXIF data too, even though that is technically illegal if it contains a copyright notice.
digitaldog
« Reply #14 on: March 21, 2014, 06:29:29 PM »

  Actually, it is my understanding that many sites strip all exif data too, even though that is technically illegal if it contains a copyright notice.
Such a site clearly doesn't care how its audience views the images. And that's their right.
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
D Fosse
« Reply #15 on: March 21, 2014, 06:30:29 PM »

the issue isn't the wide gamut displays, it's the lack of color management.

I think this is a very important and very under-communicated point. People complain all over that color management is "a mess". In reality, color management is the solution to all these problems. The mess starts where color management stops.

An academic point, you might say, but perception is everything. The perceived unpredictability is what drives people up the wall. Once you see that it is in fact entirely predictable, if you just turn around and look at it from the opposite angle, it stops being a "problem" and becomes something you can deal with.

Color management isn't complicated or difficult at all, as long as it's there. Its complexity is just a myth that should be put to death, the sooner the better.

Tony Jay
« Reply #16 on: March 21, 2014, 07:04:51 PM »

I think this is a very important and very undercommunicated point. People complain all over that color management is "a mess". In reality, color management is the solution to all these problems. The mess starts when color management stops.

An academic point you might say, but perception is everything. The perceived unpredictability is what drives people up against the wall. Once you see that it is in fact entirely predictable - if you just turn around and look at it from the opposite angle - it stops being a "problem" and becomes something you can deal with.

Color management isn't complicated or difficult at all, as long as it's there. The complexities of it is just a myth that should be put to death the sooner the better.  
There is a lot of wisdom in this.

Tony Jay
Simon Garrett
« Reply #17 on: March 21, 2014, 07:09:10 PM »

Is it any harder than Apples approach to retina displays? Legacy APIs are hard-wired to a virtual low-rez display, and the OS will silently upscale content to the physical resolution. A new API is offered to those interested in accessing the full capabilities of the monitor.

I'd say it's harder. It's one thing to alter scaling; colour, I think, is more complicated, as there's no single answer like "scale everything up by a factor of 2".


Adobe & Co would have to make some adjustements (best-case: flip a bit "declaring that they know what they do", then recompile). All legacy applications would be sandboxed in the sRGB assumption (that is pretty much the standard outside of pro/enthusiast photography anyways).

Yes, but how would you know what is a legacy application? If an application developer is concerned enough to set a colour-legacy bit, they probably already do colour management.

And "All legacy applications would be sandboxed in the sRGB assumption" probably wouldn't work for several reasons.  The most important: you can't assume that "legacy" applications don't do colour management.  Some will be using WCS (Windows Color System); Windows could probably detect that and not do any further mapping.  But some applications will do colour management internally without using WCS, and there's probably no way Windows can tell that.  Applying a "sandbox sRGB assumption" would be completely wrong in that case.  

The issue is: for "legacy" applications (i.e. all applications up to now), there's usually no way Windows can know the colour space of graphic information, or even whether it's a photo.  

There might be some problems with applications that access display hardware at a really low level (below what the OS is willing to mess with). So applications that are color-unaware and write directly to GPU buffers might be rendered (erroneously) at full native display gamut.

And that's another problem: typically games and video players may bypass Windows - but not necessarily for all display material.  So you'd have information where Windows is trying to second-guess the colour, and mapping it, side-by-side with information that it can't map.  For example, what should look like a continuous red bar might be two different shades of red, part re-mapped by Windows and part not.  


There is a question of what the OS ought to to if there is a color-unaware video player showing e.g. youtube content in one window, a color-aware photo editing application in another window, and the display is reporting two calibrated presets: 1)sRGB, 2)Wide-gamut. Should it inject sRGB->Wide-gamut conversion for the video window, use the accurate sRGB-mode of the display, or what? The easiest way out of such issues may be to only change behaviour for fullscreen applications.


I myself would probably be happy if display presets could be selected from the OS (using USB, EDID, whatever), and the OS allowed me to select which display preset should be selected depending on what application was highlighted. Then I could ensure that native wide-gamut was always used when I was using Lightroom, and sRGB emulation always otherwise. My family members would probably rejoice.

-h

The problem is that Windows just doesn't know what's colour managed and what isn't, and doesn't know the colour space of information being written to the screen.  

By the way, EDID colour space values read from a monitor are often wholly incorrect.  Some monitors return the values of the sRGB primaries in the EDID - even for wide-gamut monitors.  
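For the curious, those EDID chromaticity values live in bytes 25-34 of the 128-byte base block as 10-bit fixed-point CIE xy coordinates. The layout below follows my reading of the EDID 1.4 structure, so treat this decoder as a sketch rather than a reference implementation:

```python
def decode_edid_primaries(edid):
    """Decode red/green/blue chromaticities from a 128-byte EDID base
    block. Bytes 25-26 hold the low 2 bits of each coordinate; bytes
    27-34 hold the high 8 bits. Values are x*1024 / y*1024."""
    lo_rg, lo_bw = edid[25], edid[26]
    hi = edid[27:35]

    def coord(high, low_bits):
        return round(((high << 2) | low_bits) / 1024, 3)

    return {
        "red":   (coord(hi[0], (lo_rg >> 6) & 0x3),
                  coord(hi[1], (lo_rg >> 4) & 0x3)),
        "green": (coord(hi[2], (lo_rg >> 2) & 0x3),
                  coord(hi[3], lo_rg & 0x3)),
        "blue":  (coord(hi[4], (lo_bw >> 6) & 0x3),
                  coord(hi[5], (lo_bw >> 4) & 0x3)),
    }
```

Comparing the decoded red/green/blue against the sRGB primaries (0.64, 0.33 / 0.30, 0.60 / 0.15, 0.06) is one way to spot the bogus EDIDs mentioned above: a wide-gamut panel reporting exactly the sRGB numbers is almost certainly lying.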

Edited to add: The problem is simpler for browsers, as untagged elements (without an embedded profile) are nearly always sRGB.  Anything that isn't sRGB will almost invariably have an embedded profile.  Windows, however, usually has no idea of the colour space of stuff written to the monitor. 
« Last Edit: March 21, 2014, 07:15:49 PM by Simon Garrett »
WombatHorror
« Reply #18 on: March 21, 2014, 07:22:37 PM »

A lot of websites seem to strip the tagged color profiles from images. This didn't matter quite as much in the past because it was very unlikely that the average viewer had a wide gamut monitor (not that some color and contrast distortion can't be seen with an sRGB only monitor). However, the current Apple Cinema monitor has a wide gamut, and while probably not mainstream in most areas at this point, it may be getting more common in more well heeled areas. I am wondering how common wide gamut monitors are at this point. Other than the Cinema display, are there any other wide gamut monitors currently being marketed to mainstream consumers?

If you use Firefox, you can tell it to apply an sRGB profile to any untagged images or elements. Or, if you must use a different browser, you can pop the display into sRGB emulation mode (most wide-gamut screens that are not really old have one, and on the best wide-gamut monitors it actually delivers sRGB better than 99% of regular-gamut monitors do).

I don't believe any Apple displays offer wide gamut.

Quote
At least some of the current Dell Ultrasharp monitors are now wide gamut, but are average consumers even considering these?

I think they are, for sure. For a while, almost any good IPS monitor had been wide gamut, although there are some good regular-gamut ones again.
« Last Edit: March 21, 2014, 07:27:22 PM by WombatHorror »
WombatHorror
« Reply #19 on: March 21, 2014, 07:25:39 PM »


All browsers should assume graphic elements without profiles are sRGB, with perhaps an option to turn that behaviour off for people with a particular fascination for seeing the wrong colour (e.g. those that like all colours to be as bad as Flash colours).  

Exactly. I mean, what else would an untagged image be but sRGB? And if it's not, it can't be displayed correctly anyway (at least not without a lot of trial and error), since you have no clue what the gamut is or what the tone response is; it's basically just a mistake.
