Author Topic: The Achilles heel of 4k monitors on the Mac Pro?  (Read 9998 times)
digitaldog (Sr. Member, Posts: 9191)
« Reply #20 on: December 29, 2013, 08:55:04 AM »

Alan, if the size isn't a major factor for you (and read what Jeff wrote about LR), I'd suggest the newer NEC is the best option and yes, they have improved over the years.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
alan a (Jr. Member, Posts: 96)
« Reply #21 on: December 30, 2013, 11:14:44 PM »

Quote
4k monitors will become more and more common, just as HD monitors did, and as earlier increases in resolution and size did before them. 4k TVs are already available, so in a few years 4k will be fairly mainstream.

At some point, Apple and Adobe (for Lightroom) will need to address this. Photoshop already leverages the GPU, so it's clearly an option for Lightroom and would make a huge difference, even with the more restricted choice of GPUs you get on a Mac.

But quite apart from Lightroom, regular users will want 4k displays as the price comes down and content increases. Retina is one solution, but it lacks the flexibility of simply choosing the correct resolution and then scaling your fonts and system icons to a level that suits your monitor, your eyes, and your usage.

Alan is entirely correct to highlight this as a current and potentially ongoing and growing issue. I do believe, though, that Apple will eventually address it; the only question is whether they do so in a flexible manner or only in a way that supports their own hardware in their own paradigm (a common thing for them to do). Either approach may work just fine, but there's more chance of broadening their market with a little 20-year-old flexibility.

Thanks very much for the observations, and kind comments.  It is good to know that I did identify a legitimate issue that Apple and Adobe, among others, need to address in the next few years.
alan a (Jr. Member, Posts: 96)
« Reply #22 on: December 31, 2013, 12:05:46 AM »


Quote
Alan, if the size isn't a major factor for you (and read what Jeff wrote about LR), I'd suggest the newer NEC is the best option and yes, they have improved over the years.

The problem with forums is that there are many self-proclaimed experts.  Then there are the real experts, who publish in the field; work as full time consultants; or actually design software and hardware, and who are recognized for their expertise.  You are among those who can legitimately be referred to as experts, and I therefore welcome and am delighted that you chose to comment.  

Andrew, I remember when you were involved with the Sony Artisan, one of the last old style monitors that was made specifically for photographers.  I purchased one due to your strong recommendation for that product.  So I have been following your recommendations for many years, and I'm sure you have influenced many of the photographers who read this forum over the years.

If I may, I'd like to ask you another question, because I believe others will benefit from your response as well. It relates to the topic of this thread: why certain high end monitors may not be fully functional with an iMac, and whether they are with the new Mac Pro.

In the last year Wacom released the Cintiq 24HD Touch. At one point I looked into it, but ultimately rejected it due to its very large size and footprint. (At least, too large for my desk.) That unit produces 1.07 billion colors and covers 97% of Adobe RGB. (*** See footnote below.) Unless I misunderstood him, Wacom tech support said that to fully deliver this range of color and RGB coverage, the full-sized DisplayPort cable and output must be used -- and not DVI, which is also included. (The odd aspect of this statement is that DVI is what is provided for ease of use; the back of the Wacom must be unscrewed to even access the DisplayPort cable.)

Tech support for Wacom told me that the unit could not deliver its full potential on an iMac, because that high video quality could only be delivered through the DisplayPort output, and the necessary DisplayPort input is typically found on PCs with advanced graphics cards, not on the iMac.

In addition, Wacom recommended connecting that 24HD display to the iMac using a DVI to Mini DisplayPort converter (specifically a Kanex Mini DisplayPort to DVI adapter, model IADAPTDVI). This means the DVI output would be used, rather than the full-sized DisplayPort output and cable on the back of the Wacom. I later asked customer service at Wacom why they don't recommend a Mini DisplayPort to full-sized DisplayPort adapter instead, as this would presumably deliver the full billion colors on the Wacom 24HD display, since it would use DisplayPort and not DVI. The customer service representative didn't know for certain, and thought such an adapter might not work with temperamental iMacs, which are known to have problems driving large monitors. (The web is filled with hundreds of reports of Mac users trying countless adapters and converters to find the magic one that will drive large monitors.)

That led to another question: is the problem that prevents the Wacom from displaying its full capability on an iMac related to the video card in the iMac, or to the Mini DisplayPort connection? I asked if the new Mac Pro, with its advanced video cards, had solved that problem; but the Mac Pro still uses Mini DisplayPorts. (To be clear, and fair to the Wacom customer support representative, he didn't know the answer, and referred me to more senior tech support, who was not available on the day in question.)

So that is my question. When Wacom says that an iMac can't deliver the full potential of its new 24HD display, is it referring to the video cards and processing in the iMac, or to the method of connection -- the Mini DisplayPort/Thunderbolt port? (I assume Wacom is referring to the video cards and not the method of connection.) If it is the former, the video cards, has the new Mac Pro solved that? But if it is the latter, and the Mac Pro is still limited by the Mini DisplayPort, then why is it that the Mac Pro can drive 4k monitors but not a Wacom to its full potential?

Either way, does this limitation in the iMac adversely impact other high end displays, such as the top of the line NEC, when they are connected to an iMac?

A footnote on the above, with a request for more clarification, which again, I believe would be beneficial for many of us who read the forum:

*** The Wacom that is one step down, the 22HD Touch, produces 16.7 million colors, the same as the top of the line Apple Thunderbolt monitor.  That is 16.7 million colors versus over 1 billion.  And the 22HD produces "only" 72% of Adobe RGB.  That is 72% versus 97%.  (Apple does not publish the Adobe RGB spec for the Thunderbolt monitor, at least not that I could find.)  

(1) But what do these statistics actually mean?  For advanced amateurs, would we actually see a difference between 16.7 million colors versus 1 billion, and between 72% RGB coverage (the Wacom that costs "only" $2500) versus 97% coverage of Adobe RGB (the same level of coverage for the NEC or the Wacom 24HD)?

(2) And if there is a difference that can be actually seen, how does it manifest itself?  As banding?  How?

Assuming, of course, that we are using computers that can even reproduce this range of colors, which according to Wacom can't be done on a high end external monitor connected to an iMac -- whether due to its internal video card, its use of the Mini DisplayPort, or both. Your answer above should clarify which is the issue.

(3) And which of those can't be reproduced on an external monitor connected to an iMac -- the 1 billion colors or the 97% of RGB -- or both?

(4) And does either of these limitations affect the screen built into an iMac from the 2011 to 2013 timeframe (non-Retina)? Can owners of iMacs see 97% of Adobe RGB on their own screens, let alone 1 billion colors?

Does it really matter, for advanced amateurs?

Rodney, many thanks in advance for your response.

« Last Edit: December 31, 2013, 11:48:17 AM by alan a »
hjulenissen (Sr. Member, Posts: 1683)
« Reply #23 on: December 31, 2013, 12:42:02 AM »


Quote
Another fly in the ointment is sharpening for web output.

The images may look beautiful on your screen at home, but they are going to look very different on everyone else's lower resolution monitor. I already notice the difference in the amount of sharpening needed on my Dell U2711 screen vs my 24", and the pixel density difference there is not nearly as dramatic.
While this may or may not be a problem with current software, I don't think it can be a fundamental issue with high-resolution monitors. If you need to simulate a low-res monitor, this can be done reasonably accurately and cheaply in software by forcing each 2x2 block of pixels to have the same value. Thus, a perfect 4k monitor paired with optimal software can do whatever a perfect 2k monitor can, and then some.
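The 2x2 trick hjulenissen describes is easy to sketch. Here is a minimal NumPy illustration of my own (not taken from any shipping software): it emulates a lower-resolution panel by averaging each 2x2 block and writing the mean back to all four pixels, so the output has the same pixel count but half the effective resolution.

```python
import numpy as np

def simulate_low_res(image, factor=2):
    """Emulate a lower-resolution monitor on a high-density panel:
    average each factor x factor block of pixels and write the mean
    back to every pixel in the block (output size is unchanged)."""
    h = image.shape[0] - image.shape[0] % factor
    w = image.shape[1] - image.shape[1] % factor
    img = image[:h, :w].astype(np.float64)  # crop so blocks tile evenly
    blocks = img.reshape(h // factor, factor, w // factor, factor, -1)
    means = blocks.mean(axis=(1, 3), keepdims=True)
    return np.broadcast_to(means, blocks.shape).reshape(img.shape).astype(image.dtype)
```

A real implementation would more likely use nearest-neighbour replication of a downsampled framebuffer, but the averaging version above makes the "2x2 pixels share one value" idea explicit.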

For the common case where our monitor's color gamut is (nearly) a superset of our print's, we have softproofing that tries to emulate the print output on-screen. If and when our monitors become capable of resolving more spatial information than our prints, perhaps we will have to include spatial characteristics in the "softproof" functionality as well. Emulating a CMYK dithered pattern may put a lot of stress on the monitor resolution (and be hard to profile), but a coarse emulation of the luminance MTF might be considered helpful.

-h
digitaldog (Sr. Member, Posts: 9191)
« Reply #24 on: December 31, 2013, 01:38:06 PM »


Quote
So that is my question.  When Wacom said that an iMac can't deliver on the full potential of its new 24HD display, is it referring to the video cards and processing in the iMac, or is it referring to the method of connection -- the minidisplay or thunderbolt port?

Afraid I don't know what they are referring to. I suppose the iMac's graphics card, whatever that may be across the various models, can't drive it? Don't know. There should be no limitation with the NECs; I'm driving one off a MacBook Pro and have done so with older MacBooks.

As to the billions of colors, that's mostly marketing speak, so don't let it impress you too much. On paper, a 10-bit, 12-bit, or 14-bit encoding can create a huge number of colors, not that we could actually see anything like that number. It is useful to have more bits, to a degree. And are all the bits being used throughout the entire chain?

The percentage of some color space (the exact figure depends on the calculations, if you want to get anal) tells you how wide the display's gamut is compared to a reference space (usually sRGB or Adobe RGB). That's a bit more useful, especially if you are working with raw data and your output is a wider-gamut device like a modern inkjet. There's the gamut of the display, and then there's the number of bits one can use to divide those numbers up; the two are separate things.
hjulenissen (Sr. Member, Posts: 1683)
« Reply #25 on: January 01, 2014, 06:05:07 AM »


Quote
As to the billions of colors, that's mostly marketing speak so don't let it impress you too much. On paper, a 10-bit, 12-bit, 14-bit encoding can create a number of colors, not that we could see anything like that number of colors. It is useful to have more bits, to a degree. And are all the bits being used throughout the entire chain?
And are the image processing steps along the chain using "proper" dithering (whatever that is for images), or are they just throwing away the LSBs after e.g. multiplications?
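A toy example of the difference -h is asking about (a sketch of my own, using a simple triangular-PDF dither, not any particular application's pipeline): requantizing a shallow gradient to 8 bits by plain rounding leaves only a handful of distinct code values (banding), while adding roughly one LSB of zero-mean noise before rounding trades the bands for fine noise and preserves the average level.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=8, dither=False):
    """Quantize values in [0, 1] to 2**bits levels. With dither=True,
    add TPDF (triangular) noise of about +/-1 LSB before rounding."""
    levels = 2 ** bits - 1
    y = x * levels
    if dither:
        # difference of two uniforms gives triangular noise in (-1, 1) LSB
        y = y + (rng.random(x.shape) - rng.random(x.shape))
    return np.clip(np.round(y), 0, levels) / levels

# A shallow gradient spanning only a few 8-bit code values:
grad = np.linspace(0.500, 0.510, 4096)
hard = quantize(grad)               # plain rounding: a few flat plateaus (bands)
soft = quantize(grad, dither=True)  # dithered: many more distinct values, no bands
```

Simply dropping LSBs after an intermediate multiplication behaves like the `hard` case, which is why banding can appear even when the source data had plenty of precision.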

-h
mac_paolo (Sr. Member, Posts: 423)
« Reply #26 on: January 01, 2014, 06:35:08 AM »


Quote
Thanks very much for the observations, and kind comments. It is good to know that I did identify a legitimate issue that Apple and Adobe, among others, need to address in the next few years.
Ha! 😊
Apple and Adobe started addressing GPU acceleration years ago, not "in the next few years". ;)

Truth is that proper GPU acceleration support has been a reality since Lion, and even more so since Mountain Lion. However, writing cross-platform software with good GPU acceleration is not easy at all; that's one of the reasons why Photoshop and Premiere are GPU accelerated while Lightroom is not (yet). Lightroom is written radically differently from its two big brothers.

So everything is ready. More mature Cocoa API support could simplify or even automate both thread splitting and GPU acceleration without changing the code that much. *That* could be a step for the next few years, and it wouldn't involve Adobe. Other than that, almost everything is already on the table.

The recent AnandTech Mac Pro review highlighted spotty 4K support, especially when an unsupported 4K display is plugged in, but that's very much due to the lack of real standardization of the term "4K", which is not a single resolution; at best it's a family of resolutions. Most probably the next OS X updates will broaden support for more 4K panels.

Paolo

PS: there's no need to glorify Rodney (who indeed is a respected professional) to emphasize a personal attack towards me. That's all pretty much puerile. As long as someone doesn't report wrong facts (you did, I haven't yet), there's no need to be known worldwide in order to raise a hand and just speak. My mantra is that on forums, before judging, you really should know what you are talking about. On this topic I do, without ever having proclaimed myself anything. Do you?
alan a (Jr. Member, Posts: 96)
« Reply #27 on: January 01, 2014, 04:05:45 PM »


My compliments to Rodney were entirely sincere, and if anything overdue. I purchased the Sony Artisan because of his recommendation, and my last purchase of an NEC monitor was also heavily guided by his recommendations. I'm sure that Rodney is pleased to know that his contributions to the forum are very useful for amateurs like myself who are guided by his recommendations.

I also have never claimed to have any level of expertise. The first paragraph of my opening post in this thread made that quite clear, in which I requested clarification of any misstatements I might have made.

Mac Paolo:

Clearly you do possess expertise in these areas. As you have repeatedly stated and proclaimed in this thread.  :-)  It would, however, be easier to engage in a dialogue on these issues with you if the personal attacks could be left out.  

With that being said thanks very much for referring us to the Anandtech review of the Mac Pro:

http://www.anandtech.com/show/7603/mac-pro-review-late-2013

For anyone interested in the issues discussed in this thread, you should also look at this review as well:

http://www.anandtech.com/show/5998/macbook-pro-retina-display-analysis

The section of the Mac Pro review on Retina displays also includes an excellent explanation of how Retina displays operate, and how they retain sharp and crisp text while changing resolutions. I now better understand your earlier comments, specifically on how Apple's Retina displays can use high resolutions, which would normally result in very small text, while rescaling the text to make it readable. The review of the Retina screen on the MacBook Pro also discusses the same issue.

I'll readily acknowledge that my earlier comments on how monitors look blurry when they change resolutions applied to non-Retina displays. I own a 15 inch MacBook Pro with Retina display, and agree that its behavior is quite different from a standard display, such as the non-Retina display on my 27 inch iMac. I now better understand and appreciate your earlier comments in that regard.

(In addition to the improvements that Apple has already made in how a Retina display can change resolutions while retaining sharp text, I believe it would also be helpful if they included a feature to scale fonts and interface elements on a systemwide basis. Together, that would be an unbeatable combination.)

As you note, the above review explains that these technical innovations were not included in how Apple implemented its support for 4K displays. The review sums it up by saying that "the result is a bit of a blurry mess."  My conclusion is that we should all wait until Apple releases its own proprietary 4k display, and incorporates these type of features in the Apple support for its own display.

If you could also comment on the questions I posed concerning the statements made by Wacom, it would be very helpful. (See my above posting, which I admit was a bit too long)  As I reported, Wacom tech support flatly stated that the full *video* abilities of their 24 HD touch display could not be utilized on an iMac. (The video abilities, and not anything related to the touchpad.) I am still curious as to why that is the case.

Is it due to the video card within an iMac? Or is it due to a limitation of the Thunderbolt/Mini DisplayPort connection that prevents it from fully utilizing all of the video information from the full-sized DisplayPort output on the Wacom?

(Recall that Wacom tech support said that to fully utilize the video abilities of the 24HD Touch display, the DisplayPort output must be used, and not the DVI output. At the same time, Wacom recommended using a DVI to Mini DisplayPort adapter with an iMac, rather than a DisplayPort to Mini DisplayPort adapter. That recommendation is counterintuitive, since Wacom says you must use the DisplayPort output on their monitor to fully utilize its video capabilities, while they turn around and recommend using a DVI adapter on an iMac. The result is a confusing set of statements and recommendations, which further obscures what the actual problem is and why an iMac can't reproduce the full video capabilities of the top-of-the-line Wacom.)

Does the Mac Pro address these issues? If so, does it do so through its more advanced video cards, or through the new Thunderbolt 2 connection that is discussed in the Mac Pro review?

Again, I appreciate your comments and for referring us to the Mac Pro review. It helped explain many issues discussed in this thread. And helped me to better understand your earlier comments.

I would appreciate any additional explanation and clarification you can provide, as well as any correction of any misstatements I may have made :-)
« Last Edit: January 05, 2014, 03:55:27 AM by alan a »
jerryrock (Sr. Member, Posts: 565)
« Reply #28 on: January 01, 2014, 06:00:58 PM »


Alan,
Here is a link explaining the capabilities of the different video connections (if that helps).

http://reviews.cnet.com/8301-33199_7-57614748-221/hdmi-vs-displayport-vs-dvi-vs-vga-which-connection-to-choose/

The Wacom Cintiq 24HD Touch has a maximum resolution of 1920 x 1200. That resolution is supported by HDMI, DVI, and DisplayPort. Anything above that resolution would require dual-link DVI (for example via a dual-link Mini DisplayPort to DVI adapter), HDMI, or DisplayPort.
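Those limits come from the link's pixel clock, and can be sanity-checked with back-of-the-envelope arithmetic. In this sketch of mine, the 165 MHz figure is the well-known single-link TMDS ceiling, while the ~12% blanking overhead is an assumption on my part (reduced-blanking timings are roughly in that range; the exact figure depends on the CVT/DMT timing actually used):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.12):
    """Approximate pixel clock for a display mode, in MHz.
    blanking_overhead is an ASSUMED fraction added for horizontal and
    vertical blanking; the exact value depends on the timing standard."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

SINGLE_LINK_DVI_MAX_MHZ = 165.0  # single-link TMDS pixel-clock ceiling

for mode in [(1920, 1200, 60), (2560, 1440, 60)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "single-link OK" if clk <= SINGLE_LINK_DVI_MAX_MHZ else "needs dual-link or DisplayPort"
    print(mode, round(clk, 1), "MHz:", verdict)
```

With these assumptions, 1920 x 1200 at 60 Hz lands just under the single-link ceiling, which matches jerryrock's point that anything above it needs dual-link DVI, HDMI, or DisplayPort.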

I run a Wacom Cintiq 20WSX and a 24" HP LP2480zx DreamColor monitor from the ATI 5700 video card in my old 2006 Mac Pro. The card has both a DVI connector and two Mini DisplayPorts. The Cintiq is hooked directly to the card's DVI port and its max resolution is 1680 x 1050 @ 67 Hz. The DreamColor monitor is connected with a standard Mac Mini DisplayPort to DVI adapter and runs 1920 x 1200 @ 60 Hz.

Gerald J Skrocki
skrockidesign.com
mlewis (Newbie, Posts: 39)
« Reply #29 on: January 02, 2014, 06:26:27 AM »


Quote
That unit produces 1.07 billion colors and 97% of Adobe RGB. (*** See footnote below.)  Unless I misunderstood him, Wacom tech support said that to fully deliver this range of color and RGB coverage, the full sized displayport cable and output must be used -- and not DVI, which is also included.
To get 1 billion colours you would have to be using a 10-bit per channel video output, not the standard 8-bit. Support for this isn't great at the moment, and only the AMD FirePro video cards support it (via DisplayPort); I don't think the Nvidia Quadro cards do. No graphics card outside the workstation class supports 10-bit video output.

The only Apple computers with such video cards are the new Mac Pros, so only they could possibly run the Wacom in this way. The issue is the video hardware, not DisplayPort vs Mini DisplayPort, which is just a different connector size.
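The arithmetic behind those headline numbers is simple, and this little sketch just makes the counts explicit (the function name is my own, for illustration):

```python
def addressable_colors(bits_per_channel):
    """Distinct RGB triplets addressable at a given per-channel bit
    depth. Note this says nothing about gamut: it only counts how
    finely the gamut (whatever it is) can be subdivided."""
    return (2 ** bits_per_channel) ** 3

print(addressable_colors(8))   # 16,777,216: the "16.7 million colors" spec
print(addressable_colors(10))  # 1,073,741,824: the "1.07 billion colors" spec
```

So "1.07 billion colors" is exactly the 10-bit figure, and "16.7 million" exactly the 8-bit one; the marketing numbers describe the precision of the pipeline, not the gamut.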
jerryrock (Sr. Member, Posts: 565)
« Reply #30 on: January 02, 2014, 08:00:15 AM »


Quote
To get 1 billion colours you would have to be using a 10bit per channel video outlet, not the standard 8bit.  Support for this isn't great at the moment and only the AMD FirePro video cards (I don't think the Nvidia Quadro cards do) support it at the moment (via DisplayPort).  No graphics card that isn't in the workstation class supports 10bit video output.

The only Apple computers with such video cards are the new Mac Pros so only they could possibly run the Wacom in this way.  The issue is the video hardware, not DisplayPort vs mini DisplayPort which is just a different connector size.

I did miss that point, but 10-bit output is not limited by the video hardware alone; it must also be supported by the operating system, and the Mac operating system has not historically supported 10-bit video output. There are rumors that OS X 10.9 does support 10-bit output, but I have yet to see any confirmation from Apple.


alan a (Jr. Member, Posts: 96)
« Reply #31 on: January 02, 2014, 02:23:17 PM »


Many thanks to mlewis and Jerry for responding to my questions. So, even if the Mac Pro has the necessary video cards, the Mac OS doesn't support the advertised one billion colors at the present time? Shame on Wacom, as they advertise and push this very heavily, without any caveats on their web site as to what is required to utilize the monitor at that level. I have seen postings in other forums from potential Wacom customers asking what they would lose by buying the monitor that is one step down (the 22HD), which has the same color support as the Apple Thunderbolt display. The question is not what they are losing, but whether they can even utilize the full abilities of the 24HD Touch; you can't lose something you never had in the first place, as almost all of these potential customers would be using 8-bit video output, based on the explanation from mlewis.

But I suppose most advanced amateurs who buy it plug it in through DVI, with current video cards, and assume they are magically looking at 1 billion colors -- when they are not. Nothing in the manual speaks to any of these issues, so no one would be the wiser or have any idea whether they are looking at 1 billion colors -- or not.

One question is whether, if we compared an Apple or NEC monitor running at 8 bits and 16.7 million colors, with a Wacom running at 10 bits and one billion colors -- could we tell the difference?
« Last Edit: January 02, 2014, 02:36:48 PM by alan a »
Czornyj (Sr. Member, Posts: 1422)
« Reply #32 on: January 02, 2014, 11:46:42 PM »


Quote
One question is whether, if we compared an Apple or NEC monitor running at 8 bits and 16.7 million colors, with a Wacom running at 10 bits and one billion colors -- could we tell the difference?

Only on some mean gradients. Here's a NEC 10-bit test with example objects where the difference is quite obvious:
http://www.necdisplay.com/documents/Software/NEC_10_bit_video_Windows_demo.zip
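You don't need the demo file to see why gradients are the worst case. A quick count (my own back-of-the-envelope sketch, not part of the NEC test) shows how few code values an 8-bit pipeline can place across a shallow gradient, and how much finer 10-bit slices it:

```python
import numpy as np

def gradient_levels(width_px, span, bits):
    """Count the distinct code values available across a horizontal
    gradient covering `span` of the 0..1 signal range at a given
    per-channel bit depth."""
    codes = np.round(np.linspace(0.0, span, width_px) * (2 ** bits - 1))
    return len(np.unique(codes))

# A gradient covering 10% of the signal range across a 2560-pixel screen:
print(gradient_levels(2560, 0.10, 8))   # only a couple dozen steps: wide visible bands
print(gradient_levels(2560, 0.10, 10))  # roughly four times as many: much finer bands
```

With only a couple dozen levels spread over 2560 pixels, each band is on the order of a hundred pixels wide, which is exactly the kind of stepping the NEC demo makes visible.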

« Last Edit: January 02, 2014, 11:50:12 PM by Czornyj »

hjulenissen (Sr. Member, Posts: 1683)
« Reply #33 on: January 03, 2014, 01:39:13 AM »


According to Poynton, 460 nonlinearly distributed steps (about 9 bits) are sufficient to encode a 100:1 contrast image without banding, versus 9900 linearly distributed steps (about 14 bits). The 8 bits used in common gamma-encoded formats (BT.709) are sufficient for about 50:1 contrast. How many bits are our cameras effectively able to record? I do understand that people may want to edit their raw files heavily, and that color management in your computer may stress any quantization between computer and display (especially one that is not dithered), but is this really a problem outside of synthetic benchmarks?

http://www.poynton.com/PDFs/GammaFAQ.pdf (Ch. 13)
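Poynton's figures can be roughly reproduced from a ~1% Weber fraction; the threshold value here is my assumption, and his exact derivation differs slightly. With ratio-spaced (gamma-like) steps each code value multiplies luminance by 1.01, while with linear steps the one fixed step size must stay invisible at the darkest level:

```python
import math

def steps_needed(contrast_ratio, weber=0.01):
    """Code values needed to span a contrast range with no step
    exceeding the assumed visibility threshold (Weber fraction)."""
    # ratio-spaced coding: (1 + weber)**N = contrast_ratio
    nonlinear = math.log(contrast_ratio) / math.log(1.0 + weber)
    # linear coding: fixed step must be < weber * the darkest level
    linear = contrast_ratio / weber
    return round(nonlinear), round(linear)

print(steps_needed(100))  # about (463, 10000), close to Poynton's 460 and 9900
```

The ~463 nonlinear steps fit in 9 bits and the ~10000 linear steps need 14 bits, which is the point of his comparison: perceptual (gamma) coding buys roughly 5 bits.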