Author Topic: Test of monitor calibrators  (Read 9820 times)
stefohl
Jr. Member
Posts: 59
« on: October 04, 2010, 03:38:34 PM »

I've seen so many examples of bad monitor calibrators lately that I decided to do a test. I have a friend working at Eizo Sweden, so I could test my calibrators against their Minolta CA-210. As basICColor Display supports this instrument, we could calibrate a monitor and then verify the profile with that software. We then verified the monitor with my EyeOne Pro; the results were about the same. We then invited 15 photographers to come to us with their calibrators, and the results were not impressive. Out of these 15 calibrators, 5 gave really bad results. The worst showed an average Delta E of 30 and a max of 50; the others showed results more like an average Delta E of 15 and a max of 25. Of the rest, three gave an average around 10 and a max of 15. So, out of 15 calibrators, 7 showed good results and 8 gave results ranging from not so good to really bad.

The EyeOne Pros that showed up all gave good results. Maybe not so surprising, considering that the profile we validated was created with an EyeOne Pro. Of the 8 that didn't pass the test, there were four EyeOne Displays, three Spyder IIIs and one Monaco Optix. Not a very scientific test, but rather distressing.
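For anyone wanting to reproduce the arithmetic, the average/max figures above are just statistics over per-patch colour differences. A minimal sketch using the simple CIE76 formula; the Lab readings are made up for illustration, not values from our test:

```python
import math

def delta_e76(lab_ref, lab_test):
    """CIE76 colour difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((r - t) ** 2 for r, t in zip(lab_ref, lab_test)))

# Hypothetical Lab readings of the same patches from the reference
# instrument and a calibrator under test (values invented for illustration).
reference = [(95.0, 0.2, -0.5), (50.1, 60.2, 40.3), (30.0, -20.1, -25.4)]
candidate = [(94.1, 1.0, 0.8), (48.0, 55.7, 45.1), (33.2, -15.0, -30.2)]

diffs = [delta_e76(r, c) for r, c in zip(reference, candidate)]
print(f"average dE: {sum(diffs) / len(diffs):.1f}, max dE: {max(diffs):.1f}")
```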
Stefan Ohlsson
Projektor
www.projektorutbildning.se
Mark D Segal
Contributor
Sr. Member
Posts: 6938
« Reply #1 on: October 04, 2010, 04:12:00 PM »

You know, there is a slew of considerations at play in all of this. First, there is variance between instruments. Second, instruments can degrade over time. Third, they should all be at the same temperature when making comparative readings, because temperature affects their performance. Fourth, some older colorimeters may not be well adapted to evaluating the latest generation of wide-gamut displays. Fifth, is the parameter setting done using DDC or OSD controls? That can affect the consistency of the test between instruments. Sixth, turning to the software: while the internal validation of an application such as basICColor or ColorEyes Display is fine as far as it goes, it does not go far enough. One wants an independent application generating a completely different set of values to be displayed and measured, with the dE calculated, in order to validate "from the outside" whether the internal validation of the calibration packages is reliable. This can be done with BabelColor's PatchTool application, which I highly recommend, having used it intensively when I had display-management issues (now resolved). It's a very useful analytic application. I should add that profile quality itself is obviously a key factor in the outcomes: one generally prefers LUT over matrix profiles for accuracy, unless the display has very linear performance.
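That "outside" validation loop can be sketched in a few lines. This is a hypothetical sketch, not PatchTool's actual code: `measure_patch` and `predicted_lab` are stand-ins for the instrument driver and the ICC profile lookup.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference between two L*a*b* triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def validate_display(patches, measure_patch, predicted_lab):
    """For each RGB patch: measure it on screen, look up what the profile
    predicts, and return (average dE, max dE) over the whole set."""
    errors = [delta_e76(measure_patch(rgb), predicted_lab(rgb))
              for rgb in patches]
    return sum(errors) / len(errors), max(errors)
```

The key point is that the patch set is independent of whatever the calibration package used internally, so a systematic error in the package's own validation cannot hide itself.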
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml
stefohl
Jr. Member
Posts: 59
« Reply #2 on: October 04, 2010, 04:34:03 PM »

Of course there is variance between instruments, and they do degrade over time. That is exactly what we wanted to test: can we trust the calibrators we are using? That is also why we didn't test new instruments, but invited users to test their own calibrators. Some of these were quite new; others were 3-4 years old.

And we didn't test which instrument gave the best profile; we just compared the validation results from one calibrator that we knew gave good results with the validation results we got from the other instruments. Of course we could have used a tool like PatchTool or UDACT, but I don't think that would have changed anything. We would still probably have found that out of 15 instruments, 8 weren't good.

Mark D Segal
Contributor
Sr. Member
Posts: 6938
« Reply #3 on: October 04, 2010, 04:35:34 PM »

From the sound of it, probably more than eight aren't really trustworthy. It's an interesting exercise.
probep
Full Member
Posts: 149
« Reply #4 on: October 05, 2010, 05:22:13 AM »

Very interesting, but there are some questions.
What display(s) did you calibrate? Wide gamut, or an sRGB CCFL-backlit display?
Could you name the 7 good sensors?
stefohl
Jr. Member
Posts: 59
« Reply #5 on: October 05, 2010, 06:52:15 AM »

We used an Eizo CG221, which is a wide-gamut monitor with a CCFL backlight, and an Eizo CG211, which doesn't cover the whole Adobe RGB gamut. But it made no difference which monitor we used; the results were about the same.

stefohl
Jr. Member
Posts: 59
« Reply #6 on: October 05, 2010, 07:07:12 AM »
Quote
Could you name 7 good sensors?

No, and that is perhaps the worst problem. The only calibrator for which all samples passed the test was the EyeOne Pro; we had 4 of those during the test. We didn't have a ColorMunki, but will test one during the week.
probep
Full Member
Posts: 149
« Reply #7 on: October 05, 2010, 09:20:18 AM »

Quote
No, and that is perhaps the worst problem. The only calibrator where all samples passed the test was EyeOne Pro. We had 4 of those during the test. We didn't have a ColorMunki, but will test one during the week.
Thank you. The ColorMunki's accuracy is a highly interesting subject.
Czornyj
Sr. Member
Posts: 1420
« Reply #8 on: October 05, 2010, 02:24:02 PM »

Quote
Thank you. the ColorMunki accuracy is highly interesting subject.
To my eye it seems to be as good as my i1Pro. I've been playing with a few ColorMunki units, and all of them gave results similar to my spectro.
stefohl
Jr. Member
Posts: 59
« Reply #9 on: October 06, 2010, 05:10:45 PM »

Quote
Thank you. the ColorMunki accuracy is highly interesting subject.

Today I tested two ColorMunkis and I'm happy to report that both of them were good, with an average Delta E < 2 and a max < 4 when I tested them on an Eizo CG221, a monitor with a gamut about the same as Adobe RGB. It seems the spectros are showing good results, both EyeOne Pros and ColorMunkis; the colorimeters are much more unstable.
neil snape
Sr. Member
Posts: 1432
« Reply #10 on: October 07, 2010, 02:32:46 AM »

I find the ColorMunki very reliable and consistent.

I am currently testing some new software, and the results are marred on the HP 2480zx.

I have the X-Rite HP APS for this monitor, and it just plain doesn't work.

Somehow I suspect that this monitor doesn't play well with the video card. Sometimes, when the samples are sent, the screen flashes between samples; the same happens on power-on, wake-up, etc.

So your test above is a fine example of how the variables can come into play.

It doesn't qualify the instruments themselves, though; the variables noted above tell me that there will continue to be bad monitor calibrations because of drivers and hardware that don't work.

I wish I had an Eizo; they are still striving for a standard. They also allow multiple devices with their software, a very good thing.
Baxter
Jr. Member
Posts: 77
« Reply #11 on: October 08, 2010, 04:27:18 PM »

I've been using an Eye-One Display since March 2004 and have recently been wondering what sort of improvement I'd see by replacing it with a more modern device. I have an Eizo SX2761W and a 3.5-year-old MacBook Pro to calibrate. Primary images come from a P45+.

Printer-wise I use an Epson 7800 and a Mitsubishi 9550 dye-sub, for which I have had custom profiles made. Making printer profiles for paper isn't really something I need to do.

Two questions:
1. Will a current device produce noticeably better results?
2. If so, which ones should be on my shortlist?

Many thanks

Bax
stefohl
Jr. Member
Posts: 59
« Reply #12 on: October 09, 2010, 02:31:21 AM »


Quote
1. Will a current device produce noticeably better results?

Of the 8 or 9 EyeOne Displays that I have tested, 50% had problems, with an average Delta E over 10. This isn't to say that 50% of all EyeOne Displays are bad, as many of them were brought in precisely because they had problems. I've seen older EyeOne Displays that gave good results and new ones that didn't. It seems they are sensitive to both humidity and heat.

Quote
2. If so, which ones should be on my shortlist please?

The spectros that we've tested have shown better results than the colorimeters. Of the colorimeters, the DTP-94 has shown the best results. A colorimeter that I haven't tested is the new basICColor Discus, but it looks very promising.

Lednam
Newbie
Posts: 1
« Reply #13 on: October 09, 2010, 11:20:15 AM »

This is interesting! The question that rings in my head right now is whether I can do some visual tests to confirm that my calibration is somewhat right. Are there any good evaluation images out there? Thanks in advance!
WombatHorror
Sr. Member
Posts: 299
« Reply #14 on: October 27, 2010, 11:22:05 PM »

Quote
Today I tested two ColorMunkis and I'm happy to report that both of them were good, with an average Delta E < 2 and a max of < 4 when I tested them on a Eizo CG221, a monitor with a gamut about the same as Adobe RGB. It seems like the spectros are showing good results, both EyeOne Pros and ColorMunkis. The colorimeters are much more unstable.

None of the colorimeters you used work on wide-gamut displays (at least not without varying degrees of custom compensation). You need to test them on a standard-gamut display.

There is a link floating around where someone tested a whole slew of probes. The i1Pro was fairly decent (aside from shadows) on all monitor types. The DTP94b was fairly decent on CCFL sRGB screens. The early Spyder3 and the off-the-shelf i1D2 were brutal on CCFL sRGB, and forget it on other types. The later Spyder3s were mediocre.

(The special NEC-version i1D2s are factory calibrated; they are not the same as the off-the-shelf i1D2.)
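The "custom compensation" mentioned above often takes the form of a 3x3 correction matrix, as ArgyllCMS does with its .ccmx files, mapping a colorimeter's raw XYZ readings on a given backlight type closer to what a reference spectrometer would read. A sketch with made-up matrix values:

```python
def apply_ccmx(matrix, xyz):
    """Multiply a raw XYZ reading by a 3x3 correction matrix."""
    return tuple(sum(row[i] * xyz[i] for i in range(3)) for row in matrix)

# Made-up correction values; real matrices are derived per display type by
# comparing the colorimeter against a reference spectrometer on that display.
correction = [
    [1.04, -0.02, 0.01],
    [0.01,  0.98, 0.02],
    [0.00,  0.01, 1.05],
]
print(apply_ccmx(correction, (92.0, 100.0, 108.0)))
```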

Steve Weldon
Sr. Member
Posts: 1460
« Reply #15 on: October 28, 2010, 04:26:12 AM »

I just want to be sure: when measuring Delta E, what software are you using and how are you measuring it?

I don't understand this measurement as well as I'd like. I've always understood it to be the difference between target and actual colors. I usually look at the SpectraView II Information Display window and take the number directly off the window in the White Point area, where target, calibrated, and Delta E are listed.

I ask because with the NEC colorimeter my numbers range from 0.03 for sRGB emulation to 0.65 at the other end, using the widest gamut, such as for photo editing. It seems the further away from sRGB I get, the more the number rises. With the i1D2 colorimeter I get very close to the same, in no case above 0.8.

Am I reading this wrong? Your numbers of 30-50 have got me asking.
----------------------------------------------
http://www.BangkokImages.com
Czornyj
Sr. Member
Posts: 1420
« Reply #16 on: October 28, 2010, 11:10:22 AM »

There's a difference between SpectraView II and basICColor Display validation: SVII only validates the greyscale, while basICColor also validates a bunch of colors.
digitaldog
Sr. Member
Posts: 9100
« Reply #17 on: October 28, 2010, 11:27:50 AM »

There are a couple of things that would make the testing more solid (scientific). Did you find out the revision of each unit? They differ in how they measure data; IOW, a Rev A device and a Rev D device may not correlate. For example, we found that a Rev E iSis produced quite different data than all previous iSis units because it had been updated. Did you know that different companies come up with the final data differently? It seems shocking at first, but this is one reason we recently saw X-Rite introduce a new metric (the XRGA standard). So while it was useful to use a better reference-grade unit (the Minolta), there could be some differences here just due to the way two companies report the data. Having a higher-end X-Rite reference instrument would make the test results a bit more accurate.
Quote
There's a difference between SpectraviewII and basICColor Display validation - SVII only validates the greyscale, while basICColor also validates a bunch of colors:
Can you specify the colors, or are they a fixed set? Being able to specify the colors is useful. Many “validation” processes we see are fixed, and often the companies select colors for which it is easy to produce low deltas.

Lastly, while we all hope our instruments produce “accurate” and, more importantly, consistent data, for a display the ultimate goal, and one that's very difficult to measure in terms of success, is how well the prints and the display visually match.
Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
stefohl
Jr. Member
Posts: 59
« Reply #18 on: October 28, 2010, 11:30:04 AM »

Quote
None of the colorimeters you used works on wide gamut displays (at least not without varying degrees of custom compensation). You need to test them on a standard gamut display.

There is a link floating around where someone tested a whole slew of probes.

We have done the test on both the Eizo CG211 and the CG221. The 221 has a gamut about the same as Adobe RGB; the 211 has a smaller gamut. We have now tested more than 20 calibrators, new and old: EyeOne Displays, Spyder IIIs, ColorMunkis, DTP94s and EyeOne Pros. On average, about 30-40% of the Displays and Spyders failed, and they failed on both the small-gamut and the wide-gamut monitors. It would be interesting to read the link you are referring to.

Stefan
stefohl
Jr. Member
Posts: 59
« Reply #19 on: October 28, 2010, 11:34:53 AM »

Quote
I just want to be sure.. when measuring Delta-E.. what software are you using and how are you measuring Delta E?


We used ColorNavigator's simple validation and compared the values we got from our reference calibrator with the values we got from the test unit. We also used basICColor's software and did the same thing, because it supports the Minolta CA-210 that we used as our reference.