Author Topic: Phase One vs. Adobe Camera RAW comparison  (Read 6336 times)
teddillard
Sr. Member
Posts: 664
« on: March 27, 2010, 08:12:55 AM »
FINALLY got this up, comparing image quality of C1Pro with ACR, using Imatest data.  

http://www.h2hreviews.com/article/Head-to-...re-One-Pro.html
Ted Dillard
Mark D Segal
Contributor
Sr. Member
Posts: 7126
« Reply #1 on: March 27, 2010, 09:08:17 AM »
Quote from: teddillard
FINALLY got this up, comparing image quality of C1Pro with ACR, using Imatest data.  

http://www.h2hreviews.com/article/Head-to-...re-One-Pro.html

Ted, thanks very much for posting this work - very interesting. Of course, at present, anyone using a Phase back needs to use C1 for raw processing. I was particularly interested in your findings about sharpening: the "hump" that is more apparent in the C1 files. Is this beyond the user's control, or does turning off all the sharpening we can control get rid of it?
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....." http://www.luminous-landscape.com/reviews/film/scanning_workflows_with_silverfast_8.shtml
Schewe
Sr. Member
Posts: 5544
« Reply #2 on: March 27, 2010, 09:30:50 AM »
Quote from: teddillard
FINALLY got this up, comparing image quality of C1Pro with ACR, using Imatest data.


Kinda too bad you are doing this now with Camera Raw 5.x when it's already at the end of its life... LR 3 and ACR 6 will be changing the rules of the game. Next time I wouldn't wait till the end of the game to do comparisons...
teddillard
« Reply #3 on: March 27, 2010, 12:39:31 PM »
Mark, I'm not sure; when I get a chance I'd like to go back and mess around with the sharpening settings and run them through Imatest again.

Jeff, thanks.  Yeah, I'll get right on it...  next time my sock drawer, toenail polish and bonbons will just have to wait...  
Mark D Segal
« Reply #4 on: March 27, 2010, 01:00:32 PM »
Ted, not sure how much "messing around" is really needed. I would think it is just a matter of disabling sharpening in C1 and looking at what Imatest does. According to the C1 manual, page 116, there is "no sharpening applied behind the scenes". I take that to mean that if the user disables sharpening there is none. And if that's the case, your results raise a question about whether you had sharpening enabled or disabled in your test.
teddillard
« Reply #5 on: March 27, 2010, 01:59:57 PM »
Quote from: Mark D Segal
Ted, not sure how much "messing around" is really needed. I would think it is just a matter of disabling sharpening in C1 and looking at what Imatest does. According to the C1 manual, page 116, there is "no sharpening applied behind the scenes". I take that to mean that if the user disables sharpening there is none. And if that's the case, your results raise a question about whether you had sharpening enabled or disabled in your test.

Both were used at their "default" levels of sharpening.  Actually, everything else was at the default level too...  

By "messing around" I mean, trying to zero out the sharpening in both, and then playing with some visually equivalent levels in both, then running the tests again.
Mark D Segal
« Reply #6 on: March 27, 2010, 02:16:53 PM »
Well, if that's the case, I don't know what one can conclude from any of this. We don't know whether you are detecting real quality differences in the basic algorithms of the applications, or differences due to the possibility that the "defaults" don't mean the same thing between the two, such that with a bit of tweaking here and there one wouldn't notice any significant difference between them.

As for "defaults", I pay zero attention to them in any of these applications. All they are is the group of settings the manufacturer thinks may please most of the people most of the time. That has no other significance, and isn't necessarily a scientific basis for comparative testing.

Perhaps a better place to start is with everything zeroed, or whatever is meant to be the null-adjustment point in each application. See what they deliver on that basis, then make equal percentage changes in a variable and see how they react. The main point here is that one must be able to normalize the starting points and the change options on some known equivalent basis to do systematic comparative testing.
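The one-variable-at-a-time procedure described here could be sketched roughly as follows. This is a hypothetical harness: `fake_render` and `fake_measure` are stand-ins for a converter invocation and an Imatest-style MTF50 measurement, not a real Capture One, ACR, or Imatest API.

```python
# Hypothetical harness for the one-variable-at-a-time method: every run
# starts from the same zeroed baseline, and only one setting is swept.
# render/measure are stand-ins, NOT a real converter or Imatest API.

BASELINE = {"sharpening": 0, "noise_reduction": 0, "contrast": 0}

def sweep(render, measure, variable, steps):
    """Render at each value of one setting (all others held at baseline)
    and record the metric - e.g. MTF50 - for each rendering."""
    series = []
    for value in steps:
        settings = dict(BASELINE, **{variable: value})
        series.append((value, measure(render(settings))))
    return series

# Stand-in "converter": pretends measured MTF50 rises with sharpening.
fake_render = lambda settings: settings
fake_measure = lambda rendered: 1000 + 5 * rendered["sharpening"]

print(sweep(fake_render, fake_measure, "sharpening", [0, 50, 100]))
# [(0, 1000), (50, 1250), (100, 1500)]
```

Running the same sweep against each converter from the same zeroed baseline is what makes the resulting curves comparable.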
teddillard
« Reply #7 on: March 27, 2010, 02:36:45 PM »
Right- it's a pretty standard issue with testing processing software.  There are several approaches, but the "default" approach gives you at the very least an understanding of where the manufacturer puts you as a starting point, and how the camera files are treated.  If that default method brings you closer to where you want to be at the outset, you have less work to do.  

If you were to try to compare two packages at "best advantage", it would turn very quickly into a bottomless pit of subjective issues and processing methods. Not only does it become a skill-dependent issue, it also becomes an ideological debate (for lack of a better word) - you could easily, for example, take issue with my preferred method of gray balancing (or sharpening), based on your overall processing strategy.

To put it differently, it would be a really interesting review to take files from two "masters" of their processors - people who know the software and how to get the most out of it - and then compare the files. Even that, though, is a tough situation to draw conclusions from, since what you would really be testing is their tastes and preferences in how a file should be processed, not what each processor does and why...
Mark D Segal
« Reply #8 on: March 27, 2010, 02:57:09 PM »
Yes indeed. Andrew Rodney's "Iron Chef" panel, hosted at several PPE shows, did exactly this: the raw processors and their developers were on stage to show how to get the best out of their wares. Having been on the panel once and seen it, what do I (and undoubtedly many others) conclude from that? All of it is pretty good software, and users who know what they're doing can get what they want out of it. That in itself is actually quite meaningful, as long as one doesn't get too hung up on "picking winners".

For all these reasons I think it's still useful to do the comparisons with all the settings as "neutered" as possible. PERHAPS that gets one closer to a non-manipulated outcome that isolates the softwares' "intrinsic rendering qualities", though even here I wouldn't be 100% certain, in the sense that "neutral" for one program may not be the same thing for the next. And in a practical sense, what matters most is what the programs can do in skilled hands (which isn't defaults). These comparisons are hard to do if we really want to isolate intrinsic capability from user skill.
Schewe
« Reply #9 on: March 27, 2010, 09:45:01 PM »
Quote from: teddillard
Jeff, thanks.  Yeah, I'll get right on it...  next time my sock drawer, toenail polish and bonbons will just have to wait...  


Well, considering that the Lightroom 3 Beta 2 gives a strong indication that Lightroom 3 and Camera Raw 6 will be leapfrogging the current raw processing software, doing a comparison of something that as of April 12th won't even be sold by Adobe is more than somewhat old news...

Sorry bud...but doing a head to head with something this out of date is really not very useful. BTW, didn't know you wore toenail polish...
teddillard
« Reply #10 on: March 28, 2010, 07:36:48 AM »
Mark, you got me thinking more about the "neutral" processing model...  not sure that tells you anything at all.  As opposed to the default settings, where you're looking at the manufacturer's idea of a good rendering, using baseline settings (like sharpening, eg) and camera profiles (of various types), OR the "best result" processing of the "Iron Chef" approach, you're just seeing it with nothing...  arguably the "worst case" result.  I'm just not sure what value that is to anyone.  In a car analogy, it would be like doing a review of two cars stripped down to the wheels and the motor, wouldn't it?  Seems like you test it out of the box...  then maybe you let the Stig drive it?  

I'd be interested to hear more about why you think this would be valuable...  

@ Jeff. meh, bud.  ("THIS out of date"?  ...interesting.) Maybe Adobe is in for a landmark remake (pitch duly noted, thanks), but Capture One has just been going through more incremental developments, with the notable release of 5.1.1. While I'm sure anyone who's anyone will run out and get CS5 and LR3, there are a lot of people out there who are still on earlier versions. And you have to start somewhere... at the very least, this is a baseline for when Adobe does release CS5.
« Last Edit: March 28, 2010, 07:38:44 AM by teddillard »
Mark D Segal
« Reply #11 on: March 28, 2010, 09:19:05 AM »
Quote from: teddillard
Mark, you got me thinking more about the "neutral" processing model...  not sure that tells you anything at all.  As opposed to the default settings, where you're looking at the manufacturer's idea of a good rendering, using baseline settings (like sharpening, eg) and camera profiles (of various types), OR the "best result" processing of the "Iron Chef" approach, you're just seeing it with nothing...  arguably the "worst case" result.  I'm just not sure what value that is to anyone.  In a car analogy, it would be like doing a review of two cars stripped down to the wheels and the motor, wouldn't it?  Seems like you test it out of the box...  then maybe you let the Stig drive it?  

I'd be interested to hear more about why you think this would be valuable...

Ted, it's primarily a matter of scientific method: start from a comparable base, and test for one variable at a time. Because we have no idea whether Phase defaults are the same as Adobe defaults, both of these conditions are violated. If "no adjustments" in each program is more likely to mean the same thing, the starting point is comparable. Run Imatest for accuracy, noise and resolution in that state. We're distinguishing here between "pleasing" and "technical quality of conversion". Then, if one wants to take it further into "pleasing", one can test for that by making comparable edits in each program, one at a time, to see how they react.

The stripped down car analogy is not a very good one. All the data is there in the un-edited state, and just waiting for the user to craft the image as the user sees fit. I find this a more sure-footed way of getting to where I wish to be, while other people may prefer altering a manufacturer's concept on the way to doing that. That's a matter of workflow preference, not to be confused with the issue of testing methodology.
teddillard
« Reply #12 on: March 28, 2010, 09:57:58 AM »
Quote from: Mark D Segal
Ted, it's primarily a matter of scientific method: start from a comparable base, and test for one variable at a time.

Understood... and I agree, except that when testing software, and especially processing, it's a much harder target to hit than testing, say, lenses (which is what I should be working on right now...). As we often see in straight ICC camera profiling, the dead-nuts-accurate profile is not often what we end up with (although in some cases it's a good starting point), and is often impossible anyway, since shooting environments vary so much, even in the studio. (The X-Rite Passport, interestingly, helps with that because it is so easy and fast to use, esp. in Lightroom...)

The point being, what we're really looking for out of software is what you're able to get out of it, not a "by the numbers" analysis of each control at baseline.  Again, the point of the default evaluation is more about how fast and easily you can get to where you want to be with each package, not so much what is running under the hood.  

I still like my car analogy.      I'm not able to tell you what the user can get out of the software because I don't know how good the user is, but I can tell you what the software does before the user starts the engine...

(apparently I'm having an emoticon overload this morning...    )
« Last Edit: March 28, 2010, 09:59:28 AM by teddillard »
Mark D Segal
« Reply #13 on: March 28, 2010, 10:03:01 AM »
I guess this points to the need to be clear from the get-go about what one is testing for. Is one testing for the basic rendering quality of the software, or for how easy it is to get to "pleasing", and "how pleasing is pleasing"? All different, but in some ways related, issues.
bjanes
Sr. Member
Posts: 2882
« Reply #14 on: March 28, 2010, 10:45:02 AM »
Quote from: teddillard
Right- it's a pretty standard issue with testing processing software.  There are several approaches, but the "default" approach gives you at the very least an understanding of where the manufacturer puts you as a starting point, and how the camera files are treated.  If that default method brings you closer to where you want to be at the outset, you have less work to do.
The forum Rottweiler's sarcasm aside, many of us appreciate your efforts in making the comparison. It would be easy for you to update the results when the new version of ACR comes out, since you already have the test shots.

As you point out, the issue of defaults vs optimal settings is problematic. For example, if you accept ACR's defaults, the dynamic range of your camera will be severely limited, since the default black point clips the shadows rather severely. Here is what I observed with the Nikon D3 with the default tone curve and the same curve except for black = 0.

[attachment=21115:D3_Stouf...s_Step_2.png][attachment=21114:D3_Stouf...0_Step_2.png]

The hue shifts in blue to magenta that you observed are rather dramatic. With the D3 and the Adobe Standard profile, they are less marked. Is this the camera or the profile and tone curve? Using a custom DNG profile and setting the tone curve to linear results in more accurate color, but the low saturation might not be pleasing. As you point out, many photographers like increased saturation, but hue shifts are generally not welcome.

[attachment=21116:CompProf2.png]
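The blue-to-magenta drift can be put in numbers as a change in CIELAB hue angle, h = atan2(b*, a*), between the reference and rendered patch values. A small sketch - the a*/b* values below are made-up illustrations, not measured D3 or ColorChecker data:

```python
# Quantifying a hue shift as the change in CIELAB hue angle,
# h = atan2(b*, a*). The a*/b* numbers are made-up illustrations.
import math

def hue_angle(a, b):
    """CIELAB hue angle in degrees, normalized to 0-360."""
    return math.degrees(math.atan2(b, a)) % 360

def hue_shift(ref, rendered):
    """Smallest signed hue-angle difference, in degrees."""
    d = hue_angle(*rendered) - hue_angle(*ref)
    return (d + 180) % 360 - 180

# A blue patch drifting toward magenta: a* grows while b* stays negative.
print(round(hue_shift((10, -50), (25, -45)), 1))  # 17.7
```

A saturation change moves the patch along the same hue line; it is the hue-angle change that registers as the unwelcome blue-to-magenta shift.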

Mark D Segal
« Reply #15 on: March 28, 2010, 11:09:56 AM »
Quote from: bjanes
As you point out, the issue of defaults vs optimal settings is problematic. For example, if you accept ACR's defaults, the dynamic range of your camera will be severely limited, since the default black point clips the shadows rather severely. Here is what I observed with the Nikon D3 with the default tone curve and the same curve except for black = 0.

The hue shifts in blue to magenta that you observed are rather dramatic. With the D3 and the Adobe Standard profile, they are less marked. Is this the camera or the profile and tone curve? Using a custom DNG profile and setting the tone curve to linear results in more accurate color, but the low saturation might not be pleasing. As you point out, many photographers like increased saturation, but hue shifts are generally not welcome.

Well, Bill, here you are getting at exactly the kind of stuff I have in mind when discussing test objectives and methodology. I guess the first thing to be clear about is objectives - what is being tested for: is it which software makes more pleasing results out of the box? Is it which software makes more accurate colour and less noisy, higher resolution renditions? If the latter, where is the best place to start in terms of initial settings? I would have thought linear and zero settings would be more appropriate, and what you are saying/illustrating here seems to confirm that.

Now where you say "if you accept ACR's defaults, the dynamic range of your camera will be severely limited"... not sure this is the correct way of putting it. Seems to me the dynamic range of the camera depends on the camera's hardware, not on anything you do in ACR. The default tone curve only limits the rendition of the image in ACR, not what comes from the camera; that data remains in the raw file for users to adjust so that wanted shadow detail doesn't get clipped.
teddillard
« Reply #16 on: March 28, 2010, 12:18:27 PM »
Quote from: bjanes
With the D3 and the Adobe Standard profile, they are less marked. Is this the camera or the profile and tone curve?

Honestly, I'm not sure. What I am sure of is that the files from almost every camera I've processed seem to have a slightly different look and feel, and I'd assume that relates back to the "profile" ACR is looking at. ("Profile" in this use is not an ICC profile, as you probably know... a misuse of terms by Adobe that I personally find somewhat perplexing and annoying.) That was, at any rate, the reasoning behind running the tests on three cameras of different brands and price points.

...and thanks, I'm glad you find it of interest.  (rotweiller, whatever...  )

Mark, that's a very good point- I will go back and see if we can firm up the language at the outset, explaining more about the "default" mode and why we're using that approach.  Thanks!
« Last Edit: March 28, 2010, 12:32:22 PM by teddillard »
bjanes
« Reply #17 on: March 28, 2010, 02:06:12 PM »
Quote from: Mark D Segal
Now where you say "if you accept ACR's defaults, the dynamic range of your camera will be severely limited"... not sure this is the correct way of putting it. Seems to me the dynamic range of the camera depends on the camera's hardware, not on anything you do in ACR. The default tone curve only limits the rendition of the image in ACR, not what comes from the camera; that data remains in the raw file for users to adjust so that wanted shadow detail doesn't get clipped.
Mark,

Thanks for the comment. I think you are correct and I should have stated the situation better. The DR is largely a property of the camera, but the raw converter does affect the DR of the rendered image. Since the photographic DR is affected by the noise floor in the shadows, the NR in the converter as well as the black point settings can affect the DR of the rendered image.
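As a back-of-envelope illustration of that point (the numbers below are hypothetical, not D3 measurements): photographic DR in stops is roughly log2(saturation / effective floor), so a converter default that raises the floor - a high black point, or noise reduction that changes the usable floor - changes the DR of the rendered image even though the sensor data in the raw file is unchanged.

```python
# Illustrative numbers only: DR in stops ~ log2(saturation / effective floor).
# A default black point that clips everything below it raises the effective
# floor of the *rendered* image, shaving stops off the bottom.
import math

def dr_stops(saturation, floor):
    return math.log2(saturation / floor)

sat = 16383          # 14-bit clipping level, in ADU
noise_floor = 4      # hypothetical read-noise-limited floor, in ADU
black_point = 64     # hypothetical converter default black clip, in ADU

print(round(dr_stops(sat, noise_floor), 1))  # ~12 stops, sensor-limited
print(round(dr_stops(sat, black_point), 1))  # ~8 stops after the black clip
```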

Regards,

Bill
Schewe
« Reply #18 on: March 28, 2010, 05:35:40 PM »
Quote from: teddillard
("Profile" in this use is not an ICC profile, as you probably know...  a misuse of terms by Adobe I personally find somewhat perplexing and annoying.)

Fact is, you are mistaken in using the term "Profile", since the proper term would be either "DNG Calibration Profile" or simply "camera color profile" - which can mean, well, pretty much anything that describes aspects of the thing being profiled...

So, if you don't like the term "profile" what would YOU suggest?
bjanes
« Reply #19 on: March 28, 2010, 06:00:09 PM »
Quote from: Schewe
Fact is, you are mistaken by using the term "Profile" since the proper term would be either a "DNG Calibration Profile" or simply a "camera color profile" which can mean, well, pretty much anything that describes the aspects of the thing being profiled...

So, if you don't like the term "profile" what would YOU suggest?

Lewis Carroll summed up the situation nearly 140 years ago:

"When I use a word," Humpty Dumpty said, in rather a scornful tone, "it means just what I choose it to mean--neither more nor less." "The question is," said Alice, "whether you can make words mean so many different things." "The question is," said Humpty Dumpty, "which is to be master--that's all."

Source: Lewis Carroll, Through the Looking-Glass, chapter 6, p. 205. First published in 1872.
« Last Edit: March 28, 2010, 06:01:29 PM by bjanes »