Author Topic: The D800  (Read 5980 times)
opgr (Sr. Member, Posts: 1125)
« Reply #40 on: August 14, 2012, 03:01:08 AM »

Quote
So what is it that you are saying? That image sensors are non-linear, but that it does not cause any problems?

No, my claim is that some image sensors are only partially linear, and engineers can opt to include the non-linear parts to expand dynamic range at the expense of color consistency.

But my claim in this thread is that the color rendition of the D800 does not exhibit anywhere near the quality that one might expect considering all of its specs and DR.


Regards,
Oscar Rysdyk
theimagingfactory
Rob C (Sr. Member, Posts: 12213)
« Reply #41 on: August 14, 2012, 03:47:16 AM »

Quote
So:

1) You cannot show a DSLR image without tonal compression, so tonal compression is always a major parameter
2) A web image has a wider DR than a print (a good monitor has a dynamic range of at least 1:400)
3) Prints have the least dynamic range

Presentation plays a major role. An image surrounded by white will have much less apparent DR (or tonal range) than an image surrounded by black.

Best regards
Erik




This rather supports my point about the largely pointless value of über cameras as alternatives to already very advanced camera equipment. I simply believe that people are being seduced into the 'new' for no better reason than that it's new.

Of course I have no reason to question their choice to do that; I simply don't buy that they are achieving anything that way.

There are so many other factors at play in photography that the actual machine is seldom the greatest one unless you have specialised in something esoteric. And I don't think making digital prints enters into that definition.

You guys really should read the Ansel book of letters; so much here is similar to what he wrote about... Soul, soul, effing soul!

Rob C

hjulenissen (Sr. Member, Posts: 1675)
« Reply #42 on: August 14, 2012, 03:48:23 AM »

Quote
No, my claim is that some image sensors are only partially linear, and engineers can opt to include the non-linear parts to expand dynamic range at the expense of color consistency.

Perhaps. Perhaps not. All that I have seen seems to indicate very good linearity. If you know something else, it would be very interesting to see your sources.
http://translate.google.com/translate?u=http://www.guillermoluijk.com/article/anavsdig/index.htm&langpair=es%7Cen&hl=EN&ie=UTF-8

"This chart is quite convincing for verifying sensor linearity. We see how it responds linearly to the input stimulus, without the asymptotic behaviour near saturation that film chemistry shows, and it does so from a very low luminosity (approx. 9 stops below the sensor's saturation) right up to the point of saturation."

I suspect that Canon's "HTP" mode is simply underexposure to preserve highlights.
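
Incidentally, the linearity check behind that chart is easy to reproduce yourself: shoot a static, evenly lit patch at a series of shutter speeds, read the mean raw value of the patch in each frame, and fit a straight line. Below is a minimal sketch of the idea; the rawpy package, the patch coordinates and the file names are my own assumptions, not anything from Guillermo's article.

Code:
# Sketch of a raw linearity check: mean raw value of a grey patch vs. relative exposure.
# Assumes a bracketed series of raw files of the same static scene (file names are hypothetical)
# and the third-party 'rawpy' and 'numpy' packages.
import numpy as np
import rawpy

exposures = [1/1000, 1/500, 1/250, 1/125, 1/60]    # shutter speeds of the bracket
files = ["bracket_%d.NEF" % i for i in range(5)]   # hypothetical file names

means = []
for fname in files:
    with rawpy.imread(fname) as raw:
        data = raw.raw_image_visible.astype(np.float64)
        black = np.mean(raw.black_level_per_channel)
        patch = data[1000:1100, 1500:1600] - black   # assumed central patch, black level subtracted
        means.append(patch.mean())

# Straight-line fit; R^2 very close to 1 means the raw response is linear over this range.
slope, intercept = np.polyfit(exposures, means, 1)
pred = np.polyval([slope, intercept], exposures)
r2 = 1 - np.sum((np.array(means) - pred) ** 2) / np.sum((np.array(means) - np.mean(means)) ** 2)
print("slope %.1f  intercept %.1f  R^2 %.5f" % (slope, intercept, r2))
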
Quote
But my claim in this thread is that the color rendition of the D800 does not exhibit anywhere near the quality that one might expect considering all of its specs and DR.

Isn't that a bit like claiming that CDs sound worse than vinyl based on a random, freshly purchased CD compared to a random vinyl album sitting on your shelf? Is "color rendition" something that can be objectively measured by someone like DxO, or is it only a subjectively detectable quality?

Without shooting the same scene, under similar/fair conditions, and processing with similar/fair settings, how can you draw visual conclusions about the qualities of a camera?

I know that these suggestions make some people furious, but I ask because I think it is very difficult (for me, at least) to visually inspect an image and, from that alone, conclude much about the properties of the camera used to generate it, as opposed to the skills of the Photoshop operator. Any digital image is made up of a finite amount of information (bits and bytes). Therefore, one would expect that an army of Photoshop monkeys, given enough time, could produce any image from any starting point. Slightly more skilled monkeys may be able to do the same in less time. I am not saying this extreme actually happens, but it goes to show that photoshopping can be an important component of finished images.

-h
« Last Edit: August 14, 2012, 03:56:32 AM by hjulenissen »
opgr (Sr. Member, Posts: 1125)
« Reply #43 on: August 14, 2012, 04:25:57 AM »

Quote
Perhaps. Perhaps not. All that I have seen seems to indicate very good linearity. If you know something else, it would be very interesting to see your sources.
http://translate.google.com/translate?u=http://www.guillermoluijk.com/article/anavsdig/index.htm&langpair=es%7Cen&hl=EN&ie=UTF-8

Guillermo is measuring RAW files, which (almost by definition, I'm inclined to add) are linear.

That doesn't measure the actual sensor response.


Quote
Isn't that a bit like claiming that CDs sound worse than vinyl based on a random, freshly purchased CD compared to a random vinyl album sitting on your shelf? Is "color rendition" something that can be objectively measured by someone like DxO, or is it only a subjectively detectable quality?

No, it's like claiming that we now have 24-bit 192 kHz uncompressed audio, but for some reason the original CD sounds better. And that you totally agree that this is odd and almost preposterous to claim, but that it is what your ears are telling you…

Can color rendition be measured? Well, DxO does have a color rendition metric, and from the samples I have seen so far I would think that the difference should show up in some metric, whether DxO's method or something else. It is obvious enough to be measurable, I am fairly certain.
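
A crude objective check is at least straightforward once you have a shot of a known target: convert the camera's rendering of each patch to Lab and compute the colour difference against the reference values. A minimal sketch using plain CIE76 delta E follows; the patch numbers are illustrative placeholders, and DxO's own metric is a different, more elaborate thing.

Code:
# Sketch: mean CIE76 delta E between camera-rendered and reference Lab values of a test chart.
# All patch values below are illustrative placeholders, not real measurements.
import numpy as np

reference_lab = np.array([[38.0, 13.5, 14.0],    # nominal 'dark skin' style patch
                          [65.7, 18.1, 17.8],    # nominal 'light skin' style patch
                          [49.9, -4.9, -21.9]])  # nominal 'blue sky' style patch
camera_lab    = np.array([[38.5, 12.9, 15.1],
                          [66.2, 17.5, 18.4],
                          [50.8, -6.1, -20.7]])

delta_e = np.sqrt(np.sum((camera_lab - reference_lab) ** 2, axis=1))  # CIE76: Euclidean distance in Lab
print("per-patch delta E:", np.round(delta_e, 2))
print("mean delta E: %.2f" % delta_e.mean())
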

Quote
Without shooting the same scene, under similar/fair conditions, and processing with similar/fair settings, how can you draw visual conclusions about the qualities of a camera?

Mark has provided us with such an example over in the MF forum.


Quote
Slightly more skilled monkeys may be able to do the same in less time. I am not saying this extreme actually happens, but it goes to show that photoshopping can be an important component of finished images.

Errr, yes, as does panorama software adding another layer of manipulation. But I would like to immediately add that some color manipulations are near impossible to correct, specifically those that deal with non-linear primaries.

And some errors are simply exaggerated when layered: e.g. JPEG blockiness only gets worse when compression is applied twice, never less...


Regards,
Oscar Rysdyk
theimagingfactory
hjulenissen (Sr. Member, Posts: 1675)
« Reply #44 on: August 14, 2012, 05:32:37 AM »

Quote
Guillermo is measuring RAW files, which (almost by definition, I'm inclined to add) are linear.

That doesn't measure the actual sensor response.

But you have? Or can you point to someone who has? Or are you just speculating?

-h
hjulenissen (Sr. Member, Posts: 1675)
« Reply #45 on: August 14, 2012, 05:44:56 AM »

Quote
No, it's like claiming that we now have 24-bit 192 kHz uncompressed audio, but for some reason the original CD sounds better. And that you totally agree that this is odd and almost preposterous to claim, but that it is what your ears are telling you…

You seem to have missed my point. Comparing a CD recording of Aretha Franklin and a Blu-ray Audio recording of Britney Spears may be interesting, but it allows us to draw very limited conclusions about the merits of each _format_. You are more likely to be discovering your own music preferences.

You need to compare the same, subjectively "good" recording, fairly mastered for both formats. That is a point that many audiophiles forget, and many photographers too.
Quote
Errr, yes, as does panorama software adding another layer of manipulation. But I would like to immediately add that some color manipulations are near impossible to correct, specifically those that deal with non-linear primaries.
...

If the raw file appears to be linear in all possible ways, then there is no non-linearity to correct. If non-linear raw files do occur, I challenge you to find an example.

I am assuming that exposure is chosen such that sensor saturation and the noise floor do not affect the final image. If you capture a scene of effectively 14 stops of DR using a camera of effectively 10 stops of DR, parts of the image will have significant errors that a better camera might have avoided.

-h
« Last Edit: August 14, 2012, 05:48:24 AM by hjulenissen »
opgr (Sr. Member, Posts: 1125)
« Reply #46 on: August 14, 2012, 05:47:12 AM »

Quote
But you have? Or can you point to someone who has? Or are you just speculating?

No, I haven't. I am merely paraphrasing some of the more knowledgeable people on this forum, having looked up some information on the internet once in a while, and combining that with what little knowledge I have of electronics and physics; from that I come to the conclusion that there is no relation between what is in the RAW file and what comes from the sensor.

This was relevant to me when I tried to define a correct color-calibration strategy for RAW conversion, something I do feel comfortable and knowledgeable enough to fuss about. The main question: can RAW data be calibrated with a simple linear RGB matrix profile, or does it require full 3D table-based corrections? The answer obviously depends entirely on the linearity of the sensor response, and very much NOT on the linearity of the RAW data in the file...
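
To make the distinction concrete: the matrix option amounts to mapping white-balanced, linear camera RGB to XYZ with a single 3x3 matrix, and that can only fit well if the underlying response really is linear. A small sketch with an invented matrix (not any camera's actual profile):

Code:
# Sketch of the 'simple linear matrix' calibration option, applied to white-balanced linear raw RGB.
# The 3x3 matrix is invented for illustration; real profiles are fitted from measured chart patches.
import numpy as np

cam_to_xyz = np.array([[0.61, 0.25, 0.10],
                       [0.28, 0.70, 0.02],
                       [0.02, 0.12, 0.81]])   # hypothetical camera-RGB -> XYZ matrix

def matrix_calibrate(raw_rgb):
    """raw_rgb: Nx3 array of linear, white-balanced camera RGB in [0, 1]."""
    return raw_rgb @ cam_to_xyz.T

patches = np.array([[0.18, 0.18, 0.18],    # mid grey
                    [0.40, 0.20, 0.10]])   # a warm tone
print(matrix_calibrate(patches))

A full 3D table-based profile replaces that single matrix with a lookup plus interpolation; it can absorb non-linearity, but at the cost of smoothness.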

Anyway, here is some random source I just found, though I immediately want to add that the curve is likely simplified for illustrative purposes:
Random source

I recall searching in the past for Sony sensor spec sheets and finding them, but currently I only find spectral response charts. Apparently that has become more pertinent to current designers, if the Google algorithms are anything to go by.



Regards,
Oscar Rysdyk
theimagingfactory
hjulenissen (Sr. Member, Posts: 1675)
« Reply #47 on: August 14, 2012, 05:54:43 AM »

Quote
No, I haven't. I am merely paraphrasing some of the more knowledgeable people on this forum, having looked up some information on the internet once in a while, and combining that with what little knowledge I have of electronics and physics; from that I come to the conclusion that there is no relation between what is in the RAW file and what comes from the sensor.

I have only seen this idea presented once before on this forum. That forum user could not present any sources supporting his view, and the people whose knowledge I respect seemed to agree that he was wrong. That is why I suggest that you present actual sources.

Quote
Can RAW data be calibrated with a simple linear RGB matrix profile, or does it require full 3D table-based corrections?

It obviously _can_ be calibrated using anything. The question is, perhaps, what can be accomplished using the different techniques (and what your aims are).

I believe that camera primaries are a very crude approximation to the spectral sensitivities of typical human eyes. Two different colors might look identical to the camera sensor. You might want to prioritize some tones over others, you want a smooth model, and you have access to a limited set of (error-prone) color response samples. I would assume that image noise (both in the measurement and in later regular images) could be a reason to do non-linear calibration, even if the basic model of a linear, clipped and noisy sensor holds. Further, there might be offset values, biases, slightly variable scaling or other details in the digital number representation that serve to conceal the underlying linear model. Finally, Nikon are known (?) to use a non-linear representation in raw files as a means to save storage space. If you are operating directly on that data, you would need a non-linearity in the characterization (the same goes for JPEG gamma-encoded values).
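
On that last point, a known and invertible encoding curve is harmless for characterization: you simply map the stored values back to linear first. A sketch using a made-up square-root style curve, not Nikon's actual compression table:

Code:
# Sketch: undoing a known non-linear raw encoding before any linear (matrix) characterization.
# The encoding here is an invented square-root law, standing in for a real compression curve.
import numpy as np

FULL_SCALE = 16383.0          # 14-bit full scale, for illustration

def encode(linear_dn):
    """Hypothetical space-saving encoding: square-root companding to ~10-bit values."""
    return np.sqrt(linear_dn / FULL_SCALE) * 1023.0

def decode(stored):
    """Inverse of encode(): back to linear digital numbers."""
    return (stored / 1023.0) ** 2 * FULL_SCALE

linear = np.array([128.0, 1024.0, 8192.0, 16000.0])
print(np.allclose(linear, decode(encode(linear))))   # True: the non-linearity is fully reversible
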

Iliah Borg on the argyllCMS mailing list:
Quote
"My matrix-based 5D3 profiles are OK. I can confirm that some few tints of yellow do have wrong renditions due to the filters used in the camera, but those are in a very narrow range of Lab. LUT profiles attempting to accommodate for these variations usually make things worse sacrificing profile smoothness."

Quote
The answer obviously depends entirely on the linearity of the sensor response, and very much NOT on the linearity of the RAW data in the file...

Sorry, I fail to see that. The calibration will be used to refer raw data back to some known reference? Whatever happens in between is an irrelevant black box as seen from the calibration procedure. When I calibrate my screen it really does not matter whether the gamma response is a physical attribute of the display or only a driver/firmware software thing. What matters is: how does the physical panel respond if I input a pixel valued, e.g., [16 22 21]?

If camera sensors are significantly non-linear, one would expect to observe this in the raw file; strange behaviour is to be expected close to the top and bottom of the sensing range. If one does not observe it in the raw file, then either the camera manufacturer was able to compensate the non-linearity perfectly (in which case sensor non-linearity is nothing to worry about), or the sensor really is linear (in which case sensor non-linearity is nothing to worry about).

-h
« Last Edit: August 14, 2012, 06:25:18 AM by hjulenissen »
ErikKaffehr (Sr. Member, Posts: 7308)
« Reply #48 on: August 14, 2012, 08:47:07 AM »

Hi,

The sensor itself counts captured electrons, that is, charge collected on a capacitor. I'd say that is pretty linear. That charge is read as a voltage, which in all probability needs preamplification. The preamps are probably linear, but even if they were not, the output signal can be linearised. As long as the data numbers in the raw file are proportional to the number of collected electrons, the device is linear.

The data undergo significant processing before they are converted into a device-dependent rendition.
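
A simple way to test that proportionality without trusting the file format at all is a mean-variance (photon transfer) check: for a linear, shot-noise-limited chain the variance of a uniform patch grows in proportion to its mean, and the slope of that line is the gain in DN per electron. Here is a rough sketch of the check, with synthetic frames standing in for real flat-field captures and invented numbers throughout.

Code:
# Sketch of a photon-transfer check: for a linear, shot-noise-limited sensor the variance of a
# uniform patch is proportional to its mean, with slope = gain (DN per electron).
# Synthetic frames are used here in place of real flat-field raw captures.
import numpy as np

rng = np.random.default_rng(0)
gain = 0.4                               # assumed DN per electron, for the simulation only
exposures_e = [500, 2000, 8000, 20000]   # mean collected electrons per pixel

means, variances = [], []
for n_e in exposures_e:
    frame = gain * rng.poisson(n_e, size=(200, 200))   # Poisson shot noise, then gain
    means.append(frame.mean())
    variances.append(frame.var())

slope, _ = np.polyfit(means, variances, 1)
print("fitted slope (gain estimate): %.3f DN/e-" % slope)
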

Best regards
Erik


Quote
But you have? Or can you point to someone who has? Or are you just speculating?

-h

RSL (Sr. Member, Posts: 6183)
« Reply #49 on: August 14, 2012, 09:48:20 AM »

This thread demonstrates, as if it needed more demonstration, that talking about equipment is a losing proposition.

Slobodan Blagojevic (Sr. Member, Posts: 5654)
« Reply #50 on: August 14, 2012, 09:53:28 AM »

Quote
This thread demonstrates, as if it needed more demonstration, that talking about equipment is a losing proposition.

And it demonstrates equally well how easily even quite serious people fall for trolls.

Slobodan

Flickr
500px
ErikKaffehr (Sr. Member, Posts: 7308)
« Reply #51 on: August 14, 2012, 11:39:12 AM »

Yes,

You are right. Sorry for derailing the discussion.

Regarding the original topic, my impression is that with the D800 Nikon introduced a camera many were waiting for, better in many respects than the D700 and much more affordable than the D3X.

Best regards
Erik


Quote
This thread demonstrates, as if it needed more demonstration, that talking about equipment is a losing proposition.

Ray (Sr. Member, Posts: 8874)
« Reply #52 on: August 15, 2012, 12:43:20 AM »

Nevertheless, there does appear to be an interesting paradox in the DXO measurements that I sometimes notice when comparing the graphs.

Oscar mentioned that the Canon 5D was considered to be ahead of its time regarding tonal range and color rendition.

Just out of curiosity I compared the measurements, at the pixel level, for the 5D and the D800 on the DXOMark website. I was surprised to find that the 5D pixel, at almost 3x the size of the D800 pixel, has about equal performance with regard to SNR at 18%, Tonal Range and Color Sensitivity. This is a good indication of the technological progress that has been made since the introduction of the 5D, the fact that a pixel almost 1/3rd the size can have the same performance with regard to noise and color sensitivity.

However, here is what I see as a paradox. The D800 pixel has almost 2.5 stops greater Dynamic Range than the 5D pixel, yet has no greater tonal range. How come?

If the Tonal Range measurement describes the number of distinct tones the sensor can record over its entire dynamic range from the deepest shadows to the brightest highlights, then one might expect the D800 pixel to also have a greater tonal range than the 5D pixel, if it has a greater DR. Those 2.5 stops in the deepest shadows also have a tonal range, surely.

I can only conclude, therefore, that the 5D pixel must have a greater tonal range within the usable DR that the pixels from both cameras have in common, that is, the range from the deep or low midtones to the brightest highlights. The tonal range within those darkest 2.5 stops that the D800 can record, but the 5D cannot, must come at the expense of the tonal range above those lowest 2.5 stops.

That's how I see it, but I might be wrong. My understanding of such technical matters is quite limited.
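
One way to sanity-check the idea: if tonal range is essentially a count of noise-distinguishable levels, then the number of distinguishable tones in each stop scales with the signal-to-noise ratio there, and the deepest stops contribute almost nothing to the total. Here is a rough sketch using an idealised shot-noise-plus-read-noise model; the numbers are invented and are not the 5D's or the D800's.

Code:
# Rough sketch: counting noise-distinguishable tones per stop for an idealised sensor
# (shot noise + read noise). All numbers are invented for illustration.
import numpy as np

full_well = 60000.0     # electrons at saturation (hypothetical)
read_noise = 5.0        # electrons RMS (hypothetical)

def distinguishable_tones(lo, hi, steps=1000):
    """Approximate count of levels separated by one noise standard deviation in [lo, hi]."""
    signal = np.linspace(lo, hi, steps)
    noise = np.sqrt(signal + read_noise ** 2)   # shot noise plus read noise, in electrons
    return np.trapz(1.0 / noise, signal)

total = 0.0
hi = full_well
for stop in range(14):                  # walk down 14 stops from saturation
    lo = hi / 2.0
    tones = distinguishable_tones(lo, hi)
    print("stop %2d below clipping: ~%6.1f tones" % (stop, tones))
    total += tones
    hi = lo
print("total: ~%.0f distinguishable tones" % total)

With numbers like that, the top few stops dominate the tonal-range figure, so a couple of extra usable stops in the deep shadows add a lot of DR but hardly move the tone count.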