Author Topic: 3 Feb, 2009 - Eyes vs. Numbers - Which to Believe  (Read 26491 times)
JRSmit
Sr. Member
****
Offline

Posts: 417
« Reply #40 on: February 06, 2009, 03:41:13 AM »

Quote from: BernardLanguillier
Yet, it remains to be shown how exactly their results differ from what one sees...

Other than the fact that a P65+ A2 print looks nicer than the same print from an A900/D3x, I have not read any convincing example from Michael to explain what exactly he means by this supposed gap between perception and numbers.

If anything, he commented in a recent thread that most backs and DSLRs have the same 12 stop DR in real world applications, which is basically exactly what DxO is saying also.

Is the disconnect real?

Cheers,
Bernard

In the past I was heavily involved in the hi-fi audio world, both professionally and personally, as a consumer and as a lead engineer designing and producing loudspeakers and amplifiers. I see a strong parallel with the high-end imaging world, and it frightens me. As John Atkinson, the well-known and highly regarded editor of Stereophile, once put it: after his many, many years of testing and listening to hi-fi equipment, he was still unable to find a consistent correlation between measurements and listening results.
Like Michael said several postings up, his interest is in the baked result, less in the ingredients. I once designed a phono preamplifier; during design I had two different circuit topologies built from the same individual components. Both topologies measured identically but sounded quite different. The better one was declared at a Festival du Son in Paris to be one of the top three in the world. So to take the baking analogy a bit further: the ingredients do matter, but how they are combined is fundamental to the end result; that is the recipe, including all its parameters.
What frightens me is that electronics makes it possible to make the measured result look good simply by fiddling around inside the "black box". Sound engineering principles unfortunately allow for that.


Regards,

Jan R.
Logged

Fine art photography: janrsmit.com
Fine Art Printing Specialist: www.fineartprintingspecialist.nl


Jan R. Smit
JRSmit
Sr. Member
****
Offline

Posts: 417
« Reply #41 on: February 06, 2009, 03:55:32 AM »

Perhaps the problem lies in the measurement principles. Yes, the extremes are black and white, and yes, the tone curve from black to white matters. But what about the smallest change in luminance, or in color, that can be faithfully reproduced? What about how a system's performance in reproducing line pairs should degrade versus how a given system actually degrades? I have a Nikon Coolscan LS50-ED that had a ghosting problem. The measurements by Nikon Professional Services did not show the problem, yet the scanned pictures did. To cut a long story short: after removal of the protective window on the sensor and treatment of the chamber between the optics and the sensor with a very strongly light-absorbent material, not only was the ghosting gone, the scan results now show more detail and look more "alive". Yet the black is still black, the white is still white, and the curves in the calibration profile of the same calibration slide vary little, if at all. So in between the extremes something has changed for the better, as if a gray veil has lifted (or, in hi-fi audio terminology: clarity improved, micro-contrast reproduced more faithfully, and so on), but the measurements do not show it.

Regards,


Jan
Logged

Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #42 on: February 06, 2009, 06:09:00 AM »

Quote from: JRSmit
Perhaps the problem lies in the measurement principles. Yes, the extremes are black and white, and yes, the tone curve from black to white matters. But what about the smallest change in luminance, or in color, that can be faithfully reproduced? What about how a system's performance in reproducing line pairs should degrade versus how a given system actually degrades? I have a Nikon Coolscan LS50-ED that had a ghosting problem. The measurements by Nikon Professional Services did not show the problem, yet the scanned pictures did. To cut a long story short: after removal of the protective window on the sensor and treatment of the chamber between the optics and the sensor with a very strongly light-absorbent material, not only was the ghosting gone, the scan results now show more detail and look more "alive". Yet the black is still black, the white is still white, and the curves in the calibration profile of the same calibration slide vary little, if at all. So in between the extremes something has changed for the better, as if a gray veil has lifted (or, in hi-fi audio terminology: clarity improved, micro-contrast reproduced more faithfully, and so on), but the measurements do not show it.

You might have a point there, somewhere. If there's evidence out there that the DxO analytical method is failing to take account of certain significant, observable differences in image quality, differences that are not reflected in the DxO measurements, then the DxO team would surely be the first to be concerned. This is their business.

But so far, in all the heat of the discussions on this issue in various threads, I've never seen a single comparison between any two cameras which contradicts the DxO results.

Michael produced a comparison between the P45+ and the Canon Powershot G10 P&S, in which he demonstrated that at an A3+ size (13"x19") there was no discernible difference between the images from the two cameras, apart from the shallower DoF of the P45+.

However, the DxO results in respect of noise, dynamic range, tonal range and color sensitivity, at a normalised print size of 8"x12", indicate there is a substantial difference between these two cameras. Is this a contradiction or a disconnect?

By substantial, I mean 4 stops greater SNR, for example. At base ISO, the P45+ has over 4 stops (12.2 dB) higher SNR than the Powershot G10 at an 8"x12" print size, and 3 stops greater dynamic range. Is there anyone who would dispute this? Would proud owners of a P45+ back like to stick up their hands and declare that these figures are nonsense?
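As a sanity check on those numbers: for a shot-noise-limited sensor, each additional stop of captured light improves SNR by 10·log10(2) ≈ 3.01 dB, so the quoted 12.2 dB gap works out to almost exactly 4 stops. A minimal sketch of the arithmetic:

```python
import math

DB_PER_STOP = 10 * math.log10(2)  # ~3.01 dB of SNR per stop, shot-noise-limited

def stops_from_db(snr_gap_db):
    """Convert an SNR gap in dB into equivalent stops of light."""
    return snr_gap_db / DB_PER_STOP

print(round(stops_from_db(12.2), 2))  # → 4.05 stops
```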

If you do want to stick up your hand, would you say that the SNR of the P45+ should be more than 4 stops greater, at base ISO, and the DR more than 3 stops greater, than the G10?

If you do think that this is the case, that DXO are underestimating the SNR, DR, Tonal Range and Color Sensitivity of the P45+, how do you explain that a group of experienced photographers were unable to distinguish between Michael's A3+ size prints from the G10 and P45+?

Do you think it might have something to do with the characteristics of the subject that was photographed?
Logged
JRSmit
Sr. Member
****
Offline

Posts: 417
« Reply #43 on: February 06, 2009, 08:13:41 AM »

The objects photographed could well have something to do with the observation Michael made.

Also, DxO measures at the "raw point" in the total production-reproduction chain, whereas Michael measured at the "end point" of the chain: the interpretation of what is visually observed.
Those are two totally different points in the total chain, which makes comparison difficult if not impossible.

Looking at individual parameters, and again comparing to hi-fi audio: SNR as an absolute number has little meaning on its own, unless the noise is not noticed at all, i.e. well under the human perception threshold (unfortunately, to my knowledge we neither know this threshold nor can we achieve it).
But once noise is noticed, how it is perceived depends heavily on two main parameters: its profile (how it "sounds") and whether or not it is somehow modulated or influenced by the signal we are interested in.
If it has a smooth, clean profile and is totally independent of the signal, the human processing system filters it out as a constant, effectively reducing the perceived noise level. Several observations of this phenomenon pointed towards 10 dB or more.
Whether this also holds for what we see (recorded by our eyes, processed by our brains into something we recognise) I do not know, but I assume it does: it is the same brain, so it is plausible that the same basic principles apply.

It takes only one component in the total chain, however, to jeopardise the perceived end result; the noise behaviour just covered is one example.

This does not imply that I am saying Michael's reproduction chain is somehow flawed; I would be the last person to make such a statement.

What I am saying is that we need to take the total chain into account, and not just one component in isolation.

At the end of the day we only see, and aim to enjoy, what comes out at the end; the intermediate results are only there to serve the end result.


Regards Jan R.


Logged

Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #44 on: February 06, 2009, 09:04:31 AM »

Quote from: JRSmit
The objects photographed could well have something to do with the observation Michael made.

Also, DxO measures at the "raw point" in the total production-reproduction chain, whereas Michael measured at the "end point" of the chain: the interpretation of what is visually observed.
Those are two totally different points in the total chain, which makes comparison difficult if not impossible.

Looking at individual parameters, and again comparing to hi-fi audio: SNR as an absolute number has little meaning on its own, unless the noise is not noticed at all, i.e. well under the human perception threshold (unfortunately, to my knowledge we neither know this threshold nor can we achieve it).
But once noise is noticed, how it is perceived depends heavily on two main parameters: its profile (how it "sounds") and whether or not it is somehow modulated or influenced by the signal we are interested in.
If it has a smooth, clean profile and is totally independent of the signal, the human processing system filters it out as a constant, effectively reducing the perceived noise level. Several observations of this phenomenon pointed towards 10 dB or more.
Whether this also holds for what we see (recorded by our eyes, processed by our brains into something we recognise) I do not know, but I assume it does: it is the same brain, so it is plausible that the same basic principles apply.

It takes only one component in the total chain, however, to jeopardise the perceived end result; the noise behaviour just covered is one example.

This does not imply that I am saying Michael's reproduction chain is somehow flawed; I would be the last person to make such a statement.

What I am saying is that we need to take the total chain into account, and not just one component in isolation.

At the end of the day we only see, and aim to enjoy, what comes out at the end; the intermediate results are only there to serve the end result.


Regards Jan R.

The point I'm trying to make is as follows. Differences in image quality can be lost because the subject photographed is not suitable to display such subtle differences. That's not a fault of the DXO testing methodology.

What would throw suspicion on the DXO results is a comparison between, say, a D3X and a P45+ at an A2 size, which showed an observable, smoother tonality and lower noise in any part of the tonal range. So far, no such comparison has been shown, to my knowledge, so all the protestations and bleating are just hot air.

If someone does dare to show such a comparison, they'd better make sure that both images are of the identical subject with identical lighting, and that both images have an equally full exposure (ETTR) and equally thorough treatment at the conversion stage and in post-processing.


Logged
JRSmit
Sr. Member
****
Offline

Posts: 417
« Reply #45 on: February 06, 2009, 10:29:37 AM »

Quote from: Ray
The point I'm trying to make is as follows. Differences in image quality can be lost because the subject photographed is not suitable to display such subtle differences. That's not a fault of the DXO testing methodology.

What would throw suspicion on the DXO results is a comparison between, say, a D3X and a P45+ at an A2 size, which showed an observable, smoother tonality and lower noise in any part of the tonal range. So far, no such comparison has been shown, to my knowledge, so all the protestations and bleating are just hot air.

If someone does dare to show such a comparison, they'd better make sure that both images are of the identical subject with identical lighting, and that both images have an equally full exposure (ETTR) and equally thorough treatment at the conversion stage and in post-processing.

Ray,

Taking your criteria to separate hot air from substance, where does that put DxO?

Regards,


Jan R.
Logged

TomWalton
Newbie
*
Offline

Posts: 1


« Reply #46 on: February 06, 2009, 02:30:52 PM »

For the past 30 years, I've been professionally involved in designing signal processing systems for audio, sonar, radar, and telecommunications. I've also read way too many statements about the mystical ineffability, the 'magic', the unknowable connection between design/implementation and perceived performance. If your two pre-amps measured "identically" (presumably in frequency response) and performed differently at a level reproducibly perceptible to humans in blind trials, then you didn't do enough measurements. Perhaps it was phase or amplitude non-linearity, or transient ringing effects, etc., but if it's a gross enough effect for a human to distinguish, it has a measurable effect (in some parameter) on the signal waveform.

I'm not saying that different designs don't sound different, or that people shouldn't prefer one device's 'interpretation' over another.  People can be pretty good at 'different', but we're really terrible at 'better'.  Outside controlled blind trials, we're not even very good at reliably detecting differences among similarly performing gadgets.  The shameful/successful marketing of $1000 per foot pure crystalline-aligned unobtanium-alloy speaker cables shows how effective the marketing guys are at creating distinctions without differences in the minds of vulnerable, enthusiastic (well-heeled) hobbyists.  

Much of what is sold as high-end audio today is quantum snake oil. The tube amplifiers that are so popular today among the 'golden ears' MR refers to sound good in their judgment, but they're (by design) not high-fidelity, if hi-fi is taken to mean faithful reproduction of the audio signal. Tubes are well known for their compressive amplitude transfer characteristics, which in a nice Class A configuration (with little feedback to suppress the harmonic distortion) will produce the sweet 'syrupy' harmonics (distortion) that the golden ears have declared to be High Fidelity (it's especially sweetening to acoustic instruments and human voices; I even like it). Their preference for low-power final amplifiers (or overdriven pre-amp stages) is to ensure that the tubes are frequently driven into their compressive performance regime. It's fine that they like it, and that they are willing to pay for it. Defending their 'refined' listening preferences by saying there's no correlation between measurements and quality, though, is a rationalization of their preference for harmonically sweetened playback. Hi-fi it ain't. (I'm sure there's a Kodachrome/Velvia analogy in here somewhere; look at all the pretty colors.)
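Tom's point that compression must show up in a measurement can be illustrated numerically. The sketch below pushes a pure sine through tanh(), used here as a generic symmetric soft-clipper standing in for a compressive gain stage (an illustrative assumption; real single-ended tube stages are asymmetric and also add even harmonics), and projects the output onto each harmonic. The distortion is plainly measurable:

```python
import math

N = 4096       # samples per period
DRIVE = 2.0    # how hard the sine is pushed into the soft-clipper

def harmonic_amplitude(k):
    """Fourier (sine) projection of tanh(DRIVE*sin t) onto the k-th harmonic."""
    acc = sum(math.tanh(DRIVE * math.sin(2 * math.pi * n / N)) *
              math.sin(2 * math.pi * k * n / N) for n in range(N))
    return abs(2 * acc / N)

fund = harmonic_amplitude(1)
for k in (2, 3, 5):
    # Symmetric clipping: even harmonics come out ~0, odd harmonics clearly nonzero.
    print(k, round(harmonic_amplitude(k) / fund, 4))
```

The harder the drive, the larger the odd-harmonic fractions grow, which is exactly the kind of "sweetening" a distortion analyser would flag even when the frequency response looks flat.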

So, while I'm puzzled, along with many others, by the DxO 'resolution independent' quality measures, I can't subscribe to the 'numbers don't matter' school. Badly formulated or mis-applied metrics are common. More common is misinterpretation of technical measurements by non-specialists. Most common by far, sadly, is over-reliance on, and endless yammering over, the importance of this or that technical parameter in what is, outside of medical and forensic imaging, an artistic endeavor more than a technical one. The numbers, properly formulated and interpreted, can characterize the performance of the instrument, but they can't predict the artistic quality of the product.

Regards,

--Tom


Quote from: JRSmit
In the past I was heavily involved in the hi-fi audio world, both professionally and personally, as a consumer and as a lead engineer designing and producing loudspeakers and amplifiers. I see a strong parallel with the high-end imaging world, and it frightens me. As John Atkinson, the well-known and highly regarded editor of Stereophile, once put it: after his many, many years of testing and listening to hi-fi equipment, he was still unable to find a consistent correlation between measurements and listening results.
Like Michael said several postings up, his interest is in the baked result, less in the ingredients. I once designed a phono preamplifier; during design I had two different circuit topologies built from the same individual components. Both topologies measured identically but sounded quite different. The better one was declared at a Festival du Son in Paris to be one of the top three in the world. So to take the baking analogy a bit further: the ingredients do matter, but how they are combined is fundamental to the end result; that is the recipe, including all its parameters.
What frightens me is that electronics makes it possible to make the measured result look good simply by fiddling around inside the "black box". Sound engineering principles unfortunately allow for that.


Regards,

Jan R.
Logged
michael
Administrator
Sr. Member
*****
Offline

Posts: 4925



« Reply #47 on: February 06, 2009, 04:31:02 PM »

It's not the critical "exactly the same" test that some people want, but I now have about 15 prints from my recent shoot in Antarctica hanging at my gallery. Within the next week I should have about 30. In mid-March I'll have an open house and show. In the meantime if anyone wants to drop by my Toronto gallery when I'm there I'd be happy to show them to you.

What do they show? Most are in the 20"x28" size range, matted to 28"x34". Some were shot with the Sony A900 and some with the Phase One P65+. Can one see a difference at this size? Yes, absolutely, if you know what to look for, and it's not just about resolution. But to anyone except a technically knowledgeable observer these differences don't jump out. They all look pretty terrific.

There are no simple answers.

Michael
« Last Edit: February 06, 2009, 04:32:19 PM by michael » Logged
Nick Rains
Sr. Member
****
Offline

Posts: 704
« Reply #48 on: February 06, 2009, 05:10:13 PM »

Quote from: michael
It's not the critical "exactly the same" test that some people want, but I now have about 15 prints from my recent shoot in Antarctica hanging at my gallery. Within the next week I should have about 30. In mid-March I'll have an open house and show. In the meantime if anyone wants to drop by my Toronto gallery when I'm there I'd be happy to show them to you.

What do they show? Most are in the 20"x28" size range, matted to 28"x34". Some were shot with the Sony A900 and some with the Phase One P65+. Can one see a difference at this size? Yes, absolutely, if you know what to look for, and it's not just about resolution. But to anyone except a technically knowledgeable observer these differences don't jump out. They all look pretty terrific.

There are no simple answers.

Michael

I find that quite reassuring. My position is that any good digital camera with good lenses, good shooting technique and good printing methods will produce excellent results, almost indistinguishable from other cameras, as long as they stay within their native resolution.

At 28" the Sony is printing at about 200 dpi, which is roughly the lower limit of acceptable printing resolution according to current wisdom. The P65+ is well within its native resolution, of course, and I'm not surprised the prints look mostly as good as each other. It would only be when you went up a print size that the P65+ would come into its own. I'd absolutely guarantee that a 44" print would look better from the P65+ than from any other camera, with the possible exception of the P45+. It's just simple math really.
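Nick's "simple math" can be sketched directly. The pixel counts below are the commonly quoted long-edge dimensions for these cameras (approximate figures, used here only for illustration):

```python
# Approximate long-edge pixel counts (illustrative assumptions).
LONG_EDGE_PX = {"Sony A900": 6048, "Phase One P65+": 8984}

def print_ppi(camera, long_edge_inches):
    """Native pixels per inch when the long edge fills the given print width."""
    return LONG_EDGE_PX[camera] / long_edge_inches

for cam in LONG_EDGE_PX:
    print(cam, round(print_ppi(cam, 28)), round(print_ppi(cam, 44)))
# A900: ~216 ppi at 28", dropping to ~137 ppi at 44"
# P65+: ~321 ppi at 28", still ~204 ppi at 44"
```

At 44 inches the A900 falls well below the ~200 ppi floor Nick cites while the P65+ stays just above it, which is the whole argument in two divisions.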


Logged

Nick Rains
Australian Photographer
Leica Akademie Instructor
www.nickrains.com
Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #49 on: February 06, 2009, 06:33:58 PM »

Quote from: Nick Rains
At 28" the Sony is printing at about 200 dpi, which is roughly the lower limit of acceptable printing resolution according to current wisdom. The P65+ is well within its native resolution, of course, and I'm not surprised the prints look mostly as good as each other. It would only be when you went up a print size that the P65+ would come into its own. I'd absolutely guarantee that a 44" print would look better from the P65+ than from any other camera, with the possible exception of the P45+. It's just simple math really.


Yes, of course. The image from the sensor with the greater number of pixels will, at some point of enlargement, begin to show better resolution. We don't need DxO testing to tell us that.

The issue here is really about the other attributes of image quality: tonality, noise, dynamic range and so on. It is surprising that the flagship 35mm DSLR (the D3X), in respect of these other attributes, seems to have caught up with the older generation of digital backs, which have approximately double the sensor area. The P45+ has been around for a while now, and DxO have not tested the P65+.

It's surprising to me that the smaller sensor should do so well, because I recently compared my new 50D with my 3½-year-old 5D with respect to noise and tonality. The purpose of the testing was to find out whether the 50D had a shutter-speed/image-quality advantage flowing from its inherent ability to produce a greater DoF at a given f-stop (as a result of its smaller sensor). In other words, if in practice I can use 1/100th at f/4 and ISO 100 with the 50D, hand-held, then in order to get the same DoF at the same shutter speed I need to use approximately f/6.3 and ISO 350 with the 5D.

I wanted to see if the 5D image would be noisier in those circumstances.

In fact, it wasn't, even comparing pixel for pixel on the screen. Both images appeared to have about equal noise. However, comparing both cameras at the same ISO, it was clear that the 5D produced lower noise. The reason seems pretty clear: the 5D's sensor is over 2.5x the area of the 50D's. It simply gathers more light at the same ISO for any correctly exposed scene of equal FOV.

These results of my own testing, carried out before DxOMark existed, are quite consistent with the DxO graphs in respect of noise and tonal range. Despite the 5D being 3½-year-old technology, the DxO tests show that it actually does have better SNR and tonal range than the smaller-sensor 50D, both 'on screen' and at a normalised size of 8"x12".
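The 'on screen' versus normalised-print distinction Ray draws comes down to downsampling gain: printing every camera at the same size averages more sensor pixels into each output pixel for the higher-resolution sensor, and averaging n pixels improves SNR by √n. A sketch of that normalization, assuming a reference size of 8 MP (close to DxO's 8"x12" at 300 ppi; the exact reference is an assumption here):

```python
import math

REF_MP = 8.0  # assumed normalisation target, roughly 8"x12" at 300 ppi

def print_snr_db(per_pixel_snr_db, sensor_mp):
    """Per-pixel ('screen') SNR adjusted to a fixed print size.
    Downsampling averages sensor_mp/REF_MP pixels into one output pixel,
    improving SNR by sqrt of that ratio, i.e. +10*log10(sensor_mp/REF_MP) dB."""
    return per_pixel_snr_db + 10 * math.log10(sensor_mp / REF_MP)

# A 15.1 MP sensor (50D-class) gains more from downsampling than a 12.8 MP
# sensor (5D-class), so print-normalised rankings can shift relative to
# pixel-level ones:
print(round(print_snr_db(30.0, 15.1) - 30.0, 2))  # → 2.76 dB gain
print(round(print_snr_db(30.0, 12.8) - 30.0, 2))  # → 2.04 dB gain
```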

Logged
Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #50 on: February 06, 2009, 07:07:52 PM »

Quote from: TomWalton
For the past 30 years, I've been professionally involved in designing signal processing systems for audio, sonar, radar, and telecommunications. I've also read way too many statements about the mystical ineffability, the 'magic', the unknowable connection between design/implementation and perceived performance. If your two pre-amps measured "identically" (presumably in frequency response) and performed differently at a level reproducibly perceptible to humans in blind trials, then you didn't do enough measurements. Perhaps it was phase or amplitude non-linearity, or transient ringing effects, etc., but if it's a gross enough effect for a human to distinguish, it has a measurable effect (in some parameter) on the signal waveform.

Tom,
Although I'm not an engineer or technician, I tend to agree with your entire post. It makes complete sense to me and accords with my own experience evaluating hi-fi systems for my personal use.
Logged
ErikKaffehr
Sr. Member
****
Offline

Posts: 8031
« Reply #51 on: February 06, 2009, 08:35:31 PM »

Hi!

I'm not familiar with high-end audio, but I agree with the snake oil stuff. There are issues with measurements too. In many cases the measurements are too simple, like the frequency response or total harmonic distortion figures in audio. I guess that many MTF curves are calculated for each lens, but only a few are published in the popular tests. Most of the images we have are not really in focus, because of the essentially non-existent depth of field of today's digital technology (due to resolution), but also because of various failures of focusing technology. Even small focusing errors, on the order of microns in the image plane, affect MTF. So slightly out-of-focus imaging may matter more in real photography than optimum MTF.

Many aspects, like flare and ghosting, are not easy to measure objectively. In my humble opinion there are some views which are ignorant or stupid:

1) Trying to convert all data into a single figure of merit (SFOM).
2) Assuming, given a single figure of merit, that a product with a slightly higher SFOM is better than one with a slightly lower one.
3) The view that all measurements are scrap.

My opinion is more like:

- Something that measures badly is normally bad, at least if the measurement is relevant.
- There are things that are hard to measure which may have high subjective significance.

One issue I'd like to point out is that we are obsessed with sharpness and resolution. But any decent equipment today gives very good sharpness and resolution under optimum conditions. Small prints like 8x10 never get us close to the resolution limits of today's DSLRs, so if we see that an image from one system is better than another in small prints, we are probably not looking at resolution but at something else.
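Erik's point about small prints is easy to quantify: even at a demanding 300 ppi, an 8x10 needs only about 7 MP, which virtually every current DSLR exceeds. A quick sketch:

```python
def megapixels_needed(width_in, height_in, ppi=300):
    """Pixels required to print at the given ppi (300 ppi is a common
    'visually continuous tone' benchmark), expressed in megapixels."""
    return width_in * ppi * height_in * ppi / 1e6

print(megapixels_needed(8, 10))   # → 7.2
print(megapixels_needed(13, 19))  # A3+: → 22.23
```

Even A3+ (the size of Michael's G10 vs P45+ comparison) only demands roughly the native resolution of a 20+ MP body, so differences visible at small sizes are indeed likely to be about something other than resolution.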

Another issue is that the digital imaging chain is very flexible with regard to rendition. There are very few software solutions, if any, that really work with pristine, unmanipulated images, so we always compare the output of two highly configurable processing chains.

A summary of this may be: Relevant measurements are good, but need to be taken with a grain of salt.

Best regards
Erik




Quote from: TomWalton
For the past 30 years, I've been professionally involved in designing signal processing systems for audio, sonar, radar, and telecommunications. I've also read way too many statements about the mystical ineffability, the 'magic', the unknowable connection between design/implementation and perceived performance. If your two pre-amps measured "identically" (presumably in frequency response) and performed differently at a level reproducibly perceptible to humans in blind trials, then you didn't do enough measurements. Perhaps it was phase or amplitude non-linearity, or transient ringing effects, etc., but if it's a gross enough effect for a human to distinguish, it has a measurable effect (in some parameter) on the signal waveform.

I'm not saying that different designs don't sound different, or that people shouldn't prefer one device's 'interpretation' over another.  People can be pretty good at 'different', but we're really terrible at 'better'.  Outside controlled blind trials, we're not even very good at reliably detecting differences among similarly performing gadgets.  The shameful/successful marketing of $1000 per foot pure crystalline-aligned unobtanium-alloy speaker cables shows how effective the marketing guys are at creating distinctions without differences in the minds of vulnerable, enthusiastic (well-heeled) hobbyists.  

Much of what is sold as high-end audio today is quantum snake oil. The tube amplifiers that are so popular today among the 'golden ears' MR refers to sound good in their judgment, but they're (by design) not high-fidelity, if hi-fi is taken to mean faithful reproduction of the audio signal. Tubes are well known for their compressive amplitude transfer characteristics, which in a nice Class A configuration (with little feedback to suppress the harmonic distortion) will produce the sweet 'syrupy' harmonics (distortion) that the golden ears have declared to be High Fidelity (it's especially sweetening to acoustic instruments and human voices; I even like it). Their preference for low-power final amplifiers (or overdriven pre-amp stages) is to ensure that the tubes are frequently driven into their compressive performance regime. It's fine that they like it, and that they are willing to pay for it. Defending their 'refined' listening preferences by saying there's no correlation between measurements and quality, though, is a rationalization of their preference for harmonically sweetened playback. Hi-fi it ain't. (I'm sure there's a Kodachrome/Velvia analogy in here somewhere; look at all the pretty colors.)

So, while I'm puzzled, along with many others, by the DxO 'resolution independent' quality measures, I can't subscribe to the 'numbers don't matter' school. Badly formulated or mis-applied metrics are common. More common is misinterpretation of technical measurements by non-specialists. Most common by far, sadly, is over-reliance on, and endless yammering over, the importance of this or that technical parameter in what is, outside of medical and forensic imaging, an artistic endeavor more than a technical one. The numbers, properly formulated and interpreted, can characterize the performance of the instrument, but they can't predict the artistic quality of the product.

Regards,

--Tom
« Last Edit: February 06, 2009, 08:37:28 PM by ErikKaffehr » Logged

JRSmit
Sr. Member
****
Offline

Posts: 417
« Reply #52 on: February 07, 2009, 06:39:35 AM »

Tom,

Just for the record, the measurements included all measurements (bandwidth, harmonics, IM, step signal/ringing) typically used at that time, and typically used in marketing material and, to a limited extent, in reviews. FFT, for instance, was then limited to highly specialised R&D laboratories and was not generally used.
In my case the measurements were performed with calibrated equipment.
Nevertheless, I agree with you that it must be measurable. And there are reproducible measurements showing, for instance, that cables do have different characteristics; yes, there are audible differences, and yes, different is not always better.
Nor are so-called blind tests immune to error or disruption, so they are not of indisputable quality either; check the archives of HiFi News & Record Review, for instance.

I also agree that marketing and sales in the audio realm preferred mystique (or snake oil, or ...) over real, reproducible arguments; it simply made more money.

I do not agree that the 'distinctions without differences' are limited to hobbyists; after all, it is mostly educated professionals who design and produce the stuff. Just see if you can retrieve a biography of Andy Rappaport.

That is what concerns me in the photography realm: this historical pattern repeats itself.

Also, pure and simple logic says that the more sites a sensor has for a given scene, the more data about the scene is captured in the picture; and the bigger a site, all other parameters being equal, the better the SNR. The next thing is the ADC, where a lot can go wrong, and so on.
If a limitation in resolution is introduced anywhere in the total chain, some part of the signal is lost and cannot be restored in subsequent stages of the chain.
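As a rough sketch of that logic (all numbers invented, and real sensors are of course more complicated than this):

```python
import math

def shot_noise_snr(photons_per_um2, pixel_area_um2, read_noise_e=3.0, qe=0.5):
    """Estimate per-pixel SNR from photon shot noise plus read noise.

    Collected signal grows linearly with pixel area, while shot noise
    grows only with its square root, so bigger sites win on SNR.
    """
    signal = photons_per_um2 * pixel_area_um2 * qe   # electrons collected
    noise = math.sqrt(signal + read_noise_e ** 2)    # shot noise + read noise
    return signal / noise

# Same scene illumination, two hypothetical pixel sizes (areas in square microns)
small = shot_noise_snr(100, 25.0)   # roughly a 5 um pitch
large = shot_noise_snr(100, 49.0)   # roughly a 7 um pitch
print(f"5 um pixel SNR: {small:.1f}, 7 um pixel SNR: {large:.1f}")
```

The point is only the scaling, not the absolute figures: double the site area and the SNR rises by roughly the square root of two.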

So I agree with you about being puzzled by the DxO quality measures, and I agree that numbers do matter. But unless they are complete and proven to correlate with humanly perceived results, they tell only a portion of the total picture. Until then, just do not let them become the leading factor in assessing quality.

I like your phrase "artistic quality".

Regards,

Jan R.

Quote from: TomWalton
For the past 30 years, I've been professionally involved in designing signal processing systems for audio, sonar, radar, and telecommunications systems. I've also read way too many statements about the mystical ineffability, the 'magic', the unknowable connection between design/implementation and perceived performance. If your two pre-amps measured "identically" (presumably in frequency response) and performed differently at a level reproducibly perceptible to humans in blind trials, then you didn't do enough measurements. Perhaps it was phase or amplitude non-linearity, or transient ringing effects, etc., but if it's a gross enough effect for a human to distinguish, it has a measurable (in some parameter) effect on the signal waveform.

I'm not saying that different designs don't sound different, or that people shouldn't prefer one device's 'interpretation' over another. People can be pretty good at 'different', but we're really terrible at 'better'. Outside controlled blind trials, we're not even very good at reliably detecting differences among similarly performing gadgets. The shameful/successful marketing of $1000 per foot pure crystalline-aligned unobtanium-alloy speaker cables shows how effective the marketing guys are at creating distinctions without differences in the minds of vulnerable, enthusiastic (well-heeled) hobbyists.

Much of what is sold as high-end audio today is quantum snake oil. The tube amplifiers that are so popular today among the 'golden ears' MR refers to sound good in their judgment, but they're (by design) not high-fidelity, if hi-fi is taken to mean faithful reproduction of the audio signal. Tubes are well known for their compressive amplitude transfer characteristics, which in a nice Class A configuration (with little feedback to suppress the harmonic distortion) will produce the sweet 'syrupy' harmonics (distortion) that the golden ears have declared to be High Fidelity (it's especially sweetening to acoustic instruments and human voices. I even like it.). Their preference for low-power final amplifiers (or overdriven pre-amp stages) is to ensure that the tubes are frequently driven into their compressive performance regime. It's fine that they like it, and that they are willing to pay for it. Defending their 'refined' listening preferences by saying there's no correlation between measurements and quality, though, is a rationalization of their preference for harmonically sweetened playback. Hi-fi it ain't. (I'm sure there's a Kodachrome/Velvia analogy in here somewhere - look at all the pretty colors.)

So, while I'm puzzled, along with many others, by the DxO 'resolution independent' quality measures, I can't subscribe to the 'numbers don't matter' school. Badly formulated or misapplied metrics are common. More common is misinterpretation of technical measurements by non-specialists. Most common by far, sadly, is over-reliance on, and endless yammering over, the importance of this or that technical parameter in what is, outside of medical and forensic imaging, an artistic endeavor more than a technical one. The numbers, properly formulated and interpreted, can characterize the performance of the instrument, but can't predict the artistic quality of the product.

Regards,

--Tom
Logged

Fine art photography: janrsmit.com
Fine Art Printing Specialist: www.fineartprintingspecialist.nl


Jan R. Smit
JRSmit
Sr. Member
****
Offline

Posts: 417


WWW
« Reply #53 on: February 07, 2009, 06:45:07 AM »

Toronto is some distance from Amsterdam; it's hard to just drop by, especially with this "pond" in between ;-)

However, I plan to go to the USA in less than two weeks, and I will try to have the return flight rerouted through Toronto so I can pay you a visit.
I do hope it will be possible.
It would be in the period of 20-22 February; will you be there then?


Regards,


Jan R.


Quote from: michael
It's not the critical "exactly the same" test that some people want, but I now have about 15 prints from my recent shoot in Antarctica hanging at my gallery. Within the next week I should have about 30. In mid-March I'll have an open house and show. In the meantime if anyone wants to drop by my Toronto gallery when I'm there I'd be happy to show them to you.

What do they show? Most are in the 20" X 28" size range, matted to 28X34". Some are shot with the Sony A900 and some with the Phase One P65+. Can one see a difference at this size? Yes, absolutely if you know what to look for, and it's not just about resolution. But to anyone except a technically knowledgeable observer these differences don't jump out. They all look pretty terrific.

There are no simple answers.

Michael
Logged

Fine art photography: janrsmit.com
Fine Art Printing Specialist: www.fineartprintingspecialist.nl


Jan R. Smit
Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #54 on: February 07, 2009, 08:04:02 AM »


Quote
My opinion is more like:

- Something that measures bad is normally bad, at least if measurement is relevant
- There are things that are hard to measure which may have a high subjective significance

Erik,
That sounds reasonable, except that I'm very curious as to what it might be that has high subjective significance but is hard to measure.

Before anyone jumps in with the notion of 'artistic' merit, let me say that measurements of camera performance have nothing to do with the artistic merit of the photos that may be taken with the camera. This discussion has nothing to do with artistic quality. As Ansel Adams used to say, "There's nothing worse than a sharp image of a fuzzy concept".

Quote
One issue I'd like to point out is that we are obsessed with sharpness and resolution. But any decent equipment today gives very good sharpness and resolution under optimum conditions. Small prints like 8x10 never get us close to the resolution limits of today's DSLRs, so if we see that an image from one system is better than another in small prints, we are probably not discussing resolution but something else.

DxO doesn't address sharpness and resolution directly, and probably for good reason: those qualities vary so much with the quality of the lens used. However, I suspect sharpness is affected by a poor SNR. At base ISO, SNR is reasonably good for most cameras, so the resolution of the sensor relates very strongly to the pixel count, with slight variation according to the strength of the AA filter. At high ISO, though, where SNR can be either quite good or quite poor, I suspect that a low SNR would indicate poor resolution, always in relation to the pixel count of course. Is this not so?
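The hunch can be put in a toy model (the numbers here are invented, not measured from any camera): add more noise to a fine, low-contrast pattern and its detectability collapses, even though the pixel count is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# A fine line pattern near the pixel limit, with modest contrast
x = np.arange(512)
pattern = 5 * np.sin(2 * np.pi * x / 4)

scores = {}
for noise_sigma in (1, 20):  # "base ISO" vs "high ISO" noise levels (hypothetical)
    noisy = pattern + rng.normal(0, noise_sigma, x.size)
    # Normalized correlation with the known pattern: a crude
    # "is the detail still there?" score (1.0 = perfectly preserved)
    scores[noise_sigma] = np.dot(noisy, pattern) / (
        np.linalg.norm(noisy) * np.linalg.norm(pattern))
    print(f"noise sigma={noise_sigma:2d}  detail correlation={scores[noise_sigma]:.2f}")
```

At the low noise level the pattern survives almost intact; at the high level it is swamped, which is the sense in which poor SNR costs effective resolution.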

Quote
Another issue is that the digital imaging chain is very flexible regarding rendition. There are very few software solutions, if any, that really work with pristine, unmanipulated images, so we always compare the output of two highly configurable processing chains.

That's very true. But the RAW file is the digital negative. There may be many ways of improving the development of that RAW file as one's skills improve, and as the performance of new RAW converters improves over time. It therefore seems important when choosing a camera to get one which produces the best RAW files, with the lowest noise, the greatest tonality, the highest DR, etc., but not of course at the expense of all other considerations. That would be silly.

For example, the A900 has a very desirable feature in the form of an anti-shake sensor which works with all lenses fitted to the camera. Some people might quite reasonably take the view that the anti-shake sensor of the A900 is more of an asset than the better high-ISO performance of another camera which lacks one. Others might take the opposite view. That's fair enough. But in order to know what the high-ISO performance is in the first place, one needs the sort of objective measurements that DxO is providing.

Without such objective testing, we are left with endless subjective squabbling and total confusion.
Logged
JRSmit
Sr. Member
****
Offline

Posts: 417


WWW
« Reply #55 on: February 07, 2009, 09:11:21 AM »

Ray,

Regarding "the RAW file being the digital negative": perhaps more to the point, it is the "developed digital negative".
The sensor-site voltage values as captured compare best with the exposed silver-halide crystals in the emulsion.
I take it we all know from B&W development experience how developing methods and materials can influence the outcome.
DxO Labs measures the outcome of this "development", and how the "development" is done is proprietary to each company.

Anyhow, the end result is what matters.


Regards,

Jan R.
Logged

Fine art photography: janrsmit.com
Fine Art Printing Specialist: www.fineartprintingspecialist.nl


Jan R. Smit
Graeme Nattress
Sr. Member
****
Offline

Posts: 582



WWW
« Reply #56 on: February 07, 2009, 09:42:28 AM »

Measurements are great - but what do you measure, how do you measure it, and how do you compare it? What is the usefulness of a resolution measure without a corresponding measurement of the degree of aliasing, or of the usable contrast at that resolution? None of this boils down to a single number.

Measure noise, and it might be ugly noise, or it might be nicer-looking noise; a single number doesn't tell you anything about the character of the noise. And given the prevalence of noise-reduction software, how do you account, in a single number, for some types of noise being easier to remove than others?

Measurements are great for engineers if they're repeatable and tell you something useful to help you with your design. Designing a good test for a certain quality is at least as important as engineering improvements in that quality. As someone who has to do this, I spend more time eyeballing and doing direct A/B image comparisons than I do looking at measurement numbers. Numbers are the guide, but the resulting image is ALL.
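A small sketch of the "character of the noise" point (toy data, not any real camera's noise, and not DxO's method): two noise fields can measure identically on a single number yet look completely different.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two noise fields with the SAME standard deviation...
fine = rng.normal(0, 1, (256, 256))

# ...but very different character: average 4x4 blocks to make the
# noise coarse and blotchy, then rescale back to the same std
coarse = fine.reshape(64, 4, 64, 4).mean(axis=(1, 3))
coarse = np.kron(coarse, np.ones((4, 4)))   # upsample back to 256x256
coarse *= fine.std() / coarse.std()

print(round(fine.std(), 2), round(coarse.std(), 2))  # identical "measurement"
# Yet the coarse field looks far more objectionable in a print: its
# energy sits at low spatial frequencies, which the eye picks out easily.
```

A plain per-pixel standard deviation cannot distinguish the two, which is exactly why a single noise number is not enough.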

Graeme
Logged

www.nattress.com - Plugins for Final Cut Pro and Color
www.red.com - Digital Cinema Cameras
ErikKaffehr
Sr. Member
****
Offline

Posts: 8031


WWW
« Reply #57 on: February 07, 2009, 10:54:45 AM »

Ray,

Some answers:
"That sounds reasonable, except that I'm very curious as to what it might be that has high subjective significance but is hard to measure."

Something I had in mind was flare and ghosting; they are difficult to quantify because they are highly dependent on shooting conditions.

Another issue is that some esoteric quality is often attributed to certain lenses, like a "3D look"; I don't know what that means. I have two Sony lenses with blue labels saying Zeiss, and I would say that they perform better than they test, at least in my own tests. I don't have a good explanation. One clue may be that Zeiss gives priority to the central part of the image, striving for high MTF at certain frequencies, while putting little emphasis on edge performance at large apertures. When I'm testing lenses I always look at the corners at full aperture, because I know that is the weakest point of any lens.

Another such aspect is that knowledgeable and experienced people, like Michael Reichmann, say that the advantage of MFDBs is visible even on relatively small prints. This is contrary to my expectation, but there must be some reason for it. I did some experiments comparing 12 MP A700 pictures with 24 MP A900 pictures printed at A2 and could see little difference, although the 24 MP A900 files had a large advantage when comparing similarly uprezzed files at actual pixels.

There are a few comments on this forum essentially claiming that because DxO's measurements don't fit expectations or experience, they must be trash. In my opinion it's better to discuss what DxO measures and what it does not.

Best regards
Erik


Quote from: Ray
Erik,
That sounds reasonable, except that I'm very curious as to what it might be that has high subjective significance but is hard to measure.

Before anyone jumps in with the notion of 'artistic' merit, let me say that measurements of camera performance have nothing to do with the artistic merit of the photos that may be taken with the camera. This discussion has nothing to do with artistic quality. As Ansel Adams used to say, "There's nothing worse than a sharp image of a fuzzy concept".



DxO doesn't address sharpness and resolution directly, and probably for good reason: those qualities vary so much with the quality of the lens used. However, I suspect sharpness is affected by a poor SNR. At base ISO, SNR is reasonably good for most cameras, so the resolution of the sensor relates very strongly to the pixel count, with slight variation according to the strength of the AA filter. At high ISO, though, where SNR can be either quite good or quite poor, I suspect that a low SNR would indicate poor resolution, always in relation to the pixel count of course. Is this not so?



That's very true. But the RAW file is the digital negative. There may be many ways of improving the development of that RAW file as one's skills improve, and as the performance of new RAW converters improves over time. It therefore seems important when choosing a camera to get one which produces the best RAW files, with the lowest noise, the greatest tonality, the highest DR, etc., but not of course at the expense of all other considerations. That would be silly.

For example, the A900 has a very desirable feature in the form of an anti-shake sensor which works with all lenses fitted to the camera. Some people might quite reasonably take the view that the anti-shake sensor of the A900 is more of an asset than the better high-ISO performance of another camera which lacks one. Others might take the opposite view. That's fair enough. But in order to know what the high-ISO performance is in the first place, one needs the sort of objective measurements that DxO is providing.

Without such objective testing, we are left with endless subjective squabbling and total confusion.
Logged

Ray
Sr. Member
****
Offline

Posts: 8948


« Reply #58 on: February 07, 2009, 04:03:02 PM »

Quote from: JRSmit
Ray,

Regarding "the RAW file being the digital negative": perhaps more to the point, it is the "developed digital negative".
The sensor-site voltage values as captured compare best with the exposed silver-halide crystals in the emulsion.
I take it we all know from B&W development experience how developing methods and materials can influence the outcome.
DxO Labs measures the outcome of this "development", and how the "development" is done is proprietary to each company.

Anyhow, the end result is what matters.


Regards,

Jan R.

Jan,
That's not how I understand it. DxO are trying to test the performance of the sensor independently of the performance of any particular RAW converter. They measure the sensor's response before demosaicing has taken place, for example.

When I say a RAW file is a digital negative, I mean it's analogous to an undeveloped piece (or roll) of film. How much would film aficionados pay for a magical roll of film that could be developed and redeveloped in as many ways as they liked, as often as they liked? After developing a 36-exposure, ISO 100 roll of film, suppose you find that there are a couple of shots that should have been 'push processed' because they were underexposed, another couple that needed less development time because they were overexposed, and yet another couple that could have benefited from a different type of developer.

When this happens with film, you're basically stuffed. You only get one chance at development. With RAW files, the opportunities are endless. I know I can certainly get better results now from my 5 1/2-year-old D60 RAW files using the latest version of ACR than I did with the early version of BreezeBrowser that I used before ACR was available.
Logged
michael
Administrator
Sr. Member
*****
Offline

Posts: 4925



« Reply #59 on: February 07, 2009, 04:18:25 PM »

Jan,

I am on vacation in Costa Rica from Feb 14 till 21 and so I should be available in the days after.

If you wish to come by drop me a line beforehand and I'll be sure to be available as long as I'm in town. (I am teaching all day on the 25th).

Michael
Logged