Author Topic: DxO Sensor Mark  (Read 17842 times)
dreed
Sr. Member
****
Offline

Posts: 1208

« on: January 28, 2011, 07:44:42 PM »

I find it quite interesting that various successor models (GH-2, 30D, etc.) have arrived with a sensor that is inferior to their predecessor's. Looking at the data, it's tempting to delay purchasing a new camera (in the pursuit of better-quality photographs) until the relevant data shows up on sites such as DxO's, to ensure that image quality doesn't drop.
Logged
ErikKaffehr
Sr. Member
****
Online

Posts: 7239

« Reply #1 on: January 29, 2011, 01:35:18 AM »

Hi,

There has been some discussion around GH1 vs. GH2 and the DxO data. According to Michael Reichmann, the GH2 has better image quality than the GH1, or at least no worse.

DxO says that it would take about five DxO-mark points to tell image quality apart. The GH2 has 50% more pixels which may benefit if you print large or have very fine detail.

I'd suggest that the measurements are quite accurate, but they may not agree with photographers perception of real world image quality.

Check this link:
http://www.dxomark.com/index.php/en/Camera-Sensor/Compare-sensors/(appareil1)/371|0/(appareil2)/640|0/(appareil3)/677|0/(onglet)/0/(brand)/Sony/(brand2)/Leica/(brand3)/Panasonic

I took this example because Michael Reichmann owns all three cameras. Michael ranks the Leica ahead of the Sony. My guess is that Leica's lenses help a lot. They essentially pump more fine edge contrast (MTF) through the sensor. The higher MTF (sometimes called microcontrast) would enhance shadow detail.

Finally, DxO data is measured before raw processing; a slight increase in denoising may hide most differences between GH1 and GH2.

I'd suggest that the DxO-mark data is very interesting but it may not always match real world photographic experience.

Best regards
Erik


Logged

wolfnowl
Sr. Member
****
Offline

Posts: 5698

« Reply #2 on: January 29, 2011, 03:36:39 PM »

Peter:

A small error you may wish to correct: "This is partly because Canon's two full-frame models (5D Mark II and 1Ds Mark IV) are currently 2 and 3 years old."

And I didn't know the Mark IV had been officially announced!   Grin

Mike.

P.S. Fascinating article, BTW.
« Last Edit: January 29, 2011, 04:14:19 PM by wolfnowl » Logged

If your mind is attuned to beauty, you find beauty in everything.
~ Jean Cooke ~


My Flickr site / Random Thoughts and Other Meanderings at M&M's Musings
dreed
Sr. Member
****
Offline

Posts: 1208

« Reply #3 on: January 29, 2011, 03:50:09 PM »

Erik,

I think you've missed the point of what I was trying to say, which is that despite what camera manufacturers might hope, a newer camera sensor does not always appear to be better. Moreover, if DxO can measure it as inferior without the difference being perceptible, the layman is unlikely to call it out as such. The question this raises is what sort of internal testing camera manufacturers do when evaluating the performance of a new sensor, and what their goals are in bringing new cameras to market. If it were just a problem for the Canon xxD line, I'd perhaps think less of it, but maybe it's not just a Canon problem...

I'm curious how much the brain plays into the subjective analysis of whether something is better or worse. If it is not possible to perceive any difference in sensor ability (because the scores are too close), are we likely to regard the newer one as better for psychological reasons? (It is *newer*, we've just bought it, therefore it must be at least as good, if not better...)
Logged
bradleygibson
Sr. Member
****
Offline

Posts: 829

« Reply #4 on: January 29, 2011, 04:07:29 PM »

My guess is that Leica's lenses help a lot. They essentially pump more fine edge contrast (MTF) through the sensor. The higher MTF (sometimes called microcontrast) would enhance shadow detail.

My understanding is that DxOMark Sensor score data is obtained without the use of lenses in front of the sensors...  (This ensures the measurements are of sensor performance, not sensor+optics performance.)  They now also have DxOMark scores for combined sensor + lens performance.
Logged

ErikKaffehr
Sr. Member
****
Online

Posts: 7239

« Reply #5 on: January 29, 2011, 04:26:26 PM »

Hi,

That is a different issue. Let's assume that a subject signal varies between 0 and 50 photons and the SNR is 10. A mediocre lens would perhaps transfer 10% contrast at the Nyquist limit, so the signal would vary between 0 and 5 photons (10% of 50). Because the noise exceeds the signal, we would see no detail. If the lens instead transfers 40% contrast, the signal would vary between 0 and 20 photons and would be clearly visible.

On the other hand, 40% at Nyquist is very high and would lead to fake detail.

The reasoning here is a bit oversimplified, but a lens with good MTF would yield better detail in the shadows even on a sensor that is relatively noisy.

The net result is that a sharp lens on a mediocre sensor may outperform a mediocre lens on a sensor with very little noise.
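Erik's arithmetic above can be sketched in a few lines of Python. The 0-50 photon swing and the 10%/40% MTF figures are his; treating the noise floor as 5 photons (so that SNR is about 10) is my assumption:

```python
def transmitted_swing(subject_swing, mtf):
    """Peak-to-peak signal left after the lens attenuates
    fine-detail contrast by its MTF at that spatial frequency."""
    return subject_swing * mtf

subject_swing = 50   # photons, Erik's example
noise_floor = 5      # photons (assumed, giving SNR ~ 10)

for mtf in (0.10, 0.40):   # mediocre vs. sharp lens at Nyquist
    swing = transmitted_swing(subject_swing, mtf)
    visible = swing > noise_floor
    print(f"MTF {mtf:.0%}: swing {swing:.0f} photons -> "
          f"{'visible' if visible else 'buried in noise'}")
```

With 10% MTF the transmitted 5-photon swing sits at the noise floor and disappears; with 40% it is four times the noise and clearly visible.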

Best regards
Erik


« Last Edit: January 29, 2011, 04:28:44 PM by ErikKaffehr » Logged

deejjjaaaa
Sr. Member
****
Offline

Posts: 743

« Reply #6 on: January 29, 2011, 09:37:41 PM »

My understanding is that DxOMark Sensor score data is obtained without the use of lenses in front of the sensors...

they actually use some form of optics - just look at the pictures published by DxOMark itself, for example

[photos from DxOMark's testing-lab pages, showing cameras under test with lenses mounted]

now do you think that there is no lens mounted and that somehow the camera is tested w/ an open sensor?

now they might be using some custom made fix w/ registration distance allowing the same lens to be used to test all cameras

just read the article - http://www.dxomark.com/index.php/en/Learn-more/DxOMark-database/DxOMark-testing-protocols/Testing-lab - it is full of photos showing the actual process, and you can see cameras being tested w/ some lens mounted... just as they can't test sensors without them being in actual cameras (with firmware producing the raw files), they can't test them without some optics, judging by the kind of targets we can see they are using... the only way to really test w/o optics would be to illuminate the whole sensor (making sure that reflections from anything in the mirror box are eliminated or greatly reduced) with either reflected or emitted light w/ specific parameters, one setting at a time, repeating through the set of light parameters - but this is clearly not the case, based on the photos we can see
« Last Edit: January 30, 2011, 02:38:31 PM by deejjjaaaa » Logged
deejjjaaaa
Sr. Member
****
Offline

Posts: 743

« Reply #7 on: January 29, 2011, 09:44:50 PM »

The GH2 has 50% more pixels which may benefit if you print large or have very fine detail.

where did you get 50% more pixels? 12mp (4000 x 3000) vs 16mp (4608 x 3456) in 4:3 mode is not 50% - it's closer to 33%
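The arithmetic is easy to check from the pixel dimensions quoted above:

```python
gh1_pixels = 4000 * 3000   # GH1, 12 MP in 4:3 mode
gh2_pixels = 4608 * 3456   # GH2, 16 MP in 4:3 mode

increase = gh2_pixels / gh1_pixels - 1
print(f"GH2 has {increase:.0%} more pixels than the GH1")  # ~33%, not 50%
```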
« Last Edit: January 29, 2011, 09:47:27 PM by deejjjaaaa » Logged
bradleygibson
Sr. Member
****
Offline

Posts: 829

« Reply #8 on: January 29, 2011, 10:47:50 PM »

they actually use some form of optics - just look at the pictures published by DxOMark itself, for example

Clearly, my understanding was wrong.

now they might be using some custom made fix w/ registration distance allowing the same lens to be used to test all cameras

A possibility, but as you point out, that doesn't seem to be the case for the Sensor tests that they show the setup for...

Still, when I read (emphasis mine):

"All sensor scores reflect only the RAW sensor performance of a camera body... DxOMark does not address such other important criteria as image signal processing, mechanical robustness, ease of use, flexibility, optics quality, value for money, etc."

from their site, I am at a loss as to how to reconcile their claim of "not addressing optics quality" with the apparent use of optics in their sensor tests.

Anyone with some insight into this?

Thank you,
« Last Edit: January 29, 2011, 11:10:52 PM by bradleygibson » Logged

ErikKaffehr
Sr. Member
****
Online

Posts: 7239

« Reply #9 on: January 30, 2011, 12:24:43 AM »

Hi,

It's very simple. DxOMark (Sensor) is about sensor noise and color characteristics. So all they do is project different color or grey patches onto the sensor and measure the response. It is probably practical to use a lens or some kind of optical device to project the patch onto the sensor.

The lens testing is an entirely different matter. A special test target is photographed and analyzed by software to determine resolution, MTF, vignetting, distortion, and lateral chromatic aberration.

Best regards
Erik


Logged

bradleygibson
Sr. Member
****
Offline

Posts: 829

« Reply #10 on: January 30, 2011, 01:34:21 AM »

Yes, of course the lens + sensor scores require optics, but the OP mentioned DxO Sensor Mark measurements, which I was under the impression were performed without optics.

The link you provided seems to show sensor measurements (for example, dynamic range and SNR calculations) being performed with optics, which apparently contradicts the quote from DxO I posted above.
Logged

Peter van den Hamer
Newbie
*
Offline

Posts: 43

« Reply #11 on: January 30, 2011, 02:55:46 AM »

A small error you may wish to correct: "This is partly because Canon's two full-frame models (5D Mark II and 1Ds Mark IV) are currently 2 and 3 years old." [..]

Mike.

P.S. Fascinating article, BTW.
Oops / thanks / fixed / thanks  Smiley
« Last Edit: January 30, 2011, 03:04:21 PM by vdhamer » Logged
Peter van den Hamer
Newbie
*
Offline

Posts: 43

« Reply #12 on: January 30, 2011, 03:40:11 AM »

Erik,

[..] a newer camera sensor does not always appear to be better. Moreover, if DxO can measure it as inferior without the difference being perceptible, the layman is unlikely to call it out as such. The question this raises is what sort of internal testing camera manufacturers do when evaluating the performance of a new sensor, and what their goals are in bringing new cameras to market. If it were just a problem for the Canon xxD line, I'd perhaps think less of it, but maybe it's not just a Canon problem... [..]

You ask about models that are (presumably accidentally) a step back compared to their predecessor. You could widen this to "models that are not state-of-the-art in terms of noise" (thus including the competition: recent models with similar sensor sizes). As far as I can tell, things are getting quite professional in this area:
  • new models are tested by friendly photographers who are very familiar with previous or competing models. This covers broad and soft feedback.
  • models are submitted to DxO for private testing. This paid service covers very technical image quality experts.
  • various manufacturers test inhouse using DxO "Analyzer" software, which is related to the DxOMark tests
  • a major manufacturer will almost certainly have proprietary benchmarks (for hard numbers) as well as a few actual photographers on their payroll for more comprehensive feedback
I suspect the industry is more mature about this level of benchmarking than in, say, the Canon 30D days (5 years ago): back then, a good price for a solid camera with competitive resolution was what mattered. Nowadays attention has partly shifted from megapixels to image quality. I believe that independent, repeatable, and more or less scientific benchmarks can accelerate this trend: they shift the focus of manufacturers and users a bit by providing a clear yardstick.

But the Achilles' heel of "hard" benchmark numbers is always that they don't cover everything you may care about. You still need at least 3 yardsticks just for image quality alone: noise data (e.g. DxOMark Sensor), resolution (sensor megapixels), and the impact of lens quality (the DxOMark Score - a terrible name - or DPReview/Photozone.de/LuLa/etc., or your own quick-and-dirty tests). Peter
Logged
Peter van den Hamer
Newbie
*
Offline

Posts: 43

« Reply #13 on: January 30, 2011, 04:29:12 AM »

Yes, of course the lens + sensor scores require optics, but the OP mentioned DxO Sensor Mark measurements, which I was under the impression were performed without optics.

My understanding is also (like ErikKaffehr's and deejjjaaaa's) that sensor noise measurements are done by imaging back-lit neutral density filters through lenses (see the DxOMark-testing-protocols pages). All you want to do is illuminate a large part of the sensor with uniform intensity. That won't work by pointing a bare camera body at a wall: you would get more vignetting from the mount and mirror box than you would with a lens. So you either put a lens on the camera or build your own collimating optics.

DxOMark Sensor requires a different set of tests than the Sensor-and-Lens ("DxOMark Score") testing, but all of them require optics. The lens sharpness, however, has negligible impact: you could arguably defocus the lens slightly and get the same results, as long as you stay clear of the edges of the test patches. A pinhole would also (kind of) work ;-)

Engineers and scientists will point out that numerous questions remain about test details for any precision measurement: e.g. light source homogeneity, light source stability, finite test patch size, vignetting, dust on the source, dust on the optics. I can assure you (I worked for years in labs) that precision measurements are a major headache. Some of these issues are nowadays covered by international standards for which the experts jointly develop measurement protocols. DxOMark is active in some of these committees (source: LinkedIn and private communications). And DxO says that outside engineers regularly get to see the setup and discuss the procedures used. This is normal in engineering: if you challenge my measurement results, I either need to exhaustively document the measurement details for you to review, and/or you send in experts to see if they can find a flaw in the measurements. You can bet that a major manufacturer will contact DxOMark whenever their products get lower scores than hoped for.

[fixed typo on 11-2-11, a historic day for other reasons]
« Last Edit: February 11, 2011, 03:00:28 PM by Peter van den Hamer » Logged
qwz
Jr. Member
**
Offline

Posts: 82

« Reply #14 on: January 30, 2011, 07:02:28 AM »

DxO's noise and dynamic range measurements may be fairly correct (though I think the approach of Alex Tutubalin, the developer of LibRaw, for example, is more accurate, because it analyzes these aspects on a per-channel basis and even takes into account per-column black-level subtraction, which varies between sensors and is usually not published by vendors)

According to this
http://www.dxomark.com/index.php/en/Learn-more/DxOMark-database/DxOMark-testing-protocols/Color-sensitivity
DxO's color sensitivity is suitable ONLY for measuring pre-raw (inside the sensor and imaging processor) color noise reduction algorithms, and says nothing about real-world scenes as opposed to the Gretag target.

Consequently, digital backs and even some DSLRs with CCD sensors (or some CMOS ones like the Sony A900/A850) get a really bad rating that is not useful for evaluating their image quality ('cause low-light photojournalism is not what digital backs are designed for).




Logged
barryfitzgerald
Sr. Member
****
Offline

Posts: 566

« Reply #15 on: January 30, 2011, 11:23:00 AM »

I'm sure tech heads love all the numbers, but the reality is that many of us care not what a test says but what happens in field use. And looking at some of their results, I have to say their DR numbers simply do not make any sense, nor do they reflect real-world use or results.

This is the problem when you go on numbers alone. DxO is a bit of fun, no doubt of some use to some people, but it's completely inadequate as a tool to influence a buying choice. It's a bit like the computer benchmarking sites: some very nice detailed tests, fine as a rough guide, but do you really care if you squeeze an extra few seconds out of the Photoshop benchmark? Probably not. I'd suggest using a camera... and looking at its real output before crunching numbers.
Logged
Peter van den Hamer
Newbie
*
Offline

Posts: 43

« Reply #16 on: January 30, 2011, 02:28:34 PM »

DxO's color sensitivity is suitable ONLY for measuring pre-raw (inside the sensor and imaging processor) color noise reduction algorithms, and says nothing about real-world scenes as opposed to the Gretag target.

Consequently, digital backs and even some DSLRs with CCD sensors (or some CMOS ones like the Sony A900/A850) get a really bad rating that is not useful for evaluating their image quality ('cause low-light photojournalism is not what digital backs are designed for).

I don't quite get your point. My article doesn't provide a reason why medium-format cameras (or digital backs) don't score too well. I don't have that answer, so I am certainly interested in understanding this.

Let's break down the discussion into elements to see what precisely you don't agree with:
  • DxO's color sensitivity benchmark essentially measures the chroma noise in the raw file generated by the camera. Agree?
  • Using a handful of colors to sample the amount of chroma noise across the entire gamut makes sense. Agree?
  • The GretagMacbeth color chart is a workable choice for such a "handful of colors". Agree?
  • The DxOMark Sensor benchmark is agnostic to whether a sensor uses CCD or CMOS technology. It just looks at the results. Agree?
  • Medium-format digital backs score well on the color sensitivity sub-benchmark. On average they do better than full-frame or smaller sensors. Agree?
  • Different raw converters will give different image quality, but DxOMark is benchmarking sensors rather than benchmarking raw converters or their demosaicing / noise-reduction algorithms. Agree?
  • It would be fair to use a familiar demosaicing algorithm, no additional noise reduction, and the manufacturer's color matrix to convert from the raw color space to a standard (e.g. sRGB) color space. And then to analyze the measured noise values. Agree?
I am not sure about the last bullet point (not sure DxO does this, not sure this is what they "should" do).
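As a rough illustration of the first bullet (measuring chroma noise on a uniform patch), here is a toy sketch. This is not DxO's actual protocol - they work in a perceptual color space with a far more careful model - and the opponent-channel formulas and the simulated patch are purely illustrative:

```python
import statistics

def chroma_noise(patch):
    """Spread of two crude opponent-chroma channels over a patch of
    (r, g, b) samples. A uniform, noise-free patch scores 0."""
    c1 = [r - g for r, g, b in patch]              # red-green opponent
    c2 = [0.5 * (r + g) - b for r, g, b in patch]  # yellow-blue opponent
    return (statistics.pstdev(c1) ** 2 + statistics.pstdev(c2) ** 2) ** 0.5

flat = [(0.5, 0.5, 0.5)] * 100                     # ideal noise-free grey patch
noisy = [(0.5, 0.52, 0.47), (0.48, 0.5, 0.53), (0.53, 0.49, 0.5)] * 34
print(chroma_noise(flat))                          # 0.0
print(chroma_noise(noisy) > chroma_noise(flat))    # True
```

The point is only that a chroma-noise metric over a uniform patch is well-defined regardless of whether the sensor behind it is CCD or CMOS.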

Peter
« Last Edit: January 30, 2011, 02:33:15 PM by vdhamer » Logged
Peter van den Hamer
Newbie
*
Offline

Posts: 43

« Reply #17 on: January 30, 2011, 03:00:13 PM »

looking at some of their results, I have to say their DR numbers simply do not make any sense, nor do they reflect real-world use or results.
[..] I'd suggest using a camera... and looking at its real output before crunching numbers.

I believe DxOMark uses a definition of Dynamic Range that results in higher absolute numbers than most other definitions (e.g. clarkvision.com). Their definition is summarized in the article.

Comparing dynamic range data that has been measured in different ways is indeed not very helpful. So I would just look at relative rankings (per data source) and approximate DR differences (per data source). Comparing DxOMark DR differences to DR differences measured or estimated elsewhere might work reasonably well.
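A tiny sketch of the "per data source" idea, with made-up DR numbers (none of these are measured values for any real camera): the absolute figures disagree between definitions, but the difference between two cameras is roughly portable:

```python
# Made-up stops for two hypothetical cameras under two DR definitions.
dxomark_style = {"cam_a": 13.5, "cam_b": 12.1}   # generous definition
stricter = {"cam_a": 11.0, "cam_b": 9.7}         # stricter definition

for name, table in (("dxomark-style", dxomark_style),
                    ("stricter", stricter)):
    diff = table["cam_a"] - table["cam_b"]
    print(f"{name}: cam_a leads cam_b by {diff:.1f} stops")
```

The absolute numbers differ by over two stops between "sources", yet both agree that cam_a leads by roughly 1.3-1.4 stops, which is the comparison worth making.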

Hope this helps, Peter
Logged
ErikKaffehr
Sr. Member
****
Online

Posts: 7239

« Reply #18 on: January 30, 2011, 05:20:23 PM »

Hi,

Regarding DxO, I like the measurements but dislike the score. That applies to both DxOMark Sensor and DxOMark Lens.

Best regards
Erik

Logged

Sekoya
Newbie
*
Offline

Posts: 16

« Reply #19 on: January 30, 2011, 05:23:36 PM »

Peter,
the DxO results have been widely dissected to extract three basic underlying sensor properties: quantum efficiency, full-well capacity, and read noise. I can understand that you might have wanted to limit the scope of your article and not discuss them there. But maybe you could comment here on whether you agree with the common approach taken to extract these values.
I find it particularly interesting to see progress over time, sensor size, and sensor designer for these three properties. As I understand them, quantum efficiency (which somehow includes the fill factor) helps with low-light noise but does not really affect dynamic range; full-well capacity (naturally scaled for sensel size) affects dynamic range but not low-light noise; and read noise affects both dynamic range and low-light performance.
What I am struggling a bit with is the relationship between the read noise and the noise in the amplification chain and the A/D converter. It is said that for cameras whose DR does not decrease when going from minimum to moderate ISO, the DR is limited by the noise in the A/D converter rather than the read noise at the sensel. It is also said that the latest Sony sensors (and that somewhat includes the D3x sensor) have such great DR performance because of their column ADCs, which have very low noise levels.
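The relationship Sekoya describes can be sketched with the usual engineering definition of DR (full-well capacity over the total read-noise floor, in stops), treating the total floor as the quadrature sum of sensel read noise and downstream amplifier/ADC noise. The electron counts below are illustrative only, not measured values for any real sensor:

```python
import math

def dr_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops: log2 of full-well
    capacity over the read-noise floor (both in electrons)."""
    return math.log2(full_well_e / read_noise_e)

# Same photosite (3 e- sensel read noise, 60000 e- full well),
# different downstream ADC noise referred to the sensel:
for adc_noise in (20.0, 5.0):
    total_read = math.hypot(3.0, adc_noise)  # quadrature sum
    print(f"ADC noise {adc_noise:4.1f} e-: DR = "
          f"{dr_stops(60000, total_read):.1f} stops")
```

With a noisy ADC the converter dominates the floor and caps the DR; cutting the ADC noise (as low-noise column ADCs are said to do) recovers nearly two stops without touching the sensel itself.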
If you have any insight into these issues, I would be very glad to hear it.

Sekoya
« Last Edit: January 30, 2011, 05:27:10 PM by Sekoya » Logged