Author Topic: Sigma DP Quattro  (Read 36473 times)
BartvanderWolf
« Reply #140 on: February 17, 2014, 07:04:36 AM »

Quote
Here's "A Brief History of the Pixel" by Richard F Lyon, from Foveon (direct PDF link).

http://www.foveon.com/files/ABriefHistoryofPixel2.pdf

More papers from Foveon:

http://www.foveon.com/article.php?a=74

Bryce Bayer's patent on the Bayer sensor.  Notice that the sensels are referred to as luminance-sensitive elements and chrominance-sensitive elements.  The G sensels are considered luminance elements.  The word "pixel" doesn't appear for what it's worth.

http://www.google.com/patents/US3971065

Hi Luke,

The term pixel is defined as part of the ISO 12231 Standard (Third edition 2012):
"ISO 12231, Photography - Electronic still-picture cameras - Terminology"

Quote
3.4
addressable photoelements
number of active photoelements in an image, which is equal to the number of active lines of photoelements multiplied by the number of active photoelements per line
Note 1 to entry: It is possible that the number of addressable photoelements may be different for the different colour records of an image. When the signal values of the photoelements are digitized, the digitized code values may be referred to as picture elements, or pixels.
Note 2 to entry: This term is also defined in ISO 16067-1, ISO 16067-2 and ISO 21550.
[SOURCE: ISO 12233:2000, definition 3.1]

In 'Note 1' it is indicated that a partial color sample (e.g. from a Bayer CFA), once digitized, can also be called a pixel, just like a full RGB sample such as from a Foveon sensor. The definition also makes clear that the maximum number of addressable photoelements of the Foveon Quattro design is set by its top layer (5424 x 3616 pixels, 19.6 MP, after digitization).

Pixels are therefore related to output.
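As a quick sanity check on those counts, here is a sketch in Python; the top-layer dimensions are from the published Quattro specs, and the half-density lower layers are an assumption based on the announced 4.9 MP figures:

```python
# Foveon Quattro layer dimensions (top layer from published specs).
top_w, top_h = 5424, 3616              # top layer
low_w, low_h = top_w // 2, top_h // 2  # middle/bottom layers, half linear density

top_mp = top_w * top_h / 1e6           # addressable photoelements, top layer
low_mp = low_w * low_h / 1e6           # per lower layer

print(f"top layer:     {top_mp:.1f} MP")              # 19.6 MP -> the 'pixel' count
print(f"lower layers:  {low_mp:.1f} MP each")         # 4.9 MP
print(f"total sensels: {top_mp + 2 * low_mp:.1f} M")  # 29.4 M, not 39 M
```

Note the total sensel count comes to about 29.4 M, which matches neither the 19.6 MP pixel count nor the 39 MP marketing figure.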

Quote
3.147
raw DSC image data
image data produced by, or internal to, a DSC that has not been processed, except for A/D conversion and the following optional steps: linearization, dark current/frame subtraction, shading and sensitivity (flat field) correction, flare removal, white balancing (e.g. so the adopted white produces equal RGB values or no chrominance), missing colour pixel reconstruction (without colour transformations)
[SOURCE: ISO 17321-1:2006, definition 3.4]

This again shows that either an RGB sample, or a partial color sample after processing (including 'missing colour pixel reconstruction'), can be called a pixel.

IMHO, this also shows that the inflated megapixel counts are there to confuse consumers with 'creative marketing', and are not what the industry standards actually use to describe the specifications of Digital Still Cameras (DSCs).

Cheers,
Bart
BartvanderWolf
« Reply #141 on: February 17, 2014, 07:17:47 AM »

Quote
All this gets back to the idea of where the claimed 39MP comes from.  Being not a round number, one wonders how it was computed, and how veridical it is.

Hi Luke,

It's pretty unclear where that number comes from, other than from one of the interpolated JPEG output sizes. According to the DPReview article: "In addition to offering JPEGs at its 19.6MP luminance resolution, a 'Super-High' 39MP JPEG mode will also be offered (14-bit Raw files will include full 16.9+4.9+4.9MP data)."  I suppose that is a typo; they presumably intended to write 19.6+4.9+4.9, but got confused by the 16:9 aspect ratio from which the 39 Megapixel dimensions seem to stem. 39 MP would require some 8320 x 4688 pixels for a 16:9 aspect ratio.
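Bart's 16:9 guess is easy to check numerically; a sketch, where the 8320 x 4688 dimensions are his back-of-envelope figures rather than any published spec:

```python
# Hypothetical 16:9 frame that would hold roughly 39 MP.
w, h = 8320, 4688
mp = w * h / 1e6

print(f"{w} x {h} = {mp:.1f} MP")                    # 39.0 MP
print(f"aspect {w / h:.3f} vs 16:9 = {16 / 9:.3f}")  # 1.775 vs 1.778
```

The aspect ratio only approximates 16:9 because both dimensions are rounded to whole pixels.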

Cheers,
Bart
« Last Edit: February 17, 2014, 07:24:37 AM by BartvanderWolf »
Chris2500DK
« Reply #142 on: February 17, 2014, 09:04:10 AM »

Quote
Hi Luke,

It's pretty unclear where that number comes from, other than from one of the interpolated JPEG output sizes. According to the DPReview article: "In addition to offering JPEGs at its 19.6MP luminance resolution, a 'Super-High' 39MP JPEG mode will also be offered (14-bit Raw files will include full 16.9+4.9+4.9MP data)."  I suppose that to be a typo, they intended to write 19.6+4.9+4.9, but got confused by the 16:9 aspect ratio from which the 39 Megapixel dimensions seem to stem. 39 MP would require some 8320 x 4688 pixels for a 16:9 aspect ratio.

Cheers,
Bart

The only place the 39mp comes into play is with the SuperHigh 3:2 jpeg output, which is 7,680 x 5,120 pixels or 39.3 mpixels. Just how Sigma decided that a ~41.6% upscale from the raw file's top-layer resolution, in both dimensions, is the right number is anyone's guess (unless they come out and tell us at some point), but it could be their estimate of "Bayer equivalent resolution".

It's going to be a hard sell, but I don't think it's completely unfair to claim higher resolution. Bayer sensors don't have full resolution in any of the three colors either, but that's generally accepted as "true resolution" all the same.
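Chris's numbers check out; a short sketch recovers both the 39.3 MP figure and the upscale factor, which notably is almost exactly sqrt(2):

```python
top_w, top_h = 5424, 3616  # Quattro top-layer (raw) resolution
out_w, out_h = 7680, 5120  # 'SuperHigh' 3:2 JPEG output

print(f"output: {out_w * out_h / 1e6:.1f} MP")               # 39.3 MP
print(f"scale:  {out_w / top_w:.4f} x {out_h / top_h:.4f}")  # 1.4159 x 1.4159
print(f"sqrt2:  {2 ** 0.5:.4f}")                             # 1.4142
```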
BartvanderWolf
« Reply #143 on: February 17, 2014, 09:58:32 AM »

Quote
The only place the 39mp comes into play is with the SuperHigh 3:2 jpeg output, which is 7,680 x 5,120 pixels or 39.3 mpixels. Just how Sigma decided that a ~41.6% upscale from the raw file's top-layer resolution, in both dimensions, is the right number is anyone's guess (unless they come out and tell us at some point), but it could be their estimate of "Bayer equivalent resolution".

Hi Chris,

Ah, the 3:2 SuperHigh JPEG it must be! It also makes some sense, in that the diagonal resolution of a regular pixel grid is sqrt(2) ≈ 1.41x higher than the horizontal/vertical resolution. So they use an interpolation method that exploits that additional diagonal resolution, but interpolation it still is.

Quote
It's going to be a hard sell, but I don't think it's completely unfair to claim higher resolution. Bayer sensors don't have full resolution in any of the three colors either, but that's generally accepted as "true resolution" all the same.

I've done simulations of the effect of Bayer CFA demosaicing on resolution, and it only reduces luminance resolution by some 6.4%. The absolute limit on resolution is imposed by the physics of sampled imaging, and it is dictated by the sampling density (sensels per unit length) of the sensor. That physical resolution limit will be very closely approached by an AA-less sensor with a good lens (such as the Sigma DP Merrill/Quattro) that is not stopped down too far (to minimize diffraction), but that resolution boundary cannot be exceeded. It is physically impossible to get higher resolution in a single capture, even if the lens were perfect.

Many comparisons between Bayer CFA sensors and Foveon designs are flawed by the use of an AA filter on the Bayer designs and no AA filter on the Foveons. It's an easy but important fact to overlook.

Cheers,
Bart
LKaven
« Reply #144 on: February 17, 2014, 10:02:26 AM »

Hi Bart,

We're obviously both contributing true things.  

I started with the claim that there is nothing that necessitates a 1:1 mapping between sensor elements and tricolor raster pixels.  And I think the multilevel Foveon Quattro challenges our ideas of what a paradigmatic sensel is.  

I think you agree on this much.  You would say that Foveon Quattro has 19M "pixels" and not 29M.  So you would count only the first level sensors as "pixels" but not the second and third levels.  So there is no necessity for a 1:1 mapping in your view.

But I also claim that nothing necessitates that the number of final output pixels be equal or less than the number of sensels.  It is an empirical question: a contingent matter of fact, not a necessary one.  The number of output pixels might be greater than the number of sensels.  

This is especially the case here because digital photography sensors, at least those we know so far, commit one to confabulation, unlike classical Nyquist theory.  As such, judgments of being "believable" or "convincing" enter into the picture.

And for some reason, Sigma claims a capability of 39M pixels from this sensor.  Why 39M, and not 38M or 40M?  What is it a function of?  Whatever the case, I suspect that there is something both convincing and believable about those 39M pixels in just the way that attempting to derive 39M pixels from a 12MP Bayer sensor would look neither believable nor convincing.  
  

Notes

For historical reasons, I dug up Bayer's patent, and took an interest in the way that he named his sensor elements as luminance-sensitive and chrominance-sensitive.  I do not take this as a counterargument against the idea that they are also "pixels" in some practical sense, so we're not in real disagreement here.  The key word here is "practical".  

Sometimes organizations adopt "standards" or institutionalized definitions as a practical way to regulate discourse in an active area of research and development.  In some cases, these definitions do capture something essential about the subject of the definition.  

In many cases, however, nominal definitions are applied to subjects that do not admit a nominal essence.  And in some cases, a merely stipulative definition is used for purposes of social regulation, sometimes for political reasons.  Unfortunately, often the stipulative definition offers no truth value; it is simply what we call a "nominal kind".  

For me, the ISO definition of "pixel" falls into some of these areas.  It is stipulative, though it does capture something of the subject.  It is there to regulate discourse.  It is clearly not authoritative in any scientific sense, nor immune to revision.  Maybe this is the best it will get.

LKaven
« Reply #145 on: February 17, 2014, 10:07:31 AM »

Quote
Many comparisons between Bayer CFA sensors and Foveon designs are flawed by the use of an AA filter on the Bayer designs and no AA filter on the Foveons. It's an easy but important fact to overlook.

I'd be interested to see an A/B comparison between, say, a Nikon/Toshiba 24MP APS-C sensor without OLPF, and a Quattro.

Then I'd like to see the same images rendered at 39MP.  I wonder which one will fall apart first and why?

One day soon, we will see one of these 39MP images, and then we'll see how believable it looks, or not.
BartvanderWolf
« Reply #146 on: February 17, 2014, 10:59:44 AM »

Quote
I'd be interested to see an A/B comparison between, say, a Nikon/Toshiba 24MP APS-C sensor without OLPF, and a Quattro.

Then I'd like to see the same images rendered at 39MP.  I wonder which one will fall apart first and why?

Hi Luke,

Yes, that would be fun. It could even be done in such a way that an objective resolution limit can be established, instead of the usual subjective 'tests'. The ISO suggests using a star target for that purpose, because it is relatively insensitive to JPEG tone curve and local sharpening adjustments.

ISO 12233:2014(en) [Photography - Electronic still picture imaging - Resolution and spatial frequency responses]
Quote
A second sine wave-based SFR metrology technique is introduced in this edition. Using a sine wave modulated target in a polar format (e.g. Siemens star), it is intended to provide an SFR response that is more resilient to ill-behaved spatial frequency signatures introduced by the image content driven processing of consumer digital cameras.

Quote
One day soon, we will see one of these 39MP images, and then we'll see how believable it looks, or not.

Yes, I'm also looking forward to it.

Cheers,
Bart
eronald
« Reply #147 on: February 17, 2014, 12:46:28 PM »

Quote
Hi Bart,

We're obviously both contributing true things.  

I started with the claim that there is nothing that necessitates a 1:1 mapping between sensor elements and tricolor raster pixels.  And I think the multilevel Foveon Quattro challenges our ideas of what a paradigmatic sensel is.  

I think you agree on this much.  You would say that Foveon Quattro has 19M "pixels" and not 29M.  So you would count only the first level sensors as "pixels" but not the second and third levels.  So there is no necessity for a 1:1 mapping in your view.

But I also claim that nothing necessitates that the number of final output pixels be equal or less than the number of sensels.  It is an empirical question: a contingent matter of fact, not a necessary one.  The number of output pixels might be greater than the number of sensels.  

This is especially the case here because digital photography sensors, at least those we know so far, commit one to confabulation, unlike classical Nyquist theory.  As such, judgments of being "believable" or "convincing" enter into the picture.

And for some reason, Sigma claims a capability of 39M pixels from this sensor.  Why 39M, and not 38M or 40M?  What is it a function of?  Whatever the case, I suspect that there is something both convincing and believable about those 39M pixels in just the way that attempting to derive 39M pixels from a 12MP Bayer sensor would look neither believable nor convincing.  
  

Notes

For historical reasons, I dug up Bayer's patent, and took an interest in the way that he named his sensor elements as luminance-sensitive and chrominance-sensitive.  I do not take this as a counterargument against the idea that they are also "pixels" in some practical sense, so we're not in real disagreement here.  The key word here is "practical".  

Sometimes organizations adopt "standards" or institutionalized definitions as a practical way to regulate discourse in an active area of research and development.  In some cases, these definitions do capture something essential about the subject of the definition.  

In many cases, however, nominal definitions are applied to subjects that do not admit a nominal essence.  And in some cases, a merely stipulative definition is used for purposes of social regulation, sometimes for political reasons.  Unfortunately, often the stipulative definition offers no truth value; it is simply what we call a "nominal kind".  

For me, the ISO definition of "pixel" falls into some of these areas.  It is stipulative, though it does capture something of the subject.  It is there to regulate discourse.  It is clearly not authoritative in any scientific sense, nor immune to revision.  Maybe this is the best it will get.



I think you could usefully employ the word "normative": standards such as the ISO resolution measurement standard can grandfather in incumbents' existing implementations and raise barriers to entry.

Edmund
Edmund Ronald, Ph.D.
LKaven
« Reply #148 on: February 17, 2014, 01:05:01 PM »

Quote
I think you could usefully employ the word "normative": standards such as the ISO resolution measurement standard can grandfather in incumbents' existing implementations and raise barriers to entry.

I agree!
The Ute
« Reply #149 on: February 17, 2014, 05:09:11 PM »

Sigma's CEO was interviewed again at CP+.

He was asked when the new Quattro might be made available.

He said he hopes it will be available by June.

Nothing on pricing, because the interviewer did not even ask.
Chris2500DK
« Reply #150 on: February 17, 2014, 11:25:11 PM »

Regarding the 39mp output:
I'm guessing that upscaling a Merrill image by the same factor as the 39mp Quattro output will give a good estimate.
It gives you a 30mp image that looks pretty good; you don't get that biting "Foveon sharpness", but it's still nice and detailed.
I wouldn't complain if it were a Bayer image viewed at 100%.
MrSmith27
« Reply #151 on: February 19, 2014, 06:28:52 AM »

Native resolution is 5,424 x 3,616.
Thus "double" resolution (scaling each dimension by sqrt(2)) is 7680 x 5120, which is 39.3MP.

Sigma Photo Pro has always had the option to export files at double size, so they are not really doing anything new other than that it can now be done in camera. In-camera JPGs from the Merrill generation looked considerably worse than JPGs processed from x3f files. My thinking is that by producing files of bigger pixel dimensions in camera, those might look better.

As for the discussion of whether 39MP is a "fair" description: Well, the argument is that the camera actually is only 19MP, as the highest number of pixels on one sensor layer is 19MP. In a way that makes sense. Then again, I could argue that any Bayer image should only be 50% of the advertised Megapixels, because that's the highest number of pixels for a given color. Or maybe I could argue that a Bayer image is actually only 25% of the advertised Megapixels, because only one in four pixels/sensels collects blue/red data? Or maybe it's 0% of the advertised Megapixels, because it's only one large interpolation?
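MrSmith27's arithmetic, spelled out: the exact sqrt(2) result is about 7671 x 5114, so the published 7680 x 5120 involves a small round-up, presumably to tidy, 16-divisible JPEG dimensions (that rationale is my assumption):

```python
import math

# Quattro top-layer resolution scaled by sqrt(2) in each dimension.
w, h = 5424, 3616
print(w * math.sqrt(2), h * math.sqrt(2))  # ~7670.7 x ~5113.8
print(7680 * 5120 / 1e6)                   # 39.3216 -> "39MP", give or take
```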
BartvanderWolf
« Reply #152 on: February 19, 2014, 07:00:28 AM »

Quote
As for the discussion of whether 39MP is a "fair" description: Well, the argument is that the camera actually is only 19MP, as the highest number of pixels on one sensor layer is 19MP. In a way that makes sense.

Not only does it make sense, it's demonstrably correct. Just take a shot of a star target, if you need proof.

Quote
Then again, I could argue that any Bayer image should only be 50% of the advertised Megapixels, because that's the highest number of pixels for a given color.

The ISO standards organization disagrees, but then what do they know about anything ...

And again, demonstrably the typical resolution limit of Red and Green and Blue channels approaches the Nyquist frequency, although at a lower modulation level (especially when an OLPF is used), and it may fluctuate between 50% and 100% of the limiting resolution depending on the Luminance differences between colors.

Cheers,
Bart
BJL
« Reply #153 on: February 19, 2014, 09:34:13 AM »

I understand Sigma's discomfort about measuring sensor resolution by counting photosites, which gives Bayer CFA sensors some unfair advantage. But its counter-claims of "Bayer equivalent MP counts" just muddy the waters further.

How hard would it be to quote a few standard measures that are well-established in the photographic technical community, like lines per picture height at 50% MTF, or lines per picture height at which aliasing sets in?  I know that these do not give a complete and perfect picture (no one or two numbers can), but they would be less imperfect and biased measures than any flavor of pixel count.


(Note: I suggest counting "lines" rather than "line pairs" or "cycles", just to be more closely-related to the pixel counts that people are already used to. The difference is just a factor of two, so does not bias any comparisons.)
« Last Edit: February 19, 2014, 11:20:12 AM by BJL »
MrSmith27
« Reply #154 on: February 19, 2014, 09:59:13 AM »

Quote
Not only does it make sense, it's demonstrably correct. Just take a shot of a star target, if you need proof.

The ISO standards organization disagrees, but then what do they know about anything ...

And again, demonstrably the typical resolution limit of Red and Green and Blue channels approaches the Nyquist frequency, although at a lower modulation level (especially when an OLPF is used), and it may fluctuate between 50% and 100% of the limiting resolution depending on the Luminance differences between colors.

Cheers,
Bart

Could I not transfer this argument to the vertically stacked sensels and argue that this is exactly why the middle and bottom layers should count for something?
Chris2500DK
« Reply #155 on: February 19, 2014, 10:00:14 AM »

You could measure resolution under specific circumstances, but if you do it in black and white you don't test color resolution, and for cameras with interchangeable lenses, how do you determine which lens to use?

Regarding the double size output from Sigma Photo Pro, for the older versions of the Foveon sensor (both the 4.7mp x3 and the Merrill generation) the double size output was doubled in both dimensions, so a Merrill file comes out at 9408x6272 pixels, or 60mp.
BartvanderWolf
« Reply #156 on: February 19, 2014, 10:46:22 AM »

Quote
Could I not transfer this argument to the vertically stacked sensels and argue that this is exactly why the middle and bottom layers should count for something?

Hi,

The vertical stacking only improves color resolution, although not as much in the 'Quattro' design as in the 'Merrill' design. The densest sampling is done in the top layer, which therefore dictates maximum resolution. Sampling pitch dictates the Nyquist frequency.

This is verifiable by shooting a test chart. Shoot one with a 'Merrill' design and again with a 'Quattro' design, and one can measure the diameter of the central blur disc. Both designs will probably resolve down to the same 92-pixel disc diameter, which corresponds to Nyquist; within that diameter the straight 'radials' break up into hyperbolic luminosity aliasing artifacts.

The 'Merrill' design with its 5 micron sampling pitch will therefore resolve (144 / pi) / (92 pixels x 0.005 millimetre) = 99.6 cycles/mm at Nyquist, and the 'Quattro' design with its 4.33 micron sampling pitch will resolve (144 / pi) / (92 pixels x 0.00433 millimetre) = 115.1 cycles/mm at Nyquist. Because the sensors have the same physical dimensions, the 'Quattro' design will allow output that is some 15.6% larger than from a 'Merrill' design at the same output resolution.

As a reference, a Nikon D800E will have a Nyquist resolution (which it almost reaches) of 102 cycles/mm on sensor, but because the sensor array is physically (24/15.7mm = 53%) larger, it will have a bit higher output resolution (because it requires less magnification to reach the same output size), or can be output larger with the same output resolution.

The somewhat higher color resolution of the Foveons will narrow the gap somewhat, but not fully close it. So a 39MP file from the 'Quattro' design will not match the 36 MP result from the D800E. That is my prediction, but you don't have to take my word for it; you can test it yourself. That's why I make these tools available for free.
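Bart's figures can be reproduced with a short script. This is a sketch: the star formula assumes a 144-cycle Siemens star, and the D800E pitch of ~4.88 microns is my own estimate from its 35.9 mm / 7360 px sensor width, not a figure from this thread:

```python
import math

def nyquist_cy_per_mm(pitch_um):
    """Nyquist limit of a regular sampling grid: one cycle per two sensels."""
    return 1.0 / (2 * pitch_um * 1e-3)

def star_cy_per_mm(pitch_mm, disc_px=92, cycles=144):
    """Spatial frequency at a Siemens-star central blur disc: 'cycles' spokes
    around a circumference of pi * disc_px sensels, converted to cycles/mm."""
    return (cycles / math.pi) / (disc_px * pitch_mm)

print(f"Merrill Nyquist: {nyquist_cy_per_mm(5.0):.1f} cy/mm")   # 100.0
print(f"Quattro Nyquist: {nyquist_cy_per_mm(4.33):.1f} cy/mm")  # 115.5
print(f"D800E   Nyquist: {nyquist_cy_per_mm(4.88):.1f} cy/mm")  # ~102.5
print(f"Merrill star:    {star_cy_per_mm(0.005):.1f} cy/mm")    # 99.6
print(f"Quattro star:    {star_cy_per_mm(0.00433):.1f} cy/mm")  # 115.1
```

The star-chart readings land just under the grid Nyquist limits, as expected for a measurement at the blur-disc boundary.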

Cheers,
Bart
« Last Edit: February 19, 2014, 11:18:31 AM by BartvanderWolf »
RobertJ
« Reply #157 on: February 19, 2014, 10:51:59 AM »

Speaking of the double size with a RAW file in SPP, I don't recommend using it at all.

I tested a normal file upsized in PhotoZoom Pro vs. a double size file from RAW in SPP, and SPP's double size output introduced jagged edges around highlights (the reflection of the light source in a metal necklace was outlined in jaggies).  

PhotoZoom Pro didn't produce any artifacts/jaggies and is far, far superior.  It's about as good as it gets, actually.  Don't output double size!

Actually, I've found that when any RAW program can output a larger size, although you'd think it would be the best way to upsize, it turns out to be no better than PS, and nowhere near PhotoZoom Pro...
BartvanderWolf
« Reply #158 on: February 19, 2014, 11:11:56 AM »

Quote
You could measure resolution under specific circumstances, but if you do it in black and white you don't test color resolution, and for cameras with interchangeable lenses, how do you determine which lens to use?

Hi,

It's not so much Black and White that matters but the luminance differences, and different colors usually also have luminance differences. That's why the ISO resolution tests for digital still cameras are mostly luminance oriented.

It is possible to design a worst-case-scenario resolution target, but that would not give much practical info, because such subject colors are rarely encountered in real life. Besides, different colors have different diffraction blur sizes, so at apertures narrower than f/5.0 the resolution will be visually lower for Red than for Green (although that's hard to see, because of the lower color resolution of our eyes and the lower color detail in our subjects). Lenses are also usually best corrected for Green wavelengths; the exceptions are the very expensive apochromatic lens designs. I think the 'Quattro' design will strike a nice balance between what can be resolved and what our eyes can detect.

Cheers,
Bart
BartvanderWolf
« Reply #159 on: February 19, 2014, 11:15:12 AM »

Quote
Actually, I've found that when any RAW program can output a larger size, although you'd think it would be the best way to upsize, it turns out to be no better than PS, and nowhere near PhotoZoom Pro...

Yes, I agree. PhotoZoom Pro is superior, BTW for all types of sensors ...

Cheers,
Bart