Author Topic: Naked sensor  (Read 8488 times)
Dave Millier
Full Member
Posts: 117
« on: February 15, 2012, 01:18:06 PM »

Interesting article from Sean.

However, I feel he has understated the problems that come from omitting the AA filter. As a (not so proud) owner of a Kodak 14n and several Sigmas, it's very obvious to me that there is quite a bit more to it than colour moire. The Kodak is afflicted by a number of pixel-level distortions, the well known "christmas tree lights" being an obvious example (these take the form of coloured speckles, points, lines and threads that occur on regularly repeating patterns). Those who think it only occurs on fabric or fly screens have obviously not tried to do landscapes with a Kodak: it happens on rock, brickwork, twigs, even shingle beaches. Sean's claim about the Foveon sensor seems wrong to me too. Luminance aliasing occurs just as readily, even if it isn't so immediately obvious. Jaggies, diagonal hatching and rope-like artifacts on thin lines are readily discernible in some images. This test shot of mine has a beautiful demonstration in the spacing of the gaps in the balustrade as you look right to left, and also in the diagonal tiling pattern on the orange roof. You'll have to trust me on this one, but the tiling is actually a conventional horizontal pattern; the sensor has invented the diagonal stripes (indeed, I suspect that this false data often, by pure fluke, makes the Foveon resolution seem higher than it actually is):

 

One thing I've wondered for some time, since we have had cameras with in-camera stabilisation, is why the IS can't be used in reverse: give the sensor a little shake, just sufficient to blur the optical image to match the sensor resolution. Then we could safely do away with AA filters without getting images full of jaggies.
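The idea can be sketched numerically. Below is a 1-D toy model (all numbers invented, not any camera's actual IS behaviour): a sensor reads one value every `pitch` positions of a finely sampled "optical image", and a small pre-sampling blur plays the role of the shake/AA filter.

```python
import numpy as np

# Toy 1-D model: a pattern above the sensor's Nyquist frequency is sampled
# with and without a one-pixel-wide pre-sampling blur (the "shake").
pitch = 16
x = np.arange(4096)
f = 3 / 64                       # pattern frequency, above the sensor Nyquist of 1/32
scene = np.sin(2 * np.pi * f * x)

# Naive sampling: the above-Nyquist pattern folds back as a strong alias.
aliased = scene[::pitch]

# "Shaken" sensor: a one-pixel-wide box blur applied before sampling.
blurred = np.convolve(scene, np.ones(pitch) / pitch, mode="same")
filtered = blurred[::pitch]

# The blur strongly attenuates the energy that would have aliased:
print(round(aliased.std(), 3), round(filtered.std(), 3))
```

In this sketch the naive samples carry the folded-back pattern at nearly full strength, while pre-blurring before sampling knocks it down several-fold, which is exactly what an optical low-pass filter (or a matched sensor jiggle) is for.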
Logged

My website and photo galleries: http://www.whisperingcat.co.uk/
deejjjaaaa
Sr. Member
Posts: 743
« Reply #1 on: February 15, 2012, 01:25:49 PM »

One thing I've wondered for some time since we have had cameras with in camera stabilisation is why can't the IS be used in reverse, to give the sensor a little shake just sufficient to blur the optical image to match the sensor resolution.

Well, Pentax was rumored to use different vertical/horizontal AA blurring strengths (through the AA filter) in the K10D to account for sensor shake caused by shutter-induced vibration in its less-than-perfect SR implementation... trying to make use of that deficiency, so to say.
Logged
BJL
Sr. Member
Posts: 5083
« Reply #2 on: February 15, 2012, 01:31:27 PM »

Thanks for the tile example of colorless aliasing, and your other comments, which make sense to me. [Added: in particular, all digital capture (all discrete sampling) has aliasing unless suitable low-pass filtering is applied; it is only color aliasing like moiré that is related to specific color sampling strategies like the Bayer CFA.]

On this point
One thing I've wondered for some time since we have had cameras with in camera stabilisation is why can't the IS be used in reverse, to give the sensor a little shake just sufficient to blur the optical image to match the sensor resolution. Then we could safely do away with AA filters without getting images full of jaggies.
check out this thread that I started recently: Ultrasonic sensor jiggling for AA effect
« Last Edit: February 15, 2012, 02:06:21 PM by BJL » Logged
theguywitha645d
Sr. Member
Posts: 970
« Reply #3 on: February 15, 2012, 03:26:07 PM »

After reading the article, I think photographers don't understand significant figures. It seems that keeping numbers to every decimal place during a calculation is required.

This obsession with keeping every possible piece of detail, even when that detail can only be perceived at home in front of a monitor at 100%, is getting a little silly. It has even come to the point where photographers are afraid to open up or stop down a lens to get the ideal DoF for an image, rather than lose "pixel-level detail" at the lens's "sweet spot", even when this detail will never be perceived in a print. Although, manufacturers can now save some money by just putting a fixed f/11 aperture in each lens and being done with it.
Logged
peterzpicts
Newbie
Posts: 12
« Reply #4 on: February 15, 2012, 05:04:00 PM »

Great article, Sean. Losing the AA filter makes lots of sense. I have been living AA-free with my SD14 for almost 4 years. Although I think the 800 lb gorilla in the corner being ignored is the Foveon's lack of the lateral color reconstruction that CFA sensors require.  This is the second part of what gives the X3 its look, or "presence" as I like to call it.
I am not in denial about the quirks of the Sigma line; if Sigma could get them under control they could really make some headway.
Pete
Logged
Graeme Nattress
Sr. Member
Posts: 582
« Reply #5 on: February 15, 2012, 07:23:09 PM »

Talking about chroma moire as if it were the only issue when losing an OLPF on a Bayer CFA misses the point that luma aliasing can also be a pretty nasty problem, and unlike chroma moire, it is practically impossible to remove. Luma moire affects all types of sensor systems: the three-chip system used in video cameras, the RGB stripe pattern used by Sony, the Foveon system, as well as the Bayer CFA traditionally used in digital cameras.

If you have sufficiently high resolution on your sensor that you don't need an OLPF, then that only means that there is sufficient optical filtering occurring elsewhere in the system, be it the lens MTF, or diffraction from the aperture.
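That last point can be put in rough numbers. A back-of-envelope sketch (assumed values: a ~4.9 µm pixel pitch, as on a 36MP full-frame sensor, and 0.55 µm green light) estimates the f-number beyond which diffraction alone filters the image below the sensor's Nyquist frequency:

```python
# Back-of-envelope: when does diffraction replace the OLPF?
wavelength_um = 0.55          # mid-green light (assumed)
pixel_pitch_um = 4.88         # ~36MP full-frame pitch (assumed)

nyquist = 1 / (2 * pixel_pitch_um)           # sensor Nyquist, cycles/um
# An ideal lens's diffraction MTF cuts off at 1 / (wavelength * N),
# so diffraction alone filters past Nyquist once N exceeds:
n_critical = 2 * pixel_pitch_um / wavelength_um
print(f"diffraction takes over around f/{n_critical:.1f}")
```

With these assumed numbers that works out to roughly f/18, which is why a lens stopped well down (or one with a modest MTF) can stand in for the missing optical filtering, as the post says.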

My concerns are doubly so for motion, where any aliasing or moire is more visible due to movements in the system causing the patterns to move in the opposite direction. This causes issues for motion adaptive codecs on the distribution side, and the bad effects of the aliasing are even harder to remove due to the motion.

Logged

www.nattress.com - Plugins for Final Cut Pro and Color
www.red.com - Digital Cinema Cameras
BernardLanguillier
Sr. Member
Posts: 7523
« Reply #6 on: February 15, 2012, 07:33:20 PM »

Nikon for sure has to be praised for being the first camera manufacturer who:

1. Provides both options,
2. Openly speaks about the possible issues with the AA-filter-less version.

For those photographers who think that having a credible backup is an essential part of their operations, getting one of each might be the best of both worlds.  Grin

Cheers,
Bernard
Logged

A few images online here!
bjanes
Sr. Member
Posts: 2714
« Reply #7 on: February 16, 2012, 08:35:41 AM »

Great article, Sean. Losing the AA filter makes lots of sense. I have been living AA-free with my SD14 for almost 4 years. Although I think the 800 lb gorilla in the corner being ignored is the Foveon's lack of the lateral color reconstruction that CFA sensors require.  This is the second part of what gives the X3 its look, or "presence" as I like to call it.
I am not in denial about the quirks of the Sigma line; if Sigma could get them under control they could really make some headway.

Sean makes a big deal of the fact that with the M9 Leica did not want to ruin the resolution of their excellent lenses with a low-pass filter. However, I understand that with the short flange-to-sensor distance of the rangefinder (no retrofocus design needed to allow for mirror movement), it would be very difficult to include a low-pass filter. Indeed, with the M8 they couldn't even include an infra-red filter. That short distance also creates problems with lens cast, especially with wide-angle lenses.

Regards,

Bill
Logged
KirbyKrieger
Sr. Member
Posts: 357
« Reply #8 on: February 16, 2012, 10:41:56 AM »

After reading the article, I think photographers don't understand significant figures. It seems that keeping numbers to every decimal place during a calculation is required.

This obsession with keeping every possible piece of detail, even when that detail can only be perceived at home in front of a monitor at 100%, is getting a little silly. It has even come to the point where photographers are afraid to open up or stop down a lens to get the ideal DoF for an image, rather than lose "pixel-level detail" at the lens's "sweet spot", even when this detail will never be perceived in a print. Although, manufacturers can now save some money by just putting a fixed f/11 aperture in each lens and being done with it.

I'm going to clumsily wade into what I half-discern to be newly roiled waters plashing on the sands surrounding Michael's luminous lake.  (Partial disclosure  Wink : my degree is in rhetoric; my advanced degree is in painting; and that winky emoticon needs a skilled make-over.)

theguywitha645d makes a good point, poorly.  The point made is that the differences being discussed may not be perceivable in a print.  The way it was made was to:
 - demean the author by generalizing to "photographers",
 - demean the author by presenting the criticism as a failure to understand basic science,
 - falsely strengthen the claim by presenting the straw man of "the afraid photographer",
 - and then lather the false argument with the pepper-jelly of "knowing" sarcasm.

I see this unmannered bullying behavior regularly.  I see it more and more frequently on LuLa.  It has no place in public discussion.  Imho.

Sean Reid carefully specifies that the differences he discusses are perceivable.  (I don't recall whether he differentiates between on-screen and printed.)  theguywitha645d's disputation might be better expressed as "IME, I don't see these differences in my prints.  Could you confirm that you do, at what size, and for what audience is this important?"

In general, I encourage all on-line participants to not make attempts to win arguments, but rather to refine knowledge.  Our tribal beavering is best devoted to chipping away the bark to reveal the pith.  Ask and answer specific questions.  Use your intellectual bite to make sense, not scents.

Logged

welder
Newbie
Posts: 40
« Reply #9 on: February 16, 2012, 11:07:51 AM »

Hmm. I love detailed photos as much as any photographer. But my gut feeling is this whole business about removing the AA filter is a bit overhyped. From what I've seen in comparisons that actually use the same camera with the only difference being the AA filter removed in one, the gain in resolution is rather small, and the resulting artifacts, when they appear, are a bit garish to my eye. The moire patterns are maybe not such a big deal because they can be corrected, but the aliasing and jagged edges are harder to deal with... the images are sharper, sure, but tend to feel less organic. Or maybe I'm just looking at the wrong examples  Undecided
Logged
billh
Full Member
Posts: 108
« Reply #10 on: February 16, 2012, 02:01:37 PM »

Kirby,
I love this! Both the “rhetoric” and the very welcome points made by your beautiful prose. I was a science major and hadn’t a clue a major like rhetoric existed.
I used the Ricoh A12 for a week and compared the images to those I took with a Sony NEX-5N and NEX-7. The images from the Ricoh convinced me to order the D800E instead of the D800. I've read various explanations saying what we are seeing are artifacts, not actual image detail, but for me the bottom line is that the images made without AA filters look as if they have more detail in them than those from sensors with AA filters. There seems to be an inherent desire lurking within many photographers to extract the most detail possible. In the past we used slow films and experimented with various developers, and when possible used a large format camera. Perhaps these new cameras sans AA filters are the fine-grain film of the past.
Logged
theguywitha645d
Sr. Member
Posts: 970
« Reply #11 on: February 16, 2012, 03:21:33 PM »

Use your intellectual bite to make sense, not scents.



My degree is not in rhetoric, but in imaging. So let me try again; at least let me try to frame the issues.

The problem of significant figures is applicable in imaging. Just because an effect can be measured does not mean it is significant. Just because I can do the math does not mean every decimal place should be kept.

Let's go into some basics. Image quality is subjective in that it is based on a viewer: there is no absolute frame in which to judge image qualities outside the reference of a human observer. Because of this, the idea of a standard viewing distance is important. The standard viewing distance is defined as a distance equal to the diagonal of the print/display image. You can use different definitions depending on your criteria, but the standard viewing distance matches well how folks view an image and should be sufficient for this topic; I see nothing in Reid's essay where he applies any special criteria to a photograph.

A 300 dpi 8x10 inch print viewed at about 10 inches will appear as a continuous-tone reproduction: the pixels will not be resolved. Because of the standard viewing distance, a 150 dpi 16x20 print viewed at 20 inches or a 75 dpi 32x40 print viewed at 40 inches will appear the same, and the pixels will still not be resolved. So the issue with image quality is really not that when you print larger you need more pixels; rather, once you have reached a certain number of pixels, the angular size of the pixels falls below what an observer can resolve. If you do the math, once you reach about 7.2MP you have a photo-quality image that can be printed at any size.
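The arithmetic in that paragraph can be checked directly, using the post's own 300 dpi and 8x10 inch figures:

```python
# Checking the arithmetic above: a 300 dpi 8x10 inch "photo quality" print.
dpi, w_in, h_in = 300, 8, 10
pixels = (dpi * w_in) * (dpi * h_in)
print(pixels / 1e6, "MP")      # the ~7.2MP figure quoted in the post

# Doubling the print size while doubling the viewing distance halves the
# required dpi, so the pixel count needed stays exactly the same:
pixels_16x20 = (150 * 16) * (150 * 20)
print(pixels == pixels_16x20)
```

The second check is the "same angular pixel size" argument in miniature: scaling print size and viewing distance together leaves the required pixel count unchanged.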

Are there reasons to have more than 7.2MP? Yes, simply because the human visual system can work beyond its resolving-power limits. Detection, for example, is very powerful in the visual system: think of power cables in the sky or cell towers on the horizon. This is very different from resolving line pairs, which is separating a surface with a frequency, rather than just noticing a line feature in a uniform field. So by going to high-resolution sensors we can perceive finer details, and those details will have better contrast. But like anything, the human visual system is limited.

Another good reason for a standard viewing distance is that subjective image qualities such as depth of field and sharpness can be judged; both qualities change with the ratio of viewing distance to print size. Because images from different formats are enlarged by different amounts to reach a display size, the circle of confusion (CoC), which defines both sharpness and depth of field, is based on format and not on pixel pitch. An 18MP image that looks sharp at the standard viewing distance will still look sharp if I divide each pixel into four, creating a 72MP image. Whether an image looks sharp at the pixel level (the 18MP image will look sharper than the 72MP up-rezzed image at 100%) does not matter; what matters is whether the image looks sharp when viewed.
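To make the CoC point concrete, here is a sketch of the standard CoC-based depth-of-field formulas. The example numbers are invented, `dof_limits` is a hypothetical helper, and 0.030 mm is the conventional full-frame circle of confusion; note that nothing in it depends on pixel pitch.

```python
# Standard thin-lens DoF formulas, driven entirely by the format-level CoC.
def dof_limits(f_mm, n, s_mm, coc_mm):
    """Near and far limits of acceptable sharpness for focus distance s_mm."""
    h = f_mm ** 2 / (n * coc_mm) + f_mm               # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# Example: 50 mm lens at f/8, focused at 3 m, full-frame CoC of 0.030 mm.
near, far = dof_limits(50, 8, 3000, 0.030)
print(round(near), "mm to", round(far), "mm")
```

With these assumed numbers the zone of acceptable sharpness runs from roughly 2.34 m to 4.19 m, and it is the CoC (set by format and viewing conditions), not the sensor's pixel count, that fixes those limits.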

Let's look at Reid's basic hypothesis of why one would not want an AA filter. His argument is that you should get everything from your lens. Why? There is nothing to suggest that the human visual system can perceive infinite amounts of detail. There is a limit to our perception. Already a 7.2MP image will make a photo-quality print in which the pixels are unresolved.

Secondly, is it even possible? The resolving power of a lens is not simply a property of the lens; target contrast directly impacts resolving power, and the lower the contrast, the lower the resulting resolving power.  Only the plane of focus would carry the information he is talking about; everything else will be something less. But DoF is a real quality, and something less sharp than the plane of focus can still be acceptably sharp. And if I want a certain amount of DoF, either opening up, where aberrations impact detail, or stopping down, where diffraction impacts detail, how is anyone supposed to get "the most" out of their optics? How is the AA filter going to have a significant impact when so many variables are in the system? How can you spot the impact of the AA filter from single images?

What is more important, the smallest details or the perception of the image in its entirety? People want a pleasing image and will not, and cannot, see the stuff that you can see at a 100% monitor view. Here is some of my experience with this. People were really impressed by the wonderful 6' high Darwin poster I created for his birthday (he did not come, you understand); the print was made from a 700-pixel-tall web image. I routinely print MFD files on 44" roll paper; no one has been able to spot the difference between a 22MP sensor and a 40MP sensor. 20" prints from m4/3 files look beautiful. And visitors to the imaging center I work at are amazed at the detail of a 24" microscope image taken with a 4MP camera; they are always disappointed it is 4MP, because it does not look that way to them. And when I print my MFD files at 44x58, I cannot even see the detail that is in the file unless I put my face into the image, hardly a proper viewing condition and one I have never seen tried by a viewer. Now, some of these cameras have an AA filter and some do not. Looking at large prints (20"+), I see no evidence of an AA filter; all the images are sharp and detailed. I am not saying these prints/images were all the same, but the images in and of themselves are sharp and detailed.

I think Reid is making a fundamental error in his evaluation. He is evaluating images at 100% on a monitor. That is not a real-world condition, and as the pixel count goes up, that condition moves further and further away from a real viewing condition. The variability in the photographic process is going to be far more significant than an AA filter; having an AA filter does not result in a soft image, just some loss of frequencies near the Nyquist limit. Reid's own argument suggests he really does not know; he is simply arguing that because he thinks he sees detail more clearly at 100% without an AA filter, that must be significant detail. I have never read that this is true, nor do I have experience that it is true; my experience fits the theories I have studied. Reid himself does not really offer any support for his claim.
Logged
Dave Millier
Full Member
Posts: 117
« Reply #12 on: February 16, 2012, 04:33:14 PM »

This is really interesting stuff, thank you.

I have something to add that may slightly contradict one aspect of what you say.

On the DPReview Sigma forum, where I spend too much time, there is (unsurprisingly) a great deal of attention paid to considering images on screen at 100%.  Some Foveon fans claim that they are fans because the characteristics of the Foveon mean you can enlarge a great deal more than you can with a CFA sensor and achieve bigger, more detailed prints. But many more aren't interested in prints, only in viewing on screen at 100% (where the Foveon usually looks good).

I've questioned this habit in some depth and it appears that quite a few people get their enjoyment of photographs, not from admiring how skillful composition and lighting comes together in the whole image to make a great picture,  but rather from viewing at 100% and exploring all the details close up, scrolling through the image section by section. 

I'm a print man myself and can't understand how someone can willingly give up the pleasure of viewing beautiful artistic composition in exchange for forensic examination of detail.  But this habit does seem surprisingly prevalent. The "wow" factor (in terms of sharpness and clarity) is seemingly valued more highly than composition.

This may well go far to explain why so few good photographs are shot on Foveon based cameras  Wink

Logged

My website and photo galleries: http://www.whisperingcat.co.uk/
hjulenissen
Sr. Member
Posts: 1615
« Reply #13 on: February 16, 2012, 04:46:01 PM »

Are there reasons to have more than 7.2MP? Yes, simply because the human visual system can work beyond its resolving-power limits. Detection, for example, is very powerful in the visual system: think of power cables in the sky or cell towers on the horizon. This is very different from resolving line pairs, which is separating a surface with a frequency, rather than just noticing a line feature in a uniform field. So by going to high-resolution sensors we can perceive finer details, and those details will have better contrast. But like anything, the human visual system is limited.
I have a problem understanding how a system might work "beyond its resolving power limits". Surely detection cannot pick up features so small that the human eye's optics smear them into a lump, or that cannot be "read" by at least two photosites with sufficient contrast?

I do get that the HVS may be non-linear, and that estimates of "resolving power" using a swept sine may or may not be easily reinterpreted to judge whether a power line will be shown with sufficient detail so as not to cause visible errors.

As a first approximation, I would assume that any camera system that does "proper" pre-filtering before sampling would be able to render a given range of spatial frequencies/edge rise-times with sufficient accuracy/contrast to be transparent (in this regard) up to a given display size/viewing distance.

-h
Logged
billh
Full Member
Posts: 108
« Reply #14 on: February 16, 2012, 06:11:47 PM »

There sure seems to be a lot of energy going into explaining why cameras without AA filters should not be selected over those with AA filters.

I see information (detail) in images from sensors without AA filters that is absent from those with AA filters, when both images were taken with the same lens and aperture. If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as the one with the detail when printed? In the case where the AA-filtered image is simply blurred slightly, or less sharp, can this non-AA-filter advantage be negated by post-processing (USM)?

Medium format digital cameras come without AA filters, and now cameras with smaller sensors are appearing without them. Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website (http://imaging.nikon.com/lineup/dslr/d800/features01.htm):

Optical low-pass filter optimized for sharpness on the D800

Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
The ultimate attention to detail — the D800E
Nikon engineers have developed a unique alternative for those seeking the ultimate in definition. The D800E incorporates an optical filter with all the anti-aliasing properties removed in order to facilitate the sharpest images possible.

Thanks!
Logged
ErikKaffehr
Sr. Member
Posts: 6922
« Reply #15 on: February 16, 2012, 11:34:29 PM »

Hi,

An AA-filtered image needs more sharpening than an unfiltered image.

The point you make, that an unfiltered image has more detail, may depend on it producing fake detail. For instance, the structures on a feather (are those called "barbs"?) may be finer than the native resolution of the sensor (the Nyquist limit). In a correctly filtered image, detail beyond the sensor resolution would be a gray mass, while an unfiltered image would "invent" detail at a lower frequency.

See the enclosed image: the red line shows the Nyquist limit, and all detail to the right of Nyquist is "fake". What you see is that detail "mirrors around" the Nyquist limit. The lower image is the same as the upper one, converted to monochrome.
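That "mirrors around Nyquist" behaviour is easy to reproduce numerically. A toy 1-D example (made-up frequencies, nothing to do with the actual test shots): a pattern at 0.7 cycles/pixel, above the 0.5 Nyquist limit, shows up after sampling as a fake pattern at 1.0 - 0.7 = 0.3 cycles/pixel.

```python
import numpy as np

# Sample a pattern whose true frequency exceeds Nyquist (0.5 cycles/pixel)
# and locate the apparent frequency of the sampled data via the FFT.
n = 256
f_true = 0.7                                   # cycles per pixel, > Nyquist
samples = np.sin(2 * np.pi * f_true * np.arange(n))

spectrum = np.abs(np.fft.rfft(samples))
f_apparent = np.argmax(spectrum) / n
print(f_apparent)                              # ~0.3: the mirrored alias
```

The sampled data peaks near 0.3 cycles/pixel even though nothing in the "scene" has that frequency: the detail beyond Nyquist has been reflected back to a lower frequency, exactly as in the test shots described above.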

These images were shot with a Sony Alpha 55 SLT, which probably has some AA filtering, but it seems to be weak. I have noticed aliasing in real images, so I made these test shots to find out.

I also added a screen dump from Nikon's demo page for the Nikon D800/D800E, where extra sharpening was applied to the filtered image (as an OLPF-filtered image needs more sharpening). I don't think that the non-AA-filtered image has better detail than the left one. Now you could also sharpen the right image a bit more and really have a "sharpening race".
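For what it's worth, that sharpening trade-off can be sketched in one dimension. This is a toy unsharp mask (invented numbers, not any raw converter's actual pipeline): a box blur stands in for the AA filter, and USM restores edge contrast at the cost of overshoot halos.

```python
import numpy as np

# Toy 1-D unsharp mask on an AA-blurred step edge.
edge = np.repeat([0.0, 1.0], 32)                   # ideal step edge
box = np.ones(5) / 5
soft = np.convolve(edge, box, mode="same")         # "AA-filtered" edge

mask = soft - np.convolve(soft, box, mode="same")  # high-pass detail
sharpened = soft + 1.5 * mask                      # USM with amount = 1.5

# USM pushes local contrast past the original 0..1 range (halos):
print(round(soft.max(), 2), round(sharpened.max(), 2), round(sharpened.min(), 2))
```

The blurred edge never leaves the 0..1 range, while the sharpened one overshoots above 1 and undershoots below 0, which is why a "sharpening race" eventually trades perceived sharpness for visible halos.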

Best regards
Erik



There sure seems to be a lot of energy going into explaining why cameras without AA filters should not be selected over those with AA filters.

I see information (detail) in images from sensors without AA filters that is absent from those with AA filters, when both images were taken with the same lens and aperture. If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as the one with the detail when printed? In the case where the AA-filtered image is simply blurred slightly, or less sharp, can this non-AA-filter advantage be negated by post-processing (USM)?

Medium format digital cameras come without AA filters, and now cameras with smaller sensors are appearing without them. Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website (http://imaging.nikon.com/lineup/dslr/d800/features01.htm):

Optical low-pass filter optimized for sharpness on the D800

Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
The ultimate attention to detail — the D800E
Nikon engineers have developed a unique alternative for those seeking the ultimate in definition. The D800E incorporates an optical filter with all the anti-aliasing properties removed in order to facilitate the sharpest images possible.

Thanks!

« Last Edit: February 16, 2012, 11:44:50 PM by ErikKaffehr » Logged

hjulenissen
Sr. Member
Posts: 1615
« Reply #16 on: February 16, 2012, 11:58:48 PM »

Why would the camera companies do this if there were not advantages that potentially override the disadvantages?
I am a cynic. I expect companies to introduce those products that (according to their predictions) cost the least to manufacture, can sell in the biggest numbers, and command the highest selling price.

Now, having the best possible "image quality" (whatever that is) certainly may help, but it is probably not the only thing that major camera manufacturers think about. For many products, it seems that it is more profit-efficient to include e.g. a swiveling lcd-screen than to spend money on long-term sensor research (or to buy the most expensive model from those who did).

-h
Logged
theguywitha645d
Sr. Member
Posts: 970
« Reply #17 on: February 17, 2012, 09:56:17 AM »

I have a problem understanding how a system might work "beyond its resolving power limits". Surely detection cannot pick up features so small that the human eye's optics smear them into a lump, or that cannot be "read" by at least two photosites with sufficient contrast?

I do get that the HVS may be non-linear, and that estimates of "resolving power" using a swept sine may or may not be easily reinterpreted to judge whether a power line will be shown with sufficient detail so as not to cause visible errors.

As a first approximation, I would assume that any camera system that does "proper" pre-filtering before sampling would be able to render a given range of spatial frequencies/edge rise-times with sufficient accuracy/contrast to be transparent (in this regard) up to a given display size/viewing distance.

-h

This is complicated, and I am not sure I understand it systematically enough to go into fine technical detail. However, there are really two issues: what the camera "sees" and what the viewer of the print sees. Here I am talking about why having a file from a camera that exceeds the 7.2MP "limit" can make a difference; there is no question that high-MP sensors impact the image.

The test target you choose influences the result. If I use a lines-per-mm target and a dots-per-mm target, I get different results. With lines per mm, I have a square wave that at the threshold of resolution turns into a sine wave. As we approach the limit of resolving power, not only does the frequency increase, but the amplitude decreases, both from the top and from the bottom: the blacks and whites go gray. What if I put a single black line on a white field? This is the idea of detection. The amplitude also changes, but not in the same way: the white is still there, and only the black makes an amplitude change. Basically, I can more easily see a single black line than lines of the same thickness in a resolving-power patch.
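That asymmetry between detection and resolution can be sketched numerically. A 1-D toy (all sizes invented, not a proper psychophysics model): after the same blur, one thin black line on a white field keeps roughly twice the contrast of a grating of equally thin lines.

```python
import numpy as np

# Compare, after identical blur, the contrast left in a single thin line
# versus a grating of lines of the same width.
field = np.ones(200)

single = field.copy()
single[100:102] = 0.0                    # one 2-sample black line

grating = field.copy()
for start in range(60, 140, 4):          # 2-sample bars with 2-sample gaps
    grating[start:start + 2] = 0.0

kernel = np.ones(5) / 5                  # blur wider than a single bar
blur = lambda a: np.convolve(a, kernel, mode="same")

line_depth = 1.0 - blur(single).min()            # dip below white: ~0.4
grating_mod = np.ptp(blur(grating)[70:130])      # modulation left: ~0.2
print(line_depth, grating_mod)
```

The grating's modulation collapses toward gray (its blacks and whites both move toward the mean), while the single line only loses depth against a white field that stays white, so it remains detectable well past the grating's resolving limit.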
Logged
BJL
Sr. Member
Posts: 5083
« Reply #18 on: February 17, 2012, 10:07:30 AM »

If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as that with the detail when printed?
No one is saying that they will be the same; the debate is whether the combination of extra detail plus contamination by aliasing artifacts (not just moiré) is better or worse. As a teacher, I prefer an honest "I don't know" to a wrong answer, but that is getting too philosophical. The choice is clearly one where different photographers and different photographic situations will lead to different choices.

Quote
In the case where the AA filtered image is simply blurred slightly, or less sharp, can this difference (non AA filter) advantage be negated by post processing (USM)?
To a good extent yes, but I will let others demonstrate and debate that. Looking at "straight, minimally processed raw" might appeal to some puritanical sense of "Photographic Correctness" (PC), but it is most often not the best way to make use of the information recorded by the sensor.



Quote
Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website ... Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
And on the other hand, "why do the vast majority of cameras have an OLPF, which increases manufacturing costs, if there were not advantages that potentially override the disadvantages?"


But beyond the obvious cost advantage, in most cases, of omitting the OLPF, I will float several other possibilities:

1. In some cases, aliasing could be the lesser of two evils. Or, for some photographers, the post-processing effort to erase moiré could be less than for careful sharpening, if a small enough fraction of their images need moiré removal.

2. Once enough customers ask for it, and are willing to pay a sufficient premium for it, the free market will provide it. And that demand might be driven by factors other than pure reason and evidence: see the book "Thinking, Fast and Slow" on how far we fall short of perfectly rational decision making.

3. It could be the judgment of MF marketing people that resolution, and the sharpness seen in 100% pixel peeping, is a significant factor in many customers' choice between DMF and 35mm-format DSLRs. (As evidence, look at all the people judging the D800 inferior to DMF on the basis of viewing a few crops at 100% on-screen, without even considering appropriate sharpening as part of a "giving it your best shot" comparison, and without seeing comparisons at equal image size and at the higher PPI likely to be used when displaying high-MP images.) So some very sharp example images, avoiding ones with visible aliasing artifacts, can help persuade some people to pay the premium for DMF over 35mm, and Nikon might likewise think it can push some decisions back the other way with the D800E.

Logged
Ray
Sr. Member
Posts: 8812

« Reply #19 on: February 17, 2012, 10:13:15 AM »

Quote
I think Reid is making a fundamental error in his evaluation. He is evaluating images at 100% on a monitor. Those are not real-world viewing conditions, and as the pixel count goes up, that condition moves further and further away from a real viewing condition. The variability in the photographic process is going to be far more significant than an AA filter--having an AA filter does not result in a soft image, just some loss at frequencies near the Nyquist limit. Reid's own argument suggests he really does not know; he is simply arguing that because he thinks he sees detail more clearly at 100% without an AA filter, that detail must be significant. I have never read that this is true, nor does my experience suggest it--my experience fits the theories I have studied. Reid himself does not really offer any support for his claim.

I would agree that the increase in detail and microcontrast that is probably visible in a D800E shot compared with a D800 shot of the same scene using the same lens, after both images have been appropriately sharpened, is probably trivial and probably not noticeable when viewing large prints from a recommended viewing distance.

It would be revealing to do some comparisons involving stepping back slightly when using the D800E, so that after cropping the D800E file to the same FoV as the D800 shot one could compare say a 33mp D800E shot with the full 36mp shot from the D800, then a 30mp shot, then 27mp etc. It might be better to use 2-dimensional subjects to avoid any confusion resulting from different perspectives.  Grin
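The step-back distances needed for such a test follow from simple geometry (a hypothetical sketch: the 36.3 MP figure is the published D800/D800E pixel count, and a flat subject is assumed so that framing scales linearly with distance):

```python
import math

# Step back so that, after cropping the D800E frame to the D800 shot's
# field of view, the crop holds a chosen pixel count. Linear framing
# scales with distance, so the matching crop's pixel count falls with
# the square of the step-back factor.
FULL_MP = 36.3  # approx. D800/D800E sensor pixel count

for target_mp in (33, 30, 27):
    step_back = math.sqrt(FULL_MP / target_mp)  # distance multiplier
    print(f"{target_mp} MP crop -> shoot from {100*(step_back-1):.1f}% farther away")
```

So the proposed 33, 30 and 27 MP comparison crops would correspond to stepping back by roughly 5%, 10% and 16% respectively.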

With such a procedure one could get a clearer idea of just how much that extra resolution, resulting from the lack of an AA filter, might be worth in terms of pixel count. Is it equivalent to a 5% increase in pixel count or a 10% or 20% etc?

I never get excited by small increases in pixel count when I'm upgrading a camera. My first upgrade from my first DSLR (the 6mp Canon D60) was the 8mp 20D, which represented a 33% increase in pixel numbers. My main reason for the upgrade was the significantly improved noise characteristics of the 20D. I considered the extra pixels as icing on the cake, but not significant.

I recall doing some resolution comparisons with those two cameras some years ago, and came to the conclusion that any increase in pixel numbers of less than 50% is probably not worth bothering with, and therefore I would never upgrade a camera based solely on an increase in pixel numbers of less than 50%, and perhaps not even 50%.
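Both the "equivalent pixel count" question above and this 50% rule of thumb come down to the same square-law arithmetic: pixel count grows with the square of linear resolution, so percentage gains in pixel count translate into much smaller gains in visible linear detail. A minimal sketch (the percentages are just illustrative inputs):

```python
import math

def linear_gain_pct(pixel_increase_pct):
    """Extra per-axis (linear) resolution from a given pixel-count increase.
    Pixel count scales with the square of linear resolution."""
    return 100 * (math.sqrt(1 + pixel_increase_pct / 100) - 1)

for pct in (5, 10, 20, 33, 50):
    print(f"{pct:>2}% more pixels -> about {linear_gain_pct(pct):.1f}% finer linear detail")
```

By this arithmetic even a 50% jump in pixel count buys only about 22% finer linear detail, which is consistent with the view that small megapixel bumps are rarely worth an upgrade on their own.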

However, as regards comparing images at 100% on screen, I think that's quite legitimate. The issue of whether or not differences seen at 100% on screen may or may not be visible on certain size prints viewed from a certain distance, is a separate concern.

The first step is to establish whether or not there are any differences, and to do that one usually has to pixel peep.

Having established that there are noticeable differences at 100% on the monitor, one should then address the real-world, practical circumstances whereby such differences could be apparent.

The first example that springs to mind is the habit of viewers, not only photographers, to inspect a large print at a close distance, when possible, and for no other reason than to appreciate and take pleasure in the observation of any fine detail and texture that is not visible from the 'appropriate' or 'correct' viewing distance.

I believe this is a matter of normal inquisitiveness. When we view the real world, we expect to see greater detail the closer we look at any object. If our eyesight is not up to the task, we know that we will be able to see more and more detail as the power of the magnifying glass increases.

The photograph has the reputation of 'capturing' reality, albeit in a 2-dimensional format most of the time. However, if one approaches a large print to view the fine detail and instead discovers blurriness, it's a disappointment. Have you never experienced that before, theguywitha645d?  Grin

The second example that springs to mind is the opportunity for cropping. When one inspects a small portion of the total image at 100% on the monitor and finds that it is significantly sharper or more detailed than another 100% view of the same scene taken with a different camera, one knows that any print of a crop the same size as the image on the screen will also reveal such differences, when viewed from the same distance as one views one's monitor.

That's worth something, surely.

Logged