Author Topic: Lens outresolving sensor or vice versa - what's the real story?  (Read 6091 times)
BartvanderWolf
Sr. Member

Posts: 3649


« Reply #20 on: December 16, 2012, 02:04:02 PM »

Quote from: hjulenissen
The question posed in the first post is (as I understand it): can the "diffraction limit" be used to perfectly predict at what pixel pitch/aperture combination the final resolution is limited only by diffraction (i.e. increasing the number of megapixels will give exactly zero benefit)?

http://coinimaging.com/blog1/?p=139

In a classical-physics, linear, noiseless world, one would expect proper deconvolution to be theoretically able to resolve point sources whose Airy disks overlap to a very high degree (assuming that the blur kernel can be precisely estimated, that there is very little noise, etc.).

In practice this is virtually impossible because of noise and other errors, but that is a different story from "theoretically, you can never move beyond the diffraction limit, no matter what".

Hi h,

The real problem is that lower-than-maximum-contrast detail will have been attenuated into a featureless signal, even if there were no noise. The quantization limit of 14-bit ADCs leaves some room to retain relevant micro contrast, but where higher spatial frequencies are concerned, contrast will already be reduced a lot by the area sampling of the sensels.

Suppose some scene micro detail has a contrast of 10:1 by itself; the area sampling could reduce that to perhaps 20% of its original value, so a 2% signal would be left even if no optics were involved. When the MTF of lens/diffraction reduces that to 10%, only 0.2% will remain. Add a bit of noise (photon shot noise, read noise), and there will be no relevant signal left to deconvolve. Adding a bit of defocus (even within the DOF zone) will crush all hopes of recovery.
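As a back-of-the-envelope check, the multiplicative cascade above can be written out directly (MTF-style attenuation factors multiply; the 14-bit LSB follows from the post, while the 60,000 e- full-well capacity is an assumed figure purely for comparison):

```python
# Bart's cascade: contrast losses multiply, like MTF factors.
scene_signal = 0.10   # 10:1 micro-detail contrast, taken here as a 10% signal
sensor_mtf   = 0.20   # area sampling of the sensels leaves ~20% of it
lens_mtf     = 0.10   # lens + diffraction MTF at that spatial frequency

remaining = scene_signal * sensor_mtf * lens_mtf
print(f"remaining signal: {remaining:.1%}")        # 0.2% of full scale

# Floors to compare against (full-well value is an assumption):
lsb        = 1 / 2**14               # one 14-bit LSB, ~0.006% of full scale
shot_noise = 60000 ** 0.5 / 60000    # shot noise at saturation, ~0.4% of full scale
```

The residual 0.2% sits above the quantization floor but below the photon shot noise near saturation, which is the sense in which "no relevant signal is left to deconvolve".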

That's why I mentioned that the stronger impact on higher spatial frequencies (= lower MTF) will kill those first, while somewhat lower spatial frequencies stand a better chance because their MTF will be higher.

Cheers,
Bart
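The deconvolution point from the quoted post can be sketched numerically: in a noiseless toy case with an exactly known blur kernel, plain Richardson-Lucy iteration does pull apart two point sources whose blurred images merge into a single featureless bump (a 1-D sketch; the Gaussian kernel, the separation, and the iteration count are all chosen for illustration):

```python
import numpy as np

def richardson_lucy_1d(observed, psf, iterations=500):
    """Plain Richardson-Lucy deconvolution in 1-D (noiseless toy case)."""
    psf_mirror = psf[::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Two point sources 6 samples apart; the blur sigma is 3 samples,
# so their blurred images merge into one bump with no visible dip.
scene = np.zeros(64)
scene[30] = scene[36] = 1.0

x = np.arange(-8, 9)
psf = np.exp(-x**2 / (2 * 3.0**2))   # Gaussian stand-in for the blur kernel
psf /= psf.sum()

observed = np.convolve(scene, psf, mode="same")
restored = richardson_lucy_1d(observed, psf)
# `observed` peaks only at the midpoint (index 33); `restored`
# recovers two distinct peaks at the original positions 30 and 36.
```

With noise added, the iteration amplifies it and regularization becomes necessary, which is exactly the practical limit both posters describe.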
hjulenissen
Sr. Member

Posts: 1679


« Reply #21 on: December 16, 2012, 02:21:10 PM »

Quote from: BartvanderWolf on December 16, 2012, 02:04:02 PM
Hi h,

The real problem is that lower-than-maximum-contrast detail will have been attenuated into a featureless signal, even if there were no noise. The quantization limit of 14-bit ADCs leaves some room to retain relevant micro contrast, but where higher spatial frequencies are concerned, contrast will already be reduced a lot by the area sampling of the sensels.

Suppose some scene micro detail has a contrast of 10:1 by itself; the area sampling could reduce that to perhaps 20% of its original value, so a 2% signal would be left even if no optics were involved. When the MTF of lens/diffraction reduces that to 10%, only 0.2% will remain. Add a bit of noise (photon shot noise, read noise), and there will be no relevant signal left to deconvolve. Adding a bit of defocus (even within the DOF zone) will crush all hopes of recovery.

That's why I mentioned that the stronger impact on higher spatial frequencies (= lower MTF) will kill those first, while somewhat lower spatial frequencies stand a better chance because their MTF will be higher.

Cheers,
Bart
This is exactly the line of reasoning that I wanted to suggest. Rather than claiming that some arbitrary Airy-disk distance is the absolute limit on recoverable spatial information, it seems reasonable to assume that diffraction leads to a gradual loss of practical resolution due to vanishing (high-spatial-frequency) contrast. The limit is likely to depend somewhat on the input, raw processing, etc.

-h
BartvanderWolf
Sr. Member

Posts: 3649


« Reply #22 on: December 16, 2012, 05:00:20 PM »

Quote from: hjulenissen on December 16, 2012, 02:21:10 PM
This is exactly the line of reasoning that I wanted to suggest. Rather than claiming that some arbitrary Airy-disk distance is the absolute limit on recoverable spatial information, it seems reasonable to assume that diffraction leads to a gradual loss of practical resolution due to vanishing (high-spatial-frequency) contrast. The limit is likely to depend somewhat on the input, raw processing, etc.

Indeed. As an additional (auxiliary) module of my webpage with the free Slanted Edge Analysis tool, I provide a tool, under Step 4, that can be used to calculate the theoretical (under optimal, noise-free conditions) resulting Optical Transfer Function performance. The (comma- and space-separated) tabulated data allows one to copy/paste and parse the different components for a graphical representation of various MTF scenarios (diffraction only at 564 nm, OTF of diffraction plus defocus, MTF of lens + demosaicing, different apertures, different sensel pitches).

A very good lens at its optimal aperture (e.g. f/4) can typically reach a blur with a sigma of 0.7, and at f/16 that blur could increase to a sigma of 1.1 (depending on the actual lens). It will show how a smaller sensel pitch comes closer to resolving whatever resolution (MTF at the various levels of detail) is left to be had after diffraction (and defocus) have taken their share of the input signal's contrast.
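For reference, the diffraction-limited MTF of a circular aperture in these scenarios can be computed directly from the standard formula (a sketch: the 564 nm wavelength is from the post above, while the ~4.88 µm D800 sensel pitch and the exact values are my assumptions for illustration):

```python
import math

def diffraction_mtf(freq_cymm, f_number, wavelength_mm=564e-6):
    """Diffraction-limited MTF of a circular aperture (incoherent light)."""
    cutoff = 1.0 / (wavelength_mm * f_number)   # extinction frequency, cycles/mm
    s = freq_cymm / cutoff
    if s >= 1.0:
        return 0.0
    return (2 / math.pi) * (math.acos(s) - s * math.sqrt(1 - s * s))

pitch_mm = 4.88e-3                # Nikon D800 sensel pitch (assumed ~4.88 um)
nyquist = 1 / (2 * pitch_mm)      # ~102 cycles/mm

print(f"f/4  at Nyquist: {diffraction_mtf(nyquist, 4):.3f}")   # ~0.71
print(f"f/16 at Nyquist: {diffraction_mtf(nyquist, 16):.3f}")  # ~0.02
```

The contrast left at Nyquist collapses from roughly 71% at f/4 to about 2% at f/16, which matches the qualitative difference between the two scenarios described below.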

Here are two examples of the MS Excel-converted graphical output, first an optimal-aperture f/4 scenario on a Nikon D800, compared to the OTF and the diffraction-only-limited MTF (only relevant frequencies up to 'Nyquist' are given):

[attached chart: f/4 system MTF vs. OTF vs. diffraction-limited MTF]

Despite the good lens, the performance is still clearly limited (mostly) by the MTF of the lens.

And an f/16 scenario on a Nikon D800, compared to the diffraction-only-limited MTF and the OTF (diffraction + limited COC):

[attached chart: f/16 system MTF vs. OTF vs. diffraction-limited MTF]

Here the limitations from the lens and from diffraction (even when including the 1.5x sensel-pitch DOF-zone limits in the COC and resulting OTF) are compared.

The MTF in the above data is a close approximation of the actual system performance, obtained with the Slanted Edge analysis as indicated on the webpage (and help pages). The OTF and diffraction-limit curves are models, based on the given input parameters and the explanation by David Jacobson (http://photo.net/learn/optics/lensTutorial).

Cheers,
Bart
« Last Edit: December 16, 2012, 08:08:58 PM by BartvanderWolf »
