Author Topic: Nikon D800/E Diffraction Limits  (Read 15731 times)
Ray (Sr. Member, Posts: 8812)
« Reply #40 on: July 06, 2012, 11:05:27 PM »

To simplify further, the formula is max f-stop = P x 1.054. The D800 has a pixel pitch of 4.87 microns, so the corresponding f-stop is f/5.1. As one exceeds this critical f-stop, loss of contrast is often more noticeable than the loss of resolution. These considerations derive from the laws of physics and are not a defect in the D800. If you use f/16 on the D800, the results will be no worse than with the D3, which has 8.4 micron pixels.

I'm sure you know, Bill, that many mathematical formulae and principles of physics are approximations, not only for the sake of simplicity, but because of uncertainties built into the fabric of reality, the ever-present possibility that man-made theories are flat-out wrong or simply imprecise, and the chance that the people applying a theory in a particular instance are misapplying it.

Having taken the trouble to compare F16 images from a couple of cropped-format cameras with a much higher pixel density than the D3, and one of which has even a slightly higher pixel density than the D800 (the Canon 50D), I am confident that it is extremely unlikely that resolution at F16 with the D3 would be as good as resolution at F16 with the D800. I suspect the resolution differences would be clearly noticeable at 100% on monitor, after appropriate sharpening for each image and upsizing of the smaller file.

Now it so happens I still have my 12.7mp Canon 5D which is very close to the pixel density of the Nikon D3. If I have the time, and time really is a problem but I might be able to find it in the interests of the pursuit of truth, I could do another comparison between my old 5D used at F16 and my new D800E used at F16, to see which is sharpest.

I could take bets on the results of the outcome (to give me an incentive), but I doubt that Michael would allow betting activities on his site.  Grin
« Last Edit: July 07, 2012, 03:08:40 AM by Ray »
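[Editor's note] The rule of thumb Bill quotes at the top of this post (max f-stop = pixel pitch x 1.054) is easy to check. A minimal Python sketch, using the pixel pitches mentioned in this thread; treat the 1.054 factor as the approximation it is, not gospel:

```python
def max_f_stop(pixel_pitch_um: float, factor: float = 1.054) -> float:
    """Rule-of-thumb diffraction-limited f-number: N = pixel pitch (um) * 1.054."""
    return pixel_pitch_um * factor

# D800, 4.87 micron pitch -> roughly f/5.1
print(round(max_f_stop(4.87), 1))
# D3, 8.4 micron pitch -> roughly f/8.9
print(round(max_f_stop(8.4), 1))
```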
ErikKaffehr (Sr. Member, Posts: 6918)
« Reply #41 on: July 07, 2012, 03:46:12 AM »

Hi,

It seems that different posters may have different requirements.

I do agree that viewing distance plays a crucial role. If viewing distance is increased, the eye may not resolve the finest detail, so sharpening may mask a lack of detail: the picture looks sharp, but viewed closer, the lack of detail is obvious.

Best regards
Erik



Quote
That doesn't mean anything. Why would anyone want a 150 DPI print these days?

BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #42 on: July 07, 2012, 05:00:56 AM »

I'm sure you know, Bill, that many mathematical formulae and principles of Physics are approximations, ...

So you are trying to suggest that the laws of diffraction are not accurate, despite the fact that the phenomenon can be seen (first reports go back to 1828), measured, reproduced, and accurately calculated?

I do agree that poorly executed tests can produce puzzling results, and that flawed interpretation can lead to wrong conclusions. But blaming a lack of understanding on the accuracy of the laws of physics ... surely you can do better than that.

Just perform the test and, if the result seems to add something worthwhile, by all means share it.

Cheers,
Bart
Ray (Sr. Member, Posts: 8812)
« Reply #43 on: July 07, 2012, 09:49:54 AM »

So you are trying to suggest that the laws of diffraction are not accurate, despite the fact that the phenomenon can be seen (first reports go back to 1828), measured, reproduced, and accurately calculated?

I'm trying to say in general terms that most laws are not perfectly accurate. What is often considered to be accurate is simply sufficiently accurate for the immediate purposes. Heck! We don't even know what 95% of the matter and energy in the universe is made of, despite the recent discovery of the Higgs Boson which has caused great excitement in the world of Physics.

More specifically, the Wikipedia article you refer to contains the following comment about the Airy disk.
Quote
The Airy pattern falls rather slowly to zero with increasing distance from the center, with the outer rings containing a significant portion of the integrated intensity of the pattern. As a result, the root mean square (RMS) spotsize is undefined (i.e. infinite). An alternative measure of the spot size is to ignore the relatively small outer rings of the Airy pattern and to approximate the central lobe with a Gaussian profile.


Quote
Just perform the test and, if the result seems to add something worthwhile, by all means share it.

I have done the tests. Following Emil Martinec's advice to use a banknote as a test target, I took about 100 shots in 2009, using a tripod and Live View, comparing my 10mp Canon 40D with my 15mp 50D. I took several series of shots at different distances from the target and at different apertures, and paid particular attention to the accuracy of focussing. By varying the distance to the banknote, I eventually found a distance that produced significant aliasing and moiré, clearly visible on the Live View screen when the target was in focus. I found this method useful because the Live View screen on the 40D has a lower resolution than the 50D's screen. As a consequence, I was sometimes not totally certain I was precisely in focus with the 40D unless I used the presence of moiré as an indication of spot-on focussing.

The results of my tests are quite clear. At F16, the 50D (equivalent to a 38.4mp full-frame sensor) has a very slight resolution advantage compared with the 40D, equivalent to a 25.6mp full-frame sensor. However, I admit that such an advantage, apparent in terms of the greater legibility of the finest text on the banknote, would only be noticeable on very large prints viewed close up.

Nevertheless, if I wanted to make an 8"x10" print representing a 200% crop of part of the scene as viewed on my monitor, say a rare bird on the branch of a tree, or an interesting geological pattern on a cliff face, I know I would prefer to use the 50D shot at F16. I might even prefer the 50D shot at F16 to the 40D shot at F8. I posted a comparison on another thread at http://www.luminous-landscape.com/forum/index.php?topic=68359.60 which shows significant aliasing in the 40D shot at F8, yet no better detail.

As a result of such tests, I would be very surprised if the gap between the D800 and the D3 at F16 were not wider and more obvious, perhaps noticeable at 100% on screen.


Cheers!  Ray
bjanes (Sr. Member, Posts: 2714)
« Reply #44 on: July 07, 2012, 01:11:25 PM »

I'm sure you know, Bill, that many mathematical formulae and principles of physics are approximations, not only for the sake of simplicity, but because of uncertainties built into the fabric of reality, the ever-present possibility that man-made theories are flat-out wrong or simply imprecise, and the chance that the people applying a theory in a particular instance are misapplying it.

Having taken the trouble to compare F16 images from a couple of cropped-format cameras with a much higher pixel density than the D3, and one of which has even a slightly higher pixel density than the D800 (the Canon 50D), I am confident that it is extremely unlikely that resolution at F16 with the D3 would be as good as resolution at F16 with the D800. I suspect the resolution differences would be clearly noticeable at 100% on monitor, after appropriate sharpening for each image and upsizing of the smaller file.

Now it so happens I still have my 12.7mp Canon 5D which is very close to the pixel density of the Nikon D3. If I have the time, and time really is a problem but I might be able to find it in the interests of the pursuit of truth, I could do another comparison between my old 5D used at F16 and my new D800E used at F16, to see which is sharpest.

I could take bets on the results of the outcome (to give me an incentive), but I doubt that Michael would allow betting activities on his site.  Grin

At least two main factors are in play: the point spread function (PSF) of the lens, which is affected by diffraction and aberrations, and the PSF of the sensor, including the effects of the raw converter. One may add defocus, as discussed in an excellent article by Erik Kaffehr. I now have both the D3 and the D800E and can verify that optimal sharpness with both cameras with the 60mm f/2.8 AFS MicroNikkor is at f/4 to f/5.6, and this is determined by the sweet spot of the lens, not by pixel size. This is confirmed in a post by Bobn2 on DPReview.

System MTF may be obtained by convolving the PSF of the lens with that of the sensor, which is equivalent to multiplying their MTFs. At f/16 both cameras will be handicapped by the larger Airy disc at this aperture, but the D800 will likely have better MTF, as you predict, because of the better MTF of its sensor. Since MTFs multiply, the old adage that resolution is determined by the weakest link in the imaging chain is not true. See this explanation by Bobn2 and also the demonstration of convolution on the Wolfram site.

Regards,

Bill
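[Editor's note] Bill's point that system MTF is the product of the component MTFs can be illustrated with a toy model. The diffraction MTF below is the standard formula for an ideal circular aperture in incoherent light; the sinc term is an idealized square-pixel sensor MTF (it ignores the AA filter and demosaicing), and the frequency and wavelength chosen are illustrative, not measurements of any actual camera:

```python
import math

def diffraction_mtf(freq_cyc_mm, f_number, wavelength_mm=550e-6):
    """MTF of an ideal (aberration-free) circular aperture in incoherent light."""
    cutoff = 1.0 / (wavelength_mm * f_number)   # diffraction cutoff frequency, cycles/mm
    s = freq_cyc_mm / cutoff
    if s >= 1.0:
        return 0.0                              # nothing passes beyond the cutoff
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

def pixel_mtf(freq_cyc_mm, pitch_mm):
    """Idealized sinc MTF of a square pixel aperture."""
    x = math.pi * freq_cyc_mm * pitch_mm
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# System MTF at 50 cycles/mm for f/16 and a 4.87 micron pitch:
# convolving the PSFs is equivalent to multiplying the MTFs.
system = diffraction_mtf(50.0, 16) * pixel_mtf(50.0, 4.87e-3)
print(round(system, 3))
```

Note how the product is lower than either factor alone: every component in the chain drags the system MTF down, which is why the "weakest link" adage fails.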
BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #45 on: July 07, 2012, 01:40:36 PM »

More specifically, the Wikipedia article you refer to contains the following comment about the Airy disk.
Quote
The Airy pattern falls rather slowly to zero with increasing distance from the center, with the outer rings containing a significant portion of the integrated intensity of the pattern. As a result, the root mean square (RMS) spotsize is undefined (i.e. infinite). An alternative measure of the spot size is to ignore the relatively small outer rings of the Airy pattern and to approximate the central lobe with a Gaussian profile.

The emphasis in bold that you applied to the quote suggests that you either are unfamiliar with the terminology or made an attempt to spin the article in your favor. The RMS spot size being "undefined" is mentioned as a consequence of "i.e. infinite"; it doesn't mean the pattern can't be quantified with high accuracy and precision. It's somewhat similar to the division 1/3, which cannot be written exactly as a finite decimal (though it is an exact rational number), or to a division by zero (the result is unbounded).

"To approximate the central lobe with a Gaussian profile" is mentioned as a less accurate alternative to the approximation of the actual shape of part of the exact pattern. The approximation itself can be accurately calculated to any precision one desires.

The diameter of the central lobe for a circular aperture is 'approximated' by:
2.4393397825330089098530776949305103557587186615502242... x wavelength x F-number. I limited the precision to some 50 decimal places (hence the ellipsis indicating an approximation), and the amplitude can also be calculated to any precision one requires, for as many lobes as one finds useful. BTW, the F-numbers we dial in are usually also approximations of the actual physical dimensions, but it's up to the user to supply more accurate input or not. The formula itself is exact.

As for your image example, the screen zoom resampling by two different amounts does eliminate the size differences, but it doesn't help the comparison due to the added resampling artifacts (especially on the lower '40D' crop). Despite that, and the additional moiré on the 40D (AA-filtering+diffraction was not strong enough to prevent that), I also see light diagonal stripes on the 40D image (on the vertical dark bar just left next to the portrait) that are missing in the 50D crop. It doesn't look like aliasing, so it seems to be higher resolution (due to less diffraction?) ...

Cheers,
Bart
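[Editor's note] Bart's long constant above is the exact first zero of the Airy pattern; in its usual rounded form the diameter is d = 2.44 x wavelength x F-number. A quick check in Python, assuming 550 nm green light:

```python
def airy_diameter_um(f_number, wavelength_um=0.55):
    """Diameter of the Airy pattern's first dark ring, in microns: d = 2.44 * lambda * N."""
    return 2.44 * wavelength_um * f_number

# At f/16 the central lobe spans about 21.5 microns, several times the
# D800's 4.87 micron pixel pitch; at f/5.6 it is about 7.5 microns.
print(round(airy_diameter_um(16), 1))
print(round(airy_diameter_um(5.6), 1))
```

This is why f/16 is comfortably past the per-pixel diffraction limit for a 4.87 micron pitch, while f/5.6 is still close to it.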
Fine_Art (Sr. Member, Posts: 890)
« Reply #46 on: July 07, 2012, 02:27:19 PM »

Hi,

It seems that different posters may have different requirements.

I do agree that viewing distance plays a crucial role. If viewing distance is increased, the eye may not resolve the finest detail, so sharpening may mask a lack of detail: the picture looks sharp, but viewed closer, the lack of detail is obvious.

Best regards
Erik


A typical room may be 12ft across. You probably want to give a viewer more detail than their eye can resolve at 1/2 that distance. That leaves nothing to detract from the image when they take it home.

My pictures start to look soft when my eye is 8" from the picture. I want people to feel free to get close and explore moving their head around a bit. You want to immerse them in a place and time they will never see again. That is worth buying. Anything less will hold interest for a few seconds only.

What is the distance now specified for an immersive HDTV experience? It's much closer than most living rooms are set up for. It would actually look weird to have the sofa that close to the TV on the wall; standing close in front of a framed picture does not look weird. People will do it.
Wayne Fox (Sr. Member, Posts: 2738)
« Reply #47 on: July 07, 2012, 03:22:47 PM »

I want people to feel free to get close and explore moving their head around a bit. You want to immerse them in a place and time they will never see again. That is worth buying. Anything less will hold interest for a few seconds only.
I agree with this. The idea of a "normal" viewing distance has always been more about where the printing technology will fail (normal for a billboard vs. normal for a magazine spread). To me, a great image can pull you in as you "immerse yourself" (a very good description) in it. Sure, not everyone will look at it up close; they may have no interest in the subject matter.

I also find that images often hang not in a gallery but in an ordinary location, where the natural flow of the space often brings the viewer much closer. I've seen some very large images hanging in a multi-million-dollar home in Park City, Utah. They were hung along a magnificent stairway about 5 feet wide, and from the living area below they looked OK. Climb the stairs and suddenly they looked terrible.

I have no control over where an image will be hung or how close people may choose, or perhaps even be forced, to stand. But I certainly don't want to print at such low resolution that image quality degrades very quickly as a viewer approaches the print, if they so choose.

stevesanacore (Full Member, Posts: 214)
« Reply #48 on: July 07, 2012, 05:05:14 PM »

As a long-time 4x5 shooter, I was under the impression that diffraction gets worse as the focal length gets shorter; I thought it depended on the actual size of the aperture at a given f-stop and had nothing to do with the focal length. In other words, a 150mm 4x5 lens at f/16 had much less diffraction than a 50mm lens at f/16 on a 35mm camera. I remember shooting at f/22-f/32 very often with my 4x5 cameras and don't recall any sharpness issues. I recently shot a job with my 17mm TS-E Canon on my 1DsMk3 and noticed a major fall-off in sharpness at anything above f/11. I wonder if lenses that have mediocre sharpness to begin with don't show the effect as much?

We don't know what we don't know.
texshooter (Full Member, Posts: 200)
« Reply #49 on: July 07, 2012, 06:16:07 PM »

I certainly don't want to print at such low resolution that the image quality degrades very quickly as a viewer approaches the image, if they so choose.

I agree.

Don't you get peeved when some photogs try to argue with you about how unnecessary they feel 30+ megapixel cameras are because of the theoretical 5-15 ft. viewing distance? I think this argument is an anachronism, a vestige of the days when the only art hung on the wall was paintings. Those days are gone. I say use however many pixels you need so the viewer cannot tell the difference in resolution whether they are standing across the room or smudging the print with their nose. If all you can afford is a 12 MP camera, that's perfectly understandable, but stop suggesting D800 and medium format shooters are off-the-deep-end show-offs. I hear this all the time at local photo hobby clubs, and it makes my jaw clench.
« Last Edit: July 07, 2012, 06:43:08 PM by texshooter »
BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #50 on: July 07, 2012, 06:21:21 PM »

As a long time 4x5 shooter, I was under the impression that diffraction gets worse as the focal length gets shorter. I thought it was the actual size of the aperture at given f-stop, and had nothing to do with the focal length.

Hi,

It actually has to do with the angular aperture, so both the physical aperture size and the focal length are in play. However, since our aperture numbers (F-numbers) are a ratio (f/#) of focal length to aperture diameter, diffraction is constant (as is the angular aperture) at a given F-number.

Quote
In other words a 150mm 4x5 lens at f16 had much less diffraction than a 50mm lens at f16 on a 35mm camera. I remember shooting at f22-32 very often with my 4x5 cameras and don't recall any sharpness issues.

That is because the image (and the diffraction) requires less output magnification for a given output size. The f/16 image on 35mm was magnified much more; the f/22 to f/32 images on 4x5 required much less magnification, so the actual diffraction patterns stayed small enough not to affect output sharpness too much.

Quote
I recently shot a job with my 17mm TSE Canon on my 1DsMk3 and noticed a major fall off in sharpness at anything above f11. I wonder if lenses that have mediocre sharpness to begin with don't show the effect as much?

For the 1DsMk3, f/11 is probably the sweet spot where corner resolution has improved enough, and center resolution has fallen enough, to provide even sharpness across the image circle; I know it does on my TS-E 24mm II. Optical theory predicts that center resolution will start to be visually impacted by diffraction at apertures narrower than f/7.1 on the 1DsMk3. It's not the optical quality, which is probably second to none, but pure physics.

Cheers,
Bart
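[Editor's note] Bart's magnification argument (why f/22-f/32 looked fine on 4x5 while f/16 suffers on 35mm) can be put in rough numbers. The frame widths, print width, and 550 nm wavelength below are assumptions for illustration only:

```python
def airy_on_print_um(f_number, frame_width_mm, print_width_mm, wavelength_um=0.55):
    """Airy-disc diameter after enlarging the frame to the print width, in microns."""
    airy_um = 2.44 * wavelength_um * f_number            # blur on the film/sensor
    return airy_um * (print_width_mm / frame_width_mm)   # scaled by enlargement factor

# 35mm frame (36 mm wide) at f/16 vs 4x5 film (~120 mm wide) at f/32,
# both enlarged to a 250 mm wide print: the large format, despite the
# narrower aperture, puts the smaller diffraction blur on paper.
print(round(airy_on_print_um(16, 36, 250), 1))
print(round(airy_on_print_um(32, 120, 250), 1))
```

The 35mm frame needs roughly 7x enlargement against the 4x5's roughly 2x, which more than cancels the one-stop-plus penalty of the narrower aperture.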
BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #51 on: July 07, 2012, 06:36:33 PM »

Don't you get peeved when some photogs try to argue with you about how unnecessary they feel 30+ megapixel cameras are because of the theoretical 5-15 ft. viewing distance.

Hi,

Not really, but I do think, "if thinking that makes you happy, be my guest", I know better.

I agree with Wayne, assuming we managed to capture the soul (light/composition/intent) of our image, it is about the realism with which the image is rendered that delivers the knock-out punch, total submission/submersion, nothing to distract. I want the surface/material structure of the subjects/objects to become almost tangible.

Cheers,
Bart
Ray (Sr. Member, Posts: 8812)
« Reply #52 on: July 07, 2012, 07:28:05 PM »

I also see light diagonal stripes on the 40D image (on the vertical dark bar just left next to the portrait) that are missing in the 50D crop. It doesn't look like aliasing, so it seems to be higher resolution (due to less diffraction?) ...

Those broad, diagonal, colored stripes that are very obvious on the 40D crop occur in a number of places around the head of the Aboriginal. Not being as knowledgeable as you on such technical matters as aliasing and moire, my first reaction was that those diagonal stripes were in fact artifacts or moire. But I always like to do real-world checking, so I pulled out a $50 banknote from my wallet and studied it carefully with a magnifying glass.

I can assert categorically that those diagonal, faintly colored stripes do not exist on the banknote. They are false detail. We have here a case of a sensor of higher pixel density than the D800's producing a better and more accurate image at F16 than a lower-resolution sensor at F8: better in terms of accuracy of detail, better in terms of DoF, and at least equal in terms of real detail.

I rest my case.

Cheers!   Ray
BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #53 on: July 07, 2012, 08:13:18 PM »

Those broad, diagonal, colored stripes that are very obvious on the 40D crop occur in a number of places around the head of the Aboriginal. Not being as knowledgeable as you on such technical matters as aliasing and moire, my first reaction was that those diagonal stripes were in fact artifacts or moire.

They are, which tells us that there is more optical detail present than the lens+diffraction at the sensel pitch can resolve.

Quote
But I always like to do real-world checking, so I pulled out a $50 banknote from my wallet and studied it carefully with a magnifying glass.

Good, that saves me from getting one (Australian $50, David Unaipon (1872–1967) portrait, not the Edith Cowan (1861–1932) version) at my local bank, which I still might because it looks like an interesting test subject ...

Quote
I can assert categorically that those diagonal, faintly colored stripes do not exist on the banknote. They are false detail.

Yes, those are obviously aliasing artifacts. No problem, that's to be expected when diffraction doesn't kill all fine detail.

What seems to be missing from the more diffraction-affected 50D crop is the diagonal area detail in the 40D crop that I marked in red (the vertical bar) on the attached copy of your image.

Cheers,
Bart
« Last Edit: July 07, 2012, 09:12:41 PM by BartvanderWolf »
Jan Brittenson (Newbie, Posts: 41)
« Reply #54 on: July 07, 2012, 09:42:50 PM »

Every point in the image consists of light that has passed through every point in the aperture.  It's not a pinhole.  Light close to the rim diffracts.  Therefore, every point in the image consists of some proportion of light spread through diffraction and some proportion of undiffracted light that passed through the center.  This proportion is purely dependent on the diameter of the aperture.  Since it's not a pinhole and there is half a lens in front of it and half a lens behind it, the portion of light that diffracted represents a star-shaped point spread (PSF).  This is trivial to verify by stopping down and photographing a bright point (aka 'unit impulse') like a light.  Star shape.  No airy disk.  Because an aperture is not a pinhole.

The diffracted light subject to the PSF forms a veil that sits on top of the undiffracted image, and which results in contrast loss.  The spread is very broad (as can be seen in the star from a point light - the 'arms' extend quite wide) so most of the energy is scattered widely.  But it does increase in intensity as you get close to the center.

The reason the PSF is a star shape is equally easy to understand.  The narrower the aperture, the greater the spread.  When an aperture has straight blades it's wider in the corners where the blades meet and narrower around the middle between the corners.  The former diffracts less, and the latter more.  This variation in intensity shows as a star.  With a perfectly round aperture it would roughly approximate a gaussian, similar to spherical aberration.

The various rules of thumb were created for telescopes (which lack adjustable apertures), where stars get distorted due to diffraction from the barrel at the entry and exit pupils of the telescope. This is still a rule of thumb and not a strict physical relationship, though, because the pupils on a telescope aren't in focus. However, they are probably adequately in focus to produce something similar to Airy disks. (Just as the barrel can cause fuzzy vignetting at the pupils; a photographic aperture of course can't cause vignetting.) But an aperture can't project Airy disks any more than closing it causes vignetting.
dimapant (Newbie, Posts: 27)
« Reply #55 on: July 08, 2012, 12:47:05 AM »

Very, very interesting discussion, with some deep analysis, properly done, and I really learned some interesting things.

But in a picture, what cannot be seen does not matter at all.

In my personal opinion, the most important parameters are missing from this thread: the word "visible" in the original question has never been defined.

In other words, in talking about diffraction, the most important things have been forgotten: print size and viewing distance.

At what print size and viewing distance is the diffraction supposed to be "visible"?

Without defining those two parameters, the visibility of diffraction cannot be judged: a given picture can look perfect at postcard size at a certain viewing distance, while the same picture might not make a good-quality A2 print at the same, or a different, viewing distance.

What you see on a monitor at 200%, however terrible, can be completely irrelevant at a given print size and viewing distance.

Many thanks to all of you for sharing your deep expertise and information, and best regards.

Alessandro
ErikKaffehr (Sr. Member, Posts: 6918)
« Reply #56 on: July 08, 2012, 04:13:01 AM »

Hi,

I made some tests with a Sony Alpha 77 SLT (a really small pitch sensor) and an old 100/2.8 Minolta Macro lens.

The two samples below were taken at f/16 and f/5.6, both deconvolution sharpened, with about optimal sharpening to my taste.

The f/16 image may look sharp when viewed at a distance, but the fine line pattern visible in the red box is very clearly lost in the f/16 shot.

Sharpening (Lightroom 4.1):

f/16: 75/1.3/100/17
f/5.6: 55/0.8/100/17

Best regards
Erik

erpman (Jr. Member, Posts: 51)
« Reply #57 on: July 08, 2012, 05:30:21 AM »

Quote
In other words a 150mm 4x5 lens at f16 had much less diffraction than a 50mm lens at f16 on a 35mm camera. I remember shooting at f22-32 very often with my 4x5 cameras and don't recall any sharpness issues.

That is because the image (and the diffraction) requires less output magnification for a given output size. The f/16 on the 35mm image was magnified much more. The f/22 - f/32 required much less magnification so the actual diifraction patterns stayed small enough to not affect output sharpness too much.

Does this apply to stitching too? Or is it related to the actual size of the negative/sensor relative to the aperture size?

Let's say I stitch 3 images taken in portrait mode so that the resolution of the digital "negative" is increased to 7360x10000 pixels. In order to print it horizontally on a 110cm roll I would have to interpolate the image to about 200% (or reduce the resolution to 150ppi), whereas a regular D800 file (7360x4912px) would have to be interpolated by about 280-300%. Is this what is meant by magnification?

It appears to me that stitching (if that suits your shooting style) could be one way to avoid diffraction problems when you want ultra-deep DOF?

ErikKaffehr (Sr. Member, Posts: 6918)
« Reply #58 on: July 08, 2012, 05:43:37 AM »

Hi,

I don't think it works that way. If you stitch to cover the same field of view, the focal length would be longer, so you would need to stop down more. It is possible to achieve extended depth of field using "focus stacking".

The reason that diffraction was not obvious on 4x5" was mainly "that you were not looking", and perhaps also the use of Tri-X film. This article illustrates it very well:

http://www.photodo.com/topic_138.html

In short: 135 at its optimal aperture using TMAX-100 outperforms 4x5" at f/22 using Tri-X. The article is worth reading.

This article may also be of some interest: http://echophoto.dnsalias.net/ekr/index.php/photoarticles/29-handling-the-dof-trap

Best regards
Erik



BartvanderWolf (Sr. Member, Posts: 3012)
« Reply #59 on: July 08, 2012, 06:41:05 AM »

Does this apply to stitching too? Or is it related to the actual size of the negative/sensor relative to the aperture size?

Let's say I stitch 3 images taken in portrait mode so that the resolution of the digital "negative" is increased to 7360x10000 pixels. In order to print it horizontally on a 110cm roll I would have to interpolate the image to about 200% (or reduce the resolution to 150ppi), whereas a regular D800 file (7360x4912px) would have to be interpolated by about 280-300%. Is this what is meant by magnification?

Yes.

The optical system, limited in resolution by residual lens aberrations and diffraction, projects a still very high resolution image onto the sensor array. The sensel pitch sets another physical limit on how much of that can be resolved: the denser the sampling, the more of the projected optical image's resolution can be utilized. That results in an on-sensor resolution that can be expressed in physical units such as cycles/mm.

That resolution will be proportionally reduced when the data is going to be magnified to a larger output size than the sensor array. If we can limit the required output magnification, because we have a physically larger sensor array or because we stitch (the result of) a few smaller ones together, then the resolution that was originally captured will be better preserved.

Quote
It appears to me that stitching (if that suits your shooting style) could be one way to avoid diffraction problems when you want ultra-deep DOF?

It does help if we do not have to magnify the existing diffraction to the point that it becomes clearly visible as lost resolution. However, for a given field of view it won't help to just stitch some more images together, because that will only increase our FOV. While that helps to reach a certain output size with a lower output magnification, it may not give us the FOV we want; it may be too wide. To counteract that, one typically shoots with a longer focal length and a narrower FOV, but unfortunately also a shallower DOF. To compensate for that one could use a narrower aperture, but that defeats the purpose of reducing the visibility of diffraction blur.

For beating diffraction and achieving deep DOF at the same time, there are no free lunches. Diffraction is a physical boundary that can only be controlled by our choice of aperture, and the result can only be magnified until we have to accept visible resolution losses.

The only real solution for deep DOF with very high resolution, is focus stacking. That will allow to reduce the diffraction losses by using a wider aperture, and we can add DOF back as we increase the number of stacked focus brackets. Focus stacking does come with its own set of practical limitations though ...

Cheers,
Bart
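[Editor's note] Bart's point that on-print resolution scales inversely with output magnification can be sketched directly. The capture widths and the 60 cycles/mm on-sensor figure below are illustrative assumptions, and the stitched case presumes a longer focal length was used to keep the same field of view:

```python
def print_resolution(sensor_res_cyc_mm, capture_width_mm, print_width_mm):
    """Resolution remaining on the print: sensor resolution divided by magnification."""
    return sensor_res_cyc_mm * capture_width_mm / print_width_mm

# Single 36 mm wide frame vs a 3-frame portrait-orientation stitch giving
# roughly 72 mm of effective capture width for the same field of view,
# both resolving 60 cycles/mm on the sensor, printed 1100 mm wide:
print(round(print_resolution(60, 36, 1100), 2))   # single frame
print(round(print_resolution(60, 72, 1100), 2))   # stitch: twice the detail on paper
```

Doubling the effective capture width halves the required magnification and so doubles the detail delivered to the print, which is exactly the benefit (and the FOV/DOF trade-off) Bart describes.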