Pages: « 1 [2] 3 4 »
Author Topic: Moore's law for cameras  (Read 17838 times)
bradleygibson
« Reply #20 on: July 13, 2009, 11:40:44 PM »

Quote from: MichaelL
Mr. Maxwell made the statement, "We are very near the limit right now."

So how "near" is "very near"? That is, when using the current crop of pro-level lenses from the top camera makers on full-frame DSLRs, what is the "limit"? Thirty-something MP? Forty-something? Fifty-something? A ballpark number is fine, since an exact number would be useless and only theoretical, given manufacturing variations, etc.

And I'm speaking, of course, of current silicon, glass and noise-reduction technology, not vapor-ware like negative-refraction lenses and other exotic, not-for-sale technology (which I look forward to seeing, but my wallet doesn't)

So, any engineers or armchair engineers care to take a crack at it?

Assuming an ideal lens ("perfect", not retrofocus or telephoto design), green light (520nm) at:
* f/8 gives a limit of 5.08 microns
* f/11 gives a limit of 6.97 microns
* f/16 gives a limit of 10.2 microns.

(Note that blue light will give a smaller limit and red light will give a larger limit.  And since our lenses ain't perfect, expect real world sizes to be larger as well.)
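Those three figures follow directly from the Rayleigh criterion, d = 1.22 λN. A quick Python sanity check, using the same wavelength and apertures as above (small rounding differences aside):

```python
# Rayleigh-criterion separation for an ideal (aberration-free) lens:
#   d = 1.22 * wavelength * f_number
# Assumes green light at 520 nm, as in the post above.

WAVELENGTH_UM = 0.520  # 520 nm expressed in microns

def rayleigh_limit_um(f_number, wavelength_um=WAVELENGTH_UM):
    """Center-to-center separation (microns) of two just-resolved Airy disks."""
    return 1.22 * wavelength_um * f_number

for n in (8, 11, 16):
    print(f"f/{n}: {rayleigh_limit_um(n):.2f} microns")
# f/8: 5.08, f/11: 6.98, f/16: 10.15
```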

Typical photosite sizes on a 39-megapixel digital back are around 6.8 microns, and the Nikon D3X (24-megapixel FF 35mm) is at 5.94 microns.  So, put another way, we're effectively 'there' now below f/8, and further decreases in sensel size represent diminishing resolution returns.

I believe 5-micron sensel sizes might be a reasonable lower limit for high-end digital photography.  That doesn't mean things will stop there, though...

P.S.  There's a nice post on Airy disks/diffraction that's not too technical at http://www.cambridgeincolour.com/tutorials...photography.htm.
« Last Edit: July 13, 2009, 11:53:44 PM by bradleygibson »

Jonathan Cross
« Reply #21 on: July 14, 2009, 02:02:40 AM »

I agree with Michael up to a point.  We already have cameras with very high pixel densities; just try working out how many pixels a full-frame sensor would have at the density of those in a Canon G10.  The manufacturers of cameras like the G10 get round the diffraction problem by limiting the aperture.  I have a 5D MkII and have kept my previously bought 40D with its higher pixel density.  Why?  I will use the 40D for wildlife, as I can get 'closer' with a given lens; for wildlife I do not need a small aperture, rather a fast shutter speed.  My 5D MkII will be used for my real love, landscapes, where I need a small aperture and the shutter speed does not usually matter.  So why do I not have an MF digital camera for my landscapes?  Cost, of course!

Nemo
« Reply #22 on: July 14, 2009, 04:53:30 AM »

Quote from: bradleygibson
Assuming an ideal lens ("perfect", not retrofocus or telephoto design), green light (520nm) at:
* f/8 gives a limit of 5.08 microns
* f/11 gives a limit of 6.97 microns
* f/16 gives a limit of 10.2 microns.

That is true for resolving an isolated point...

Details are formed by overlapping Airy disks, and then you need more than one pixel per Airy disk. Bayer patterns imply you need even more pixels per Airy disk...

barryfitzgerald
« Reply #23 on: July 14, 2009, 06:56:33 AM »

Things could change a lot over the years. I would tend to agree with the article, in that there are some limits. Also, Moore's law has a habit of being applied outside its original area (it's open to debate whether it even holds true there); next we will hear that Moore's law applies to dental technology or car engines ;-)

Leaving that to one side, I don't see Bayer sensors sticking around in the long term. I think we might see a move to multi-layer colour sensors, which would help improve colour reproduction and reduce the pressure on sensor density. The game could change, and significantly. What we see right now tech-wise for sensors is going to look like a joke in 10 years' time!
bradleygibson
« Reply #24 on: July 14, 2009, 07:58:10 AM »

Quote from: Nemo
That is true for resolving an isolated point...

Details are formed by overlapping Airy disks, and then you need more than one pixel per Airy disk. Bayer patterns imply you need even more pixels per Airy disk...

Hi, Nemo,

Correct: to discuss resolution, one requires the ability to distinguish two distinct points.

The above calculations are the center-to-center distances of two Airy disks positioned such that the center of the first Airy disk occurs at the first minimum of the second (Rayleigh criterion for diffraction limits).  AFAIK, this is generally considered to be the limit of resolution.

-Brad
« Last Edit: July 14, 2009, 08:00:37 AM by bradleygibson »

samirkharusi
« Reply #25 on: July 14, 2009, 08:39:18 AM »

Quote from: bradleygibson
Assuming an ideal lens ("perfect", not retrofocus or telephoto design), green light (520nm) at:
* f/8 gives a limit of 5.08 microns
* f/11 gives a limit of 6.97 microns
* f/16 gives a limit of 10.2 microns.
Let's try to make a huge leap, like the one where 4x5 gave way to the 35mm format over the past 60 years. The way I see the future is that sensors will be much smaller (= lenses being much smaller, and easier to design at high quality). Hence the current best performance at f/8 (for 35mm-format lenses) will become best performance at f/4. The relevant diffraction limit becomes 2.5 microns, and you can still have 20 megapixels on a 4/3 sensor from which a 16x20" print shot at f/4 is indistinguishable from a similar print made from a 35mm-format sensor shot at f/8. See, there is plenty of room for further development.

Next, lenses get optimised for f/2.8, sensors halve again in size and still have a useful 20 megapixels. You are by then running into seriously performing, pocketable cameras, approaching the physics of your own eye. To me, it's sensor size that will soon become the next target. Olympus sees that, but perhaps they are a couple of years too early and thus still fighting an uphill battle. Canon and Nikon already have their crop cameras and are dabbling in making seriously good lenses in crop mounts (EF-S and similar). Once they can get their crop cameras into the 20 to 30 megapixel range, I expect they will be accompanied by leading-edge, premium-quality EF-S lenses.

The march can continue in a similar vein for at least a decade, without diffraction being the limiter. Recall that 16mm C-mount lenses have for many years been seriously fast (around f/1.0). If such lenses are, say, optimised at f/2.8, and accompanied by 16mm-format, 20 MP sensors, wow... Depth-of-field matters take care of themselves. Who uses f/64 currently on 35mm format? f/8 will one day become overkill on such tiny sensors.
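The scaling here is easy to check numerically. A short sketch, using nominal sensor dimensions (36 x 24 mm for full frame, roughly 17.3 x 13 mm for Four Thirds; both assumed for illustration) and the same 520 nm green light as earlier in the thread:

```python
# If lens quality and f-stop scale down with the sensor, the diffraction-
# limited pixel count stays roughly constant. Sensor dimensions are nominal.

def rayleigh_um(f_number, wavelength_um=0.520):
    """Rayleigh-limited pixel pitch (microns) at a given f-stop."""
    return 1.22 * wavelength_um * f_number

def megapixels_at_pitch(width_mm, height_mm, pitch_um):
    """Pixel count (MP) if the sensor is sampled at the given pitch."""
    return (width_mm * 1000 / pitch_um) * (height_mm * 1000 / pitch_um) / 1e6

# 35mm full frame at f/8 vs Four Thirds at f/4:
ff = megapixels_at_pitch(36.0, 24.0, rayleigh_um(8))  # ~34 MP
ft = megapixels_at_pitch(17.3, 13.0, rayleigh_um(4))  # ~35 MP
print(f"Full frame @ f/8: {ff:.0f} MP   Four Thirds @ f/4: {ft:.0f} MP")
```

Both formats support well over 20 MP at their respective optimum apertures, which is the point of the argument.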
Nemo
« Reply #26 on: July 14, 2009, 12:34:58 PM »

Quote from: bradleygibson
Hi, Nemo,

Correct, to discuss resolution, one requires the ability to distinguish two distinct points.

The above calculations are the center-to-center distances of two Airy disks positioned such that the center of the first Airy disk occurs at the first minimum of the second (Rayleigh criterion for diffraction limits).  AFAIK, this is generally considered to be the limit of resolution.

-Brad

Sensors cannot resolve points at the Rayleigh criterion, because the contrast is too low (9%). The Rayleigh criterion was established for separating stars in telescopes, not for separating points in digital photography.

For resolving an isolated point you need a pixel with a diagonal equal to the diameter of the disk, but for resolving line pairs formed by disks you need at least two pixels per line pair. How large do those pixels have to be? That depends on the separation of the disks... Separation means contrast: a minimum contrast is required by the sensor to resolve the detail. Therefore a minimum separation is needed as well, and therefore a particular pixel size is necessary for the maximum resolving power of the lens + sensor combination. And we are only considering monochrome sensors...

Things are a bit more complex than cambridgeincolour.com explains...
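The ~9% figure can be checked against the MTF of an ideal diffraction-limited lens; a short sketch (monochromatic light assumed):

```python
# MTF of an ideal diffraction-limited lens with a circular aperture:
#   MTF(v) = (2/pi) * (phi - cos(phi) * sin(phi)),  phi = acos(v / v_cutoff)
# where the cutoff frequency is v_cutoff = 1 / (wavelength * N).
# The Rayleigh spacing d = 1.22 * wavelength * N corresponds to the spatial
# frequency v = 1/d = v_cutoff / 1.22, so the contrast there is:

import math

def diffraction_mtf(v_over_cutoff):
    """Contrast transfer of an ideal lens at a normalized spatial frequency."""
    phi = math.acos(v_over_cutoff)
    return (2 / math.pi) * (phi - math.cos(phi) * math.sin(phi))

contrast = diffraction_mtf(1 / 1.22)
print(f"MTF at the Rayleigh frequency: {contrast:.1%}")  # ~8.9%
```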



dalethorn
« Reply #27 on: July 14, 2009, 01:59:38 PM »

The time scales are interesting from a user point of view.  DSLRs with much better image quality have gotten so large that a Leica S2 "MF" camera is now smaller than some of them.

Looking at best choices from the lower end: in April 2005 I could buy a Casio pocket camera with a 35-105 equiv. zoom and 10 MP on a 1/1.8" sensor, then 4 years later buy a Panasonic with a 25-300 equiv. zoom and 10 MP on a 1/2.33" sensor.

So what did 4 years of tech improvements bring?
Zoom, fantastic.  Opens up whole new worlds of opportunity.  For DSLRs, the smaller size and improved stability of the zoom lenses is good news.
Zoom motor, still poor.  Thankfully the zoom on my G1 is manual.  The GH1 is another matter.
Image stabilization, all good news.
Size of camera, slightly thicker, still pocket size.  DSLRs are now available in ever-smaller sizes.
Quality of image, same or slightly better, and no worse noise.  DSLRs with smaller sensors are improving a lot.
HD video, stereo wideband sound, all good.  HD video is moving into most DSLRs now.
Battery, about the same.  The bad news is, most of the smaller "DSLR" cameras like the Panasonic G1 and Olympus Pen have a short battery life.
Memory, all good news.
Transfer speed, much better.
Screens are much better now.
Flash, little or no better.  Not very practical to use external flash on a pocket camera.

DSLRs will continue to do some things that all-in-one cameras won't do anytime soon, like use special tilt lenses.  But most other features will float up or down more or less equally.  And even if sensor and per-pixel image quality don't improve a lot, the smaller and better-stabilized zoom lenses are bringing new worlds of opportunity to walkaround shooting.

Maybe it's time to say that DSLRs have become the medium format of the present, and the official MF designation has moved into some other territory.
bradleygibson
« Reply #28 on: July 14, 2009, 06:08:25 PM »

Quote from: Nemo
Sensors cannot resolve points at the Rayleigh criterion, because the contrast is too low (9%). The Rayleigh criterion was established for separating stars in telescopes, not for separating points in digital photography.

For resolving an isolated point you need a pixel with a diagonal equal to the diameter of the disk, but for resolving line pairs formed by disks you need at least two pixels per line pair. How large do those pixels have to be? That depends on the separation of the disks... Separation means contrast: a minimum contrast is required by the sensor to resolve the detail. Therefore a minimum separation is needed as well, and therefore a particular pixel size is necessary for the maximum resolving power of the lens + sensor combination. And we are only considering monochrome sensors...

Things are a bit more complex than cambridgeincolour.com explains...

I will disagree that sensors cannot respond to a 9% difference in contrast; and re: cambridgeincolour.com, things are always more complex than any one article explains, but IMHO the referenced article serves as a good starting point.

Otherwise, I generally agree with you.  I felt a full theoretical analysis (which I wouldn't be qualified to do anyway) would bring in too many variables and assumptions, and thus end up avoiding the question being asked ("how far away are we?").

The size of the Airy disk is what it is for the light, ideal lens and aperture selected.  As for resolving it, you are correct that how best to do it (2x oversampling, or 4x with a Bayer sensor, or otherwise; establishing a minimum acceptable contrast; what to do since no real lens is ideal; and so on) is more involved.
« Last Edit: July 14, 2009, 06:15:00 PM by bradleygibson »

ErikKaffehr
« Reply #29 on: July 14, 2009, 11:50:21 PM »

Hi,

Just one practical observation. A Swedish monthly, "Foto", does pretty solid lens tests at the Hasselblad factory using their MTF equipment, and they have seen that Olympus lenses tend to max out before f/8. They have also seen that the best aperture seems to be between f/5.6 and f/8 on resolution test targets. "Foto" seems to have difficulty finding good enough lenses for the newest cameras. My view would be:

- There is probably some good reason for higher pixel density, but it may not be to increase resolution.
- An example is that with high pixel density, diffraction may act as a low-pass (AA) filter and perhaps allow for more extensive sharpening.
- Smaller pixels may be noisier, but that may be overcome with binning and downsampling.
- Cost per square centimetre is going down; this may make larger chips economically more feasible. It is a slow process, but now we have affordable "full frame 135".
- Regarding in-camera processing capability, Moore's law applies fully. Processing 5 or more 24 MP images per second is impressive, especially considering how slow general-purpose computers are at raw conversion.
- Development may slow down; sensor technology may be near its optimum.
- I would bet on the Bayer matrix being around for a long time. Other technologies may seem advantageous, but the Bayer solution is quite flexible regarding filter choices.
- Carl Zeiss had some interesting info on MTF vs. resolution, and they certainly say that 24 MP technology is not lens limited. I'll try to dig up that article; it's public but not easily found.
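The binning point can be illustrated with a back-of-envelope noise model; the signal and read-noise numbers below are assumed for illustration, not measured sensor data:

```python
# 2x2 binning adds the signal of four small pixels, while their independent
# read noises add in quadrature, so the binned output recovers most of the
# SNR of a single pixel with four times the area.

import math

SIGNAL_PER_SMALL_PIXEL = 1000.0  # electrons (assumed)
READ_NOISE = 5.0                 # electrons RMS per read (assumed)

def snr(signal, n_reads):
    """SNR with photon shot noise plus n_reads independent read noises."""
    shot = math.sqrt(signal)
    read = READ_NOISE * math.sqrt(n_reads)
    return signal / math.sqrt(shot**2 + read**2)

binned = snr(4 * SIGNAL_PER_SMALL_PIXEL, n_reads=4)  # 2x2 binned small pixels
big = snr(4 * SIGNAL_PER_SMALL_PIXEL, n_reads=1)     # one 4x-area pixel
print(f"2x2 binned SNR: {binned:.1f}   big-pixel SNR: {big:.1f}")
```

At this (assumed) exposure the binned result is within a few percent of the big pixel, since shot noise dominates; the gap widens in deep shadow, where read noise matters more.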

Here is the Zeiss Article:
http://www.zeiss.co.uk/C12567A8003B8B6F/Em...F_Kurven_EN.pdf
http://www.zeiss.co.uk/C12567A8003B8B6F/Em...Kurven_2_en.pdf (resolution limit is discussed on page 22)

The second article refers to separately downloadable images. I found them here:


http://www.zeiss.de/C12567A8003B8B6F/Graph...le/Image_01.jpg
http://www.zeiss.de/C12567A8003B8B6F/Graph...le/Image_02.jpg
...
http://www.zeiss.de/C12567A8003B8B6F/Graph...ile/Bild_10.jpg
This image compares 24 MPixel full frame (1) with scanned slides from 9x12 (2), 6x7 (3) and 24x36 (4) (using 100 ISO slide film and 4000 PPI)

http://www.zeiss.de/C12567A8003B8B6F/Graph...ile/Bild_13.jpg
This image compares the "original" with 24 and 12 MPixel versions



Best regards
Erik

Quote from: bradleygibson
I will disagree that sensors cannot respond to a 9% difference in contrast; and re: cambridgeincolour.com, things are always more complex than any one article explains, but IMHO the referenced article serves as a good starting point.

Otherwise, I generally agree with you.  I felt a full theoretical analysis (which I wouldn't be qualified to do anyway) would bring in too many variables and assumptions, and thus end up avoiding the question being asked ("how far away are we?").

The size of the Airy disk is what it is for the light, ideal lens and aperture selected.  As for resolving it, you are correct that how best to do it (2x oversampling, or 4x with a Bayer sensor, or otherwise; establishing a minimum acceptable contrast; what to do since no real lens is ideal; and so on) is more involved.
« Last Edit: July 15, 2009, 06:43:50 AM by ErikKaffehr »

Nemo
« Reply #30 on: July 15, 2009, 05:29:08 AM »

I think Ray Maxwell is right, but a calculation based on pixel size versus Airy disk size isn't a good reference. It depends on the Bayer pattern (or the lack of one), on the maximum detail to be resolved and its nature (linear detail, sinusoidal detail, high-contrast detail...?), on the optical quality of the lens (at wide apertures), etc.

I believe we will see 35mm-format cameras with 30 MP or so...

charleski
« Reply #31 on: July 15, 2009, 08:31:10 AM »

Quote from: ErikKaffehr
Finally, I can see a benefit of increasing pixel densities further. One reason is that we can eliminate the need for a low-pass (anti-aliasing) filter if we make the Airy disk about the same size as a pixel. I'd also guess that it's better to have more pixels than to "uprez" using interpolation.

Best regards
Erik
I was about to post almost the exact same comment. The current barrier to sensor resolution is the quality of the anti-aliasing filter. In most cases* that's a discrete element in front of the sensor. While there's a lot of research going into finding good alternatives to the standard birefringent filter used on most cameras, it's certainly true that one solution is simply to oversample the data.

The current crop of cameras actually lie in an uneasy middle-ground, where the anti-aliasing filter is required for shooting at wide apertures, but unnecessary once stopped down past a diffraction limit which is well within the range of commonly-used apertures. If the sensor resolution is increased such that the camera/lens combination is diffraction-limited at all apertures, then the anti-aliasing filter with its multiplicative MTF reduction can be dispensed with completely which may yield a noticeable benefit.
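One way to put a number on "diffraction-limited at all apertures": an ideal lens passes no energy above its cutoff frequency 1/(λN), so aliasing becomes impossible once that cutoff drops to the sensor's Nyquist frequency 1/(2p). That is the strict criterion; in practice the AA filter becomes redundant somewhat earlier, since contrast near the cutoff is negligible. The pixel pitches below are illustrative assumptions:

```python
# Strict no-aliasing condition: diffraction cutoff 1/(wavelength*N) at or
# below the sensor Nyquist frequency 1/(2*pitch), i.e.
#   N >= 2 * pitch / wavelength
# Assumes an ideal lens and green light at 520 nm.

def min_f_number_for_no_aliasing(pitch_um, wavelength_um=0.520):
    """Smallest f-number at which the ideal-lens MTF is zero above Nyquist."""
    return 2 * pitch_um / wavelength_um

for pitch in (6.4, 4.7, 2.0):
    n = min_f_number_for_no_aliasing(pitch)
    print(f"{pitch} um pitch: no aliasing possible from ~f/{n:.0f} onward")
```

With a 6.4 um pitch the strict threshold sits near f/25, which is why current cameras occupy the "uneasy middle-ground" described above; only at around a 2 um pitch does it fall inside the commonly used aperture range.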



*Yes, I know there are some MFD manufacturers who don't use an anti-aliasing filter and claim they can deal with artifacts in software. And yes, images from their systems do look extremely sharp when there are no visual cues to reveal how much of that detail is merely false, aliased data. (Moire is clearly visible on high-frequency parallel lines because we know what the image should look like; in the absence of such a clear visual clue you can often get by with allowing a lot of high-frequency aliasing garbage into the image, but it's really just HF noise.) Of course, once aliasing noise has been introduced into a set of data samples there is no way of removing it without also removing part of the signal; proper anti-aliasing must take place in the analogue domain.
Ray
« Reply #32 on: July 15, 2009, 09:03:47 AM »

I've compared the 10mp Canon 40D with the 15mp Canon 50D using the same Canon 50/1.4 prime on both bodies. The pixel density of the 50D on full frame 35mm would be 39mp.

The current 5D2 has the pixel density of the 8mp 20D. My comparisons between the 40D and 50D lead me to believe there could be a worthwhile benefit in a 39mp FF 35mm DSLR, not necessarily or not only in resolution at the plane of focus, but in depth of field.
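The 39mp figure is just the 50D's pixel density scaled up to the full 36 x 24 mm frame (APS-C dimensions of roughly 22.3 x 14.9 mm assumed here):

```python
# Scale a crop sensor's megapixel count to full frame by the ratio of
# sensor areas. Crop dimensions below are the approximate Canon APS-C size.

def scaled_megapixels(mp, crop_w_mm, crop_h_mm, ff_w_mm=36.0, ff_h_mm=24.0):
    """Megapixels a full-frame sensor would have at the crop sensor's density."""
    return mp * (ff_w_mm * ff_h_mm) / (crop_w_mm * crop_h_mm)

# Canon 50D: ~15.1 MP on ~22.3 x 14.9 mm
print(f"{scaled_megapixels(15.1, 22.3, 14.9):.0f} MP")  # ~39 MP
```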

For example, the 50D at F16 produces about the same resolution as the 40D at F11, at the plane of focus. Using both cameras at F11 results in a marginal resolution edge to the 50D, but not as great as the DoF edge of the 50D at F16 (compared with the 40D at F11).

Comparing the 50D at F11 with the 40D at F5.6 produces a more dramatic DoF benefit. My 50/1.4 is slightly sharper at F5.6 than at F8, but not by much. At the plane of focus, the resolution of the 50D at F11 is about equal to the 40D at F5.6, at least with my copy of the 50/1.4. However, the DoF of the 50D at F11 is substantially greater than the DoF of the 40D at F5.6.

I should mention that such differences have been examined on a monitor at 100% and 200%, representative of very large prints.

Nemo
« Reply #33 on: July 15, 2009, 12:23:56 PM »

Ctein about diffraction:

http://theonlinephotographer.typepad.com/t...arithmetic.html

Very interesting.

Ronny Nilsen
« Reply #34 on: July 15, 2009, 03:10:38 PM »

Ctein:
Why 80 Megapixels Just Won't Be Enough...

Do I hear a 400 megapixel FF DSLR before the race is over?

Ronny
Ray
« Reply #35 on: July 15, 2009, 11:15:29 PM »

Quote from: Nemo
Ctein about diffraction:

http://theonlinephotographer.typepad.com/t...arithmetic.html

Very interesting.

Yes. It is interesting, and I tend to agree with Ctein that one can't apply simple mathematical formulas to describe reality. Calculations of Airy disc size equated to pixel size don't tell the whole story.

One concern I have, is that increasing pixel count on the same size sensor tends to increase total read noise, because more pixels have to be read. Without compensating improvements in other areas, such as increased quantum efficiency of the individual pixels, or new ways of arranging things, such as having all the processing transistors on the reverse side of the CMOS sensor, or dumping the Bayer-type array in favour of one which doesn't filter out any of the light, then the disadvantages of increasing pixel count may cancel out the benefits.

Imagine a Foveon type sensor made of meta-materials using nanotechnology such that the layers on the sensor that are sensitive to individual frequency bands (R,G&B) are completely transparent to the other frequencies that are not collected, imposing no loss of efficiency as the photons pass through to be collected on the layer(s) underneath.

I believe the Bayer-type arrangement filters out about half of the light that passes through the lens. That's a whole stop of sensitivity that's been wasted. Current Foveon sensors use materials that allow certain frequency bands to pass through to another layer of silicon, but with nowhere near 100% efficiency. There's considerable absorption by the silicon which results in noise.
charleski
« Reply #36 on: July 16, 2009, 02:03:03 AM »

Quote from: Ray
Yes. It is interesting, and I tend to agree with Ctein that one can't apply simple mathematical formulas to describe reality.
Well of course you can, in that force really does equal mass times acceleration (for example). I think Ctein is railing against the inappropriate use of Occam's razor ('Simplicity=Truth') in practical applications.
Ray
« Reply #37 on: July 16, 2009, 05:15:26 AM »

Quote from: charleski
Well of course you can, in that force really does equal mass times acceleration (for example). I think Ctein is railing against the inappropriate use of Occam's razor ('Simplicity=Truth') in practical applications.
 

Hhmmm! I'm whizzing around the sun at approximately 108,000 kms/hour. I weigh about 85kgs. I guess I must be very forceful    .
Nemo
« Reply #38 on: July 16, 2009, 07:08:07 AM »

Quote from: Ray
Yes. It is interesting, and I tend to agree with Ctein that one can't apply simple mathematical formulas to describe reality. Calculations of Airy disc size equated to pixel size don't tell the whole story.

One concern I have, is that increasing pixel count on the same size sensor tends to increase total read noise, because more pixels have to be read. Without compensating improvements in other areas, such as increased quantum efficiency of the individual pixels, or new ways of arranging things, such as having all the processing transistors on the reverse side of the CMOS sensor, or dumping the Bayer-type array in favour of one which doesn't filter out any of the light, then the disadvantages of increasing pixel count may cancel out the benefits.

Imagine a Foveon type sensor made of meta-materials using nanotechnology such that the layers on the sensor that are sensitive to individual frequency bands (R,G&B) are completely transparent to the other frequencies that are not collected, imposing no loss of efficiency as the photons pass through to be collected on the layer(s) underneath.

I believe the Bayer-type arrangement filters out about half of the light that passes through the lens. That's a whole stop of sensitivity that's been wasted. Current Foveon sensors use materials that allow certain frequency bands to pass through to another layer of silicon, but with nowhere near 100% efficiency. There's considerable absorption by the silicon which results in noise.


There will be a point beyond which more pixels bring more problems than advantages. Then new ways of improving the image will become interesting. Foveon-type sensors aren't really competitive right now. Remember when the easiest route to microprocessor improvement was increasing clock speed (MHz). At this stage of the technology, the best way to improve the image is to increase the number of pixels, combined with improvements in sensor architecture. Back-illuminated CMOS sensors with sophisticated electronics will be a large step forward. We will see 35mm sensors with 30 or more MP, soon. You will be able to use the full resolution potential or, by means of pixel binning, get more "quality" per pixel (dynamic range, noise). RAW images based on multiple-exposure shots will be the norm very soon as well...

Current Foveon sensors need quite large pixels. They would be great competing against Bayer sensors with the same number of photodetectors, but Foveons cannot do this, so it is easier and cheaper to get the same result with a Bayer architecture and more pixels... I think this will change. I don't know when, but it will happen. Then we will have another huge step forward...

The true bottleneck seems to be printing technology... but is photography based on prints any more?
« Last Edit: July 17, 2009, 05:50:37 AM by Nemo »
Ray
« Reply #39 on: July 16, 2009, 07:13:58 PM »

Quote from: Nemo
Back-illuminated CMOS sensors with sophisticated electronics will be a large step forward. We will see 35mm sensors with 30 or more MP, soon.

I think a step up from the 21mp of the 5D2 to just 30mp would be too little; 40mp would be better. If such a sensor were back-illuminated to enable the use of larger photodiodes, had no AA filter (which would also reduce costs as well as improve resolution), and had a few panchromatic pixels to further improve low-noise performance, I might not be able to resist buying such a camera, if the price were right.
 
Whatever happened to that Kodak invention where half the pixels of the Bayer-type array were replaced with panchromatic pixels?
