Author Topic: Moore's law for cameras  (Read 18145 times)
imagico
« on: July 13, 2009, 01:41:14 AM »

Reading Ray Maxwell's essay, I have to say that while I agree with the central idea that sensor pixel counts will most likely not scale according to Moore's law in the future, I do not fully agree with his reasoning:

The end of Moore's law has been postulated many times for semiconductors, for very similar reasons to those Ray gives for camera sensors: the limitations of physics and the lack of actual need. Nonetheless, structure sizes in chip production continue to shrink, since engineers keep finding ways to work around these limits and economic pressure continues to favor faster and cheaper components.

The bottom line: I think there is no reason why Moore's law should not apply to digital camera technology in general, possibly on a somewhat different time scale (this differs among computer components as well - hard disks scale somewhat differently than RAM or processors). But of course the focus will shift from some aspects (like pixel count) to others (like sensitivity, dynamic range, or metering). The harder problem will be measuring the validity of Moore's law, since the performance of a digital camera is not as straightforward to measure as that of a computer.

The fact that viewers of photos taken with various cameras can hardly tell the difference between models - and possibly even less so in the future - actually brings cameras closer to computers: the computer used to produce a digital product is usually not visible in the product itself in any way.

Christoph Hormann
photolog / artificial images / other stuff
feppe
« Reply #1 on: July 13, 2009, 02:22:42 AM »

The hurdles computer chip designers ran into stem from the materials used, so they can overcome them (to a certain extent - I'm not a chip engineer) by changing the materials or their composition. The ultimate hurdle sensor and lens designers have to overcome belongs not to the lens or the sensor but to light itself: a fundamental physical limitation rather than an engineering problem.

Then again, there are already prototypes of negative-refraction lenses, even at visible wavelengths, which might change the ballgame just as changing the composition of chips did.

But I agree with the author that there are many more interesting things being prototyped and envisioned than a hundred-megapixel camera, for example variable depth of field that can be changed in post. Also, while a hundred megapixels is certainly overkill for 99.99999% of applications, it allows for massive cropping and/or zooming into the image (think Zoomify).

Rob C
« Reply #2 on: July 13, 2009, 02:43:47 AM »

Quote from: feppe
Also, while a hundred megapixels is certainly overkill for 99.99999% of applications, it allows for massive cropping and/or zooming into the image (think Zoomify).

But isn't that the point? Zooming in becomes useless if all you do is zoom into mush.

Rob C

feppe
« Reply #3 on: July 13, 2009, 03:33:41 AM »

Quote from: Rob C
But isn't that the point? Zooming in becomes useless if all you do is zoom into mush.

Indeed it is - but my comment was made in the hope that negative-refraction lenses become reality in consumer cameras.

I'm not an optical engineer (or any other kind), so I have no idea how feasible that is. But human ingenuity never ceases to amaze me, and I wouldn't be surprised if we came up with ways around pesky physical limitations.

One of my favorite examples is how telescopes get around atmospheric distortion: make a mirror that deforms on cue tens or hundreds of times a second, and use the changes in the position of a known reference star as the basis for calculating how to deform the mirror, thus correcting the distortion. What was thought physically impossible just decades ago is now reality.
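
The simplest piece of that trick - tip-tilt correction - is easy to sketch in code: measure how far the known reference star has drifted, then shift the frame back by that amount. This is only a toy Python illustration (all names and numbers are invented; a real deformable-mirror controller corrects higher-order wavefront errors, not just whole-image shift):

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (row, col) of a 2-D image."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

def tip_tilt_correct(img, ref_pos):
    """Shift the frame so the bright reference's centroid returns to
    ref_pos, using whole-pixel shifts (wrap-around at the edges)."""
    cy, cx = centroid(img)
    dy = int(round(ref_pos[0] - cy))
    dx = int(round(ref_pos[1] - cx))
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

# A "star" that atmospheric jitter has pushed from (16, 16) to (19, 13):
frame = np.zeros((32, 32))
frame[19, 13] = 1.0
corrected = tip_tilt_correct(frame, ref_pos=(16, 16))
```

Run fast enough, that loop is exactly the "deform on cue many times a second" idea, just in its crudest form.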

Nemo
« Reply #4 on: July 13, 2009, 06:02:19 AM »

Ray Maxwell's main argument is correct.

However, it refers to the resolution of an isolated point with a monochromatic sensor.

Image details are formed by overlapping Airy disks. You need at least four pixels to fully resolve the light intensity variation across an Airy disk (a peak and two valleys for an isolated disk), and even more than four if you have a Bayer pattern...

http://luminous-landscape.com/tutorials/resolution.shtml

Better lenses mean smaller Airy disks at wide apertures, so there is room for improvement - optical improvement. It is quite expensive though, and it translates into big lenses.

Overall, Maxwell made a good point. Additional increases in resolution have a diminishing marginal impact on the final detail resolved...
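
The arithmetic behind that four-pixel criterion is easy to run. A back-of-the-envelope Python sketch (green light at 550 nm assumed; `pixels_across_disk=4` is the "peak and two valleys" criterion above, not a universal constant):

```python
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the first Airy minimum: d = 2.44 * lambda * N."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

def pixel_pitch_for_sampling_um(f_number, pixels_across_disk=4):
    """Pixel pitch that puts the given number of pixels across one
    Airy disk (the 'peak and two valleys' criterion)."""
    return airy_disk_diameter_um(f_number) / pixels_across_disk

# At f/8 in green light the Airy disk is about 10.7 um across, so the
# four-pixel criterion asks for a pitch of roughly 2.7 um.
d_f8 = airy_disk_diameter_um(8)
pitch_f8 = pixel_pitch_for_sampling_um(8)
```

So at moderate apertures the "needed" pitch is already close to what compact cameras ship with, which is the essay's point in miniature.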
« Last Edit: July 13, 2009, 07:36:10 AM by Nemo »
pegelli
« Reply #5 on: July 13, 2009, 07:30:51 AM »

I think his essay nicely shows why it's not needed with current optical technology.

However, only time will tell whether that is sufficient reason for it not to happen.

pieter, aka pegelli
Tim Gray
« Reply #6 on: July 13, 2009, 07:55:19 AM »

With respect to a sensor limited to the traditional 35mm frame, of course Ray is correct - the laws of physics being what they are. However, there's nothing written in stone that says a sensor needs to be limited to 35mm (as is the case for the expensive MF cameras). I don't see why the operation of Moore's law within physical limits (or a looser interpretation of his "law") would preclude inexpensive larger sensors. In any event, regardless of price, the form factor probably precludes significant demand for, say, an 8x10" sensor with the same pixel density as a P65. So he's probably right in a practical sense as well.

BTW, it's "The Innovator's Dilemma", not "Inventor's Dilemma".
bradleygibson
« Reply #7 on: July 13, 2009, 08:26:02 AM »

I won't debate the physics of Ray's article, because I believe he is essentially correct.

The thing about cameras, though, is that people don't buy them based on how efficiently they use their Airy disks - people buy them to meet an emotional need. Marketers understand this, and that is why you can find 8 MP cellphone cameras and similarly ridiculously small pixels in point-and-shoot cameras.

I see no reason why the megapixel race will not continue for the foreseeable future:
  * As hardware gets faster, there is no apparent penalty in performance.
  * As compression gets better, there is no apparent penalty in file sizes.
  * As hardware gets cheaper, there is no apparent penalty in cost.
  * As competitors eke out a higher megapixel count, and other companies dutifully follow so as not to be at a marketing disadvantage, engineers exploit the Bayer mosaic to go below the limit implied by the Airy disk (discussed earlier in this thread).

It is that last point which I feel will be the prime motivator for pushing the number of pixels beyond the Airy disk limit.

-Brad

bjanes
« Reply #8 on: July 13, 2009, 10:39:30 AM »

Quote from: imagico
Reading Ray Maxwell's essay, I have to say that while I agree with the central idea that sensor pixel counts will most likely not scale according to Moore's law in the future, I do not fully agree with his reasoning:

The end of Moore's law has been postulated many times for semiconductors, for very similar reasons to those Ray gives for camera sensors: the limitations of physics and the lack of actual need. Nonetheless, structure sizes in chip production continue to shrink, since engineers keep finding ways to work around these limits and economic pressure continues to favor faster and cheaper components.

The bottom line: I think there is no reason why Moore's law should not apply to digital camera technology in general, possibly on a somewhat different time scale (this differs among computer components as well - hard disks scale somewhat differently than RAM or processors). But of course the focus will shift from some aspects (like pixel count) to others (like sensitivity, dynamic range, or metering). The harder problem will be measuring the validity of Moore's law, since the performance of a digital camera is not as straightforward to measure as that of a computer.

The fact that viewers of photos taken with various cameras can hardly tell the difference between models - and possibly even less so in the future - actually brings cameras closer to computers: the computer used to produce a digital product is usually not visible in the product itself in any way.

This article from Stanford University Electrical Engineering, "Moore meets Planck and Sommerfeld", discusses some of the limitations of applying Moore's law to sensors. One can scale the electronic part of the chip, but since imagers must interact with light, the implications of Moore's law are different for sensors than for micro-electronic components. Planck (photon noise) and Sommerfeld (diffraction) limit the usefulness of scaling. Read the article for details. I think that Ray's argument is sound.

Bill
Michael LS
« Reply #9 on: July 13, 2009, 11:37:06 AM »

Mr. Maxwell made the statement, "We are very near the limit right now."

So how "near" is "very near"? That is, when using the current crop of pro-level lenses from the top camera makers on full-frame DSLRs, what is the "limit"? Thirty-something megapixels? Forty-something? Fifty-something? A ballpark number is fine, since an exact number would be useless and only theoretical, given manufacturing variations, etc.

And I'm speaking, of course, of current silicon, glass and noise-reduction technology - not vaporware like negative-refraction lenses and other exotic, not-for-sale technology (which I look forward to seeing, but my wallet doesn't).

So, any engineers, or armchair engineers, care to take a crack at it?
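
An armchair-engineer's ballpark is straightforward: treat the Airy disk at 550 nm as the smallest resolvable spot, put two pixels across each spot (Nyquist), and count how many fit on a 36x24 mm frame. This toy Python sketch ignores the Bayer pattern, lens aberrations and noise, so treat its output as order-of-magnitude only:

```python
FRAME_W_MM, FRAME_H_MM = 36.0, 24.0   # "full frame" sensor

def diffraction_limited_mp(f_number, wavelength_nm=550, pixels_per_spot=2):
    """Crude megapixel ceiling for a full-frame sensor: the Airy disk
    (2.44 * lambda * N) is the smallest spot, sampled at Nyquist."""
    spot_um = 2.44 * (wavelength_nm / 1000.0) * f_number
    pitch_mm = (spot_um / pixels_per_spot) / 1000.0
    return (FRAME_W_MM / pitch_mm) * (FRAME_H_MM / pitch_mm) / 1e6

# The ceiling falls quickly as the lens is stopped down:
for n in (4, 5.6, 8, 11):
    print(f"f/{n}: ~{diffraction_limited_mp(n):.0f} MP")
```

Under these assumptions the ceiling at f/8 lands around thirty megapixels, which is at least consistent with "we are very near the limit" for stopped-down full-frame work; open the lens up and the number climbs fast.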

imagico
« Reply #10 on: July 13, 2009, 11:46:21 AM »

Quote from: bjanes
This article from Stanford University Electrical Engineering, "Moore meets Planck and Sommerfeld", discusses some of the limitations of applying Moore's law to sensors. One can scale the electronic part of the chip, but since imagers must interact with light, the implications of Moore's law are different for sensors than for micro-electronic components. Planck (photon noise) and Sommerfeld (diffraction) limit the usefulness of scaling. Read the article for details. I think that Ray's argument is sound.

Please note I am not arguing against the existence of physical limits that make it less and less efficient to further decrease sensor pixel sizes - both from the point of view of optics and of quantum mechanics. What I argue is that this does not mean Moore's law fails to apply to digital photography technology in general, as the essay says.

There is a well-known example from computer history that comes quite close to the current megapixel problem: the megahertz race of processors from some years ago (but please don't overstress this analogy). For quite some time, the increase in computer performance was to a large extent accomplished by increasing CPU clock speeds. When this inflationary increase in clock speeds came to an end with Intel giving up the Pentium 4 design, it was to an important degree due to a hard physical limit they ran into: ever-increasing power dissipation with rising clock speeds, leading to thermal power densities that could no longer be handled efficiently (other aspects played a role as well, of course). But this did not stop the overall exponential scaling of performance according to Moore's law; technological development simply switched from gaining performance through clock speed to other areas (for example, multi-core designs).

Please also keep in mind that Moore's law does not apply to every single property of computers either - there are well-known aspects of computer technology that have not scaled exponentially in recent times at all, such as memory access speeds, leading to significant changes in performance relations inside computer systems. This does not mean, though, that Moore's law is not a good approximation of the scaling of computer technology as a whole.


Christoph Hormann
photolog / artificial images / other stuff
Alan Goldhammer
« Reply #11 on: July 13, 2009, 12:02:16 PM »

Wow, I guess I'm going to have to pull my old physics textbooks out and dust them off to get back up to speed on these issues! The trouble is that we need to look at the camera as a whole in terms of capturing an image, and then at everything we use post-capture to process the information into a final print. I don't think we can predict with any certainty what changes in any one aspect of the process will do to the final outcome (a pleasing print): sensor size, lens design, new computer algorithms, and the Adobe engineers who give us the software.

The one key thing that Maxwell notes at the end of the essay is: can anyone really tell the difference? At some level the answer is yes, if we are looking at extremely large magnifications, but is that the real world (maybe for spy satellites it is)? For most of us who don't print enormous panoramas, and who try to get everything of interest into a single image and then print the full frame at a modest enlargement, the answer is no. I can take a nice image with my D300 (tripod mounted) and print it on 13x19 paper, and I doubt the quality would be markedly different had I used a full-frame DSLR or a medium-format camera with a corresponding back. I do know with certainty that the latter equipment costs a lot more money, and my cost/benefit calculation will likely show that the extra money spent on a new camera is not worth it at this point (prices may come down in the future). Do I need a Hasselblad or Phase One to do what I'm doing? Not really.

Interesting topic nonetheless and the articles are provocative.

ErikKaffehr
« Reply #12 on: July 13, 2009, 12:24:03 PM »

Hi,

Moore's law, as it is mostly known, is about shrinking component size. Component sizes nowadays are well below 100 nm, while it seems obvious that sensor pixels much smaller than 5 microns don't make much sense. To me it seems that pixel sizes don't really depend on manufacturing technology but on other factors, like diffraction and well capacity. For that reason I cannot see that an exponential increase in sensor resolution would make any sense.

It is quite obvious that photographic resolution is limited by diffraction. The question is at which pixel size we get diminishing returns. The only way of reducing diffraction is to increase the optimum aperture of photographic lenses, but there is a problem with that approach, namely that depth of field will be very small. So we could have a very high-performing lens, say one that is diffraction limited at f/2.8, but such a lens could only achieve its maximum resolution essentially in a single plane. Add to this the need to have the sensor in exact alignment with the lens - in no way an easy feat.

Moore's law is not just about shrinking component size but also about increasing die size. This is absolutely relevant for photography: it means that big sensors are going to become more affordable. This can already be seen with the Canon 5D (II) and the Sony Alpha 900. The same trend may also affect larger sensor sizes.

Finally, I can see a benefit in increasing pixel densities further. One reason is that we can eliminate the need for a low-pass (anti-aliasing) filter if we make the Airy disk about the same size as a pixel. I'd also guess that it's better to have more pixels than to "uprez" using interpolation.

Best regards
Erik

Quote from: Tim Gray
With respect to a sensor limited to the traditional 35mm frame, of course Ray is correct - the laws of physics being what they are. However, there's nothing written in stone that says a sensor needs to be limited to 35mm (as is the case for the expensive MF cameras). I don't see why the operation of Moore's law within physical limits (or a looser interpretation of his "law") would preclude inexpensive larger sensors. In any event, regardless of price, the form factor probably precludes significant demand for, say, an 8x10" sensor with the same pixel density as a P65. So he's probably right in a practical sense as well.

BTW, it's "The Innovator's Dilemma", not "Inventor's Dilemma".

ErikKaffehr
« Reply #13 on: July 13, 2009, 12:33:47 PM »

Hi Tim,

It would probably be possible to increase sensor size to 6x6 or even 6x8 cm. Today's large-format sensors are stitched together from multiple exposures, perhaps because today's steppers cannot expose a full-frame sensor in a single pass. Canon is supposed to have a stepper with that capability, but it is quite obvious that the sensors in the Nikon D3X and the Alpha 900 are stitched, and so are Dalsa's MF sensors.

The problem is that you also need an ecosystem: lenses that are good enough, cameras aligned within a few microns, and customers willing to pay premium dollars. There are probably some three- or four-letter organizations with that kind of needs and assets.

Best regards
Erik


Quote from: Tim Gray
With respect to a sensor limited to the traditional 35mm frame, of course Ray is correct - the laws of physics being what they are. However, there's nothing written in stone that says a sensor needs to be limited to 35mm (as is the case for the expensive MF cameras). I don't see why the operation of Moore's law within physical limits (or a looser interpretation of his "law") would preclude inexpensive larger sensors. In any event, regardless of price, the form factor probably precludes significant demand for, say, an 8x10" sensor with the same pixel density as a P65. So he's probably right in a practical sense as well.

BTW, it's "The Innovator's Dilemma", not "Inventor's Dilemma".

DaveCurtis
« Reply #14 on: July 13, 2009, 04:22:01 PM »

I gather that a "superlens" has been created with a negative refractive index, thus overcoming the so-called diffraction limit. Interesting stuff. Not sure if it's relevant to camera lenses though.

There seem to be several references to superlens research on the net:

"A new superlens that could make it possible to film molecules in action in real time with visible light has been developed by HP Labs researchers.

The lens takes advantage of subwavelength details in evanescent components of light, which can propagate in a material with a negative refractive index. To achieve a record-breaking resolution of 1/12th of the wavelength of light, the researchers grew smooth silver film just a few tens of nanometers thick on a layer of germanium, forcing the silver to form a smooth thin film."


pedro.silva
« Reply #15 on: July 13, 2009, 05:47:30 PM »

Greetings!

Allow me to leave photon noise aside for now and concentrate on the damning diffraction. I was under the impression that small pixels would not necessarily pose an insurmountable diffraction problem: with the information recorded by small enough pixels, one could process that information, by deconvolution or whatever, and actually get higher resolution than with bigger pixels. And it would seem that diffraction shouldn't be too hard to model. Of course, that would help escalate our computer expenses even more...

Am I far off?

Oh, in case it's not obvious... I'm no engineer!

Cheers,
pedro
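
The intuition is sound in principle: if the blur is a known convolution, it can be partially inverted. A minimal Wiener-deconvolution sketch with NumPy - note the heavy assumptions: a Gaussian stands in for the real diffraction PSF, the regularization constant `k` is picked by hand, and sensor noise (the thing that makes this hard in practice) is absent:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered Gaussian point-spread function, normalized to sum to 1."""
    ys, xs = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Frequency-domain Wiener filter: F_est = G * conj(H) / (|H|^2 + k)."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k)))

# Blur a single bright point with the stand-in "diffraction" PSF, then restore:
sharp = np.zeros((64, 64))
sharp[32, 32] = 1.0
psf = gaussian_psf(sharp.shape, sigma=2.0)
otf = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * otf))
restored = wiener_deconvolve(blurred, psf)
```

The restored point is much more concentrated than the blurred one, but never perfectly recovered: frequencies the optics crushed below the noise floor are regularized away rather than resurrected, which is why deconvolution stretches the diffraction limit instead of abolishing it.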
BernardLanguillier
« Reply #16 on: July 13, 2009, 07:24:34 PM »

Quote from: bradleygibson
The thing about cameras, though, is that people don't buy them based on how efficiently they use their Airy disks - people buy them to meet an emotional need. Marketers understand this, and that is why you can find 8 MP cellphone cameras and similarly ridiculously small pixels in point-and-shoot cameras.

I see no reason why the megapixel race will not continue for the foreseeable future:
  * As competitors eke out a higher megapixel count, and other companies dutifully follow so as not to be at a marketing disadvantage, engineers exploit the Bayer mosaic to go below the limit implied by the Airy disk (discussed earlier in this thread).

It is that last point which I feel will be the prime motivator for pushing the number of pixels beyond the Airy disk limit.

I guess that different companies have different approaches to this. Nikon seems clearly less interested in going for more megapixels; the D3 was a brave move in that it showed at least one camera company dares to do what most knowledgeable photographers had been requesting: stop the megapixel race and go for more dynamic range and lower noise.

I believe that consumers are not stupid and are now willing to listen to salespeople telling them that better pixels are more important than more pixels. So I am not sure that - for DSLRs at least - the race forward will keep focusing on more pixels.

The only missing piece is a good metric of pixel quality, a number that people can relate to as easily as pixel count. My proposal would be to call it... "pixel quality"... and to compute it in a standardized way a la DxO. This is, by the way, what DxO is shooting for with DxOMark: they are trying to have their name associated with the measure of pixel quality. They have foreseen a world in which "DxOMark" is written on camera tags in stores next to "resolution", "weight" and "price". One of the smartest moves in the camera industry in years, IMHO.

Therefore, I believe the next thing is more features, starting with video, lenses... and better pixels.

Cheers,
Bernard

A few images online here!
AndyF
« Reply #17 on: July 13, 2009, 08:02:09 PM »

It may not be that Moore's law doesn't apply to digital photography, but rather a question of how and where it will manifest itself. For example, I can see a strong argument for an 88 MP sensor that would normally be diffraction limited at 22 MP. Why? To take advantage of the diffraction disk!

Assuming (and requiring...) that the essential advance is that those "88 MP" pixels have the same sensitivity and noise performance as today's 22 MP pixels, you could then fit an entire Bayer RGB cluster of pixels into one Airy disk. The end result would be 22 million sensor sites, where each site is an RGB Bayer cluster of pixels. Because the Airy disk causes that cluster to see the same spot of the image, you would have full RGB information at that spot.

Another way of exploiting the diffusion provided by the Airy disk and sub-Airy-disk pixel sizes would be clustering pixels of different sensitivity. In one exposure, the nominal 1.0x pixel would be exposed, and so would a 0.25x and a 4.0x pixel, providing a wider dynamic range.

There are some further complications to exploiting this, such as the disk encompassing different pixels at different f-stops, but they can be solved.
Andy
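
The sensitivity-cluster idea is essentially single-shot HDR, and a toy merge is easy to sketch. This Python fragment is purely illustrative - the 0.25x/1x/4x gains come from the post above, while the normalized full-well level and the "average the unclipped readings" rule are invented for the example:

```python
FULL_WELL = 1.0   # normalized clipping level, identical for every pixel

def expose(scene_lum, gain):
    """One pixel's response: linear in luminance until the well clips."""
    return min(scene_lum * gain, FULL_WELL)

def merge_cluster(scene_lum):
    """Merge one 0.25x / 1.0x / 4.0x co-sited cluster: scale each
    unclipped reading back to scene luminance and average them."""
    gains = (0.25, 1.0, 4.0)
    usable = [expose(scene_lum, g) / g for g in gains
              if expose(scene_lum, g) < FULL_WELL]
    if not usable:                  # even the 0.25x pixel clipped
        return FULL_WELL / 0.25    # best lower bound we can report
    return sum(usable) / len(usable)

# A highlight at 3x the 1.0x pixel's clipping point survives via the
# 0.25x pixel; in the shadows all three pixels contribute, so their
# readings can be averaged for lower noise.
highlight = merge_cluster(3.0)
shadow = merge_cluster(0.1)
```

In a real sensor the merge would also have to weight each reading by its shot noise, but even this toy version shows where the extra dynamic range comes from: the low-gain pixel holds the highlights while the high-gain pixel lifts the shadows.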
John Camp
« Reply #18 on: July 13, 2009, 10:37:10 PM »

What if someone designed a camera without a shutter - say, one that sampled photon quantities at each well site over some chosen period of time, like 1/125 or 1/250 of a second?
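
Conceptually that is just an electronic shutter: photons accumulate continuously, and the "exposure" is the interval between two resets/readouts of each well. A toy Python simulation of one well (Poisson photon arrivals; the photon rate and full-well capacity are invented numbers, not measurements from any real sensor):

```python
import numpy as np

def electronic_shutter(photon_rate_hz, exposure_s, full_well=50_000, seed=0):
    """One well with no mechanical shutter: photons accumulate as a
    Poisson process over the chosen readout window, clipped at full well."""
    rng = np.random.default_rng(seed)
    electrons = rng.poisson(photon_rate_hz * exposure_s)
    return min(int(electrons), full_well)

# The 'shutter speed' is purely the interval between two readouts:
e_250 = electronic_shutter(1_000_000, 1 / 250)   # roughly 4000 electrons
e_125 = electronic_shutter(1_000_000, 1 / 125)   # roughly 8000 electrons
```

The catch, as with real electronic shutters, is in the readout: sampling every well "simultaneously" requires either per-pixel storage (global shutter) or accepting row-by-row timing skew (rolling shutter).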
bradleygibson
« Reply #19 on: July 13, 2009, 11:06:52 PM »

John:
That's effectively what's happening with most modern DSLRs in live view mode, or when recording video. The new Micro Four Thirds cameras and the RED cameras are mirrorless as well.

Personally, I think it's a step in the right direction. But removing the mechanical shutter wouldn't address the diffraction limit Ray's article discusses.

Andy:
Your idea is an example of the concept I was referring to when I said "exploiting the Bayer pattern to go below the Airy disk limit". But remember that the Airy disks overlap, so each pixel gets information mixed in from adjacent sites. I don't have any idea how to sort that out, even in theory.
