Pages: « 1 2 3 [4]
Author Topic: 8x10 ...... digital  (Read 11077 times)
design_freak
Sr. Member

Posts: 1071



« Reply #60 on: December 13, 2011, 03:25:50 AM »


Imagine now reading the newspaper displayed on your contact lenses  Cool

Best regards,
DF

-------------------------------------------
WORK HARD AND BE NICE TO PEOPLE
-------------------------------------------
Dick Roadnight
Sr. Member

Posts: 1730


« Reply #61 on: December 13, 2011, 04:00:43 AM »

If you are using a Bayer-interpolated sensor with an anti-aliasing filter, you need to down-sample from 30 or 40 MPx to get an optimum 2 MPx file.

An anti-aliasing filter spreads the light destined for each pixel over the adjacent 8 pixels, so you could argue that you need a 10-times down-sample to compensate, and Bayer interpolation interpolates one pixel from 4 real pixels, so, theoretically, if these two factors effectively multiplied, you would need 40 pixels to get one optimal pixel.

So, according to that theory, a 1MPx crop from a 4 shot MF picture would be as good as a 40 Mpx AA Bayer picture, which is clearly not the case.

Added P.S.:

When I re-sampled for my wife's website:

http://rosalindcaplisacademy.co.uk/

I expected there to be no perceivable difference between the 15-ish MPx GH2 pictures and the 60 MPx H4D-60 picture (which did not make it to the web site, as it was the wrong shape)... but the difference was easy to see at the resolution I uploaded, though they further down-sampled the images for the website.


I am not sure where you got this 30-40 MP figure from.

A 12 MP image that is critically sharp and well sharpened already looks very good at pixel level, and it looks amazing after downsizing to 1080p.

There might be a tiny difference between a 30 MP and a 12 MP downsize, but I doubt anyone would be able to see it on screen at 1080p.

Cheers,
Bernard

« Last Edit: December 13, 2011, 06:41:45 AM by Dick Roadnight »

Hasselblad H4, Sinar P3 monorail view camera, Schneider Apo-digitar lenses
ondebanks
Sr. Member

Posts: 805



« Reply #62 on: December 13, 2011, 05:55:18 AM »

Quote from: Dick Roadnight

An anti-aliasing filter spreads the light destined for each pixel over the adjacent 8 pixels, so you could argue that you need a 10-times down-sample to compensate, and Bayer interpolation interpolates one pixel from 4 real pixels, so, theoretically, if these two factors effectively multiplied, you would need 40 pixels to get one optimal pixel.

So, according to that theory, a 1 MPx crop from a 4-shot MF picture would be as good as a 40 MPx AA Bayer picture, which is clearly not the case.

Dick, I think I can explain this. The Bayer interpolation happens afterwards and is decoupled from the convolution of the AA filter; interpolation (informed guessing of a missing value) is not the same type of process as convolution (spreading out of signal). So the area affected, when the two processes act on any given pixel location, is not the product of the number of adjoining pixels that they each individually operate on.

Also, the AA filter acts as a convolution kernel which is tapered, not a top-hat block; so the way that it distributes light is not nearly as severe as dividing it equally over 8 pixels. The vast majority of the light ends up in the central 4 pixels.

So if you took each block of 2x2 pixels as a "super-pixel", and instead of de-Bayering them by interpolation, directly assigned a real RGB colour from their R/G/B/G information, you'd have no Bayer guesswork artefacts; and the same 2x2 superpixel would also "suck in" almost all the AA-distributed light at its location.
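The 2x2 super-pixel idea can be sketched in a few lines (a hypothetical illustration only, assuming an RGGB mosaic layout with red at top-left and using numpy; real raw converters do far more than this):

```python
import numpy as np

def superpixel_rgb(mosaic):
    """Collapse an even-sized RGGB Bayer mosaic into an RGB image at half
    the linear resolution, assigning each 2x2 block a real RGB colour with
    no interpolation guesswork."""
    r = mosaic[0::2, 0::2]                                # red sensels
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0   # average of the two greens
    b = mosaic[1::2, 1::2]                                # blue sensels
    return np.dstack([r, g, b])

# A uniform grey scene: every super-pixel should come out neutral.
mosaic = np.full((4, 4), 0.5)
rgb = superpixel_rgb(mosaic)
print(rgb.shape)               # (2, 2, 3)
print(np.allclose(rgb, 0.5))   # True
```

No de-Bayer artefacts can appear here because no value is ever guessed, at the cost of halving the linear pixel count.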

Ray
BJL
Sr. Member

Posts: 5085


« Reply #63 on: December 13, 2011, 10:29:27 PM »

Mr. Rib,
    I think that was me you quoted, so here are some of the problems I know of.
1. The sensor chips have to output along at least one edge, so even when you stick several together, each one has to have an outside edge; otherwise there has to be a substantial gap between chips for the output wiring. Some X-ray sensors therefore use 2x2 arrays of chips, but that is the natural limit.
2. For most photographic purposes, the pixels have to be about 10 microns or less, which is 1/2500 inch, so getting two sensor chips to fit that snugly is tricky, and failing that snug fit there will be lines. (No big problem for X-rays though, or for this guy's "Polaroid")

By the way, these huge 77 micron pixels are about the size of the pixels on the iPhone's so-called Retina display, so I guess that this sensor was manufactured using the same equipment used to make this new generation of high-resolution display panels. That panel-making gear is clearly designed to handle far larger sizes than the gear used to make ICs and normal camera sensors. So maybe soon the same tools used to make the 8K-resolution TV screens being aimed at in Japan could indeed make that 24" x 20" "digital Polaroid" sensor!
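As a quick sanity check on those numbers (my own back-of-envelope arithmetic, assuming exactly 77 micron pixels and a full 20" x 24" active area):

```python
# How many 77 micron pixels fit on a 20" x 24" "digital Polaroid" sensor?
PIXEL_PITCH_UM = 77
UM_PER_INCH = 25400  # microns per inch

w_px = 24 * UM_PER_INCH // PIXEL_PITCH_UM   # pixels along the 24" edge
h_px = 20 * UM_PER_INCH // PIXEL_PITCH_UM   # pixels along the 20" edge
print(w_px, h_px)                 # 7916 6597
print(w_px * h_px / 1e6)          # ~52 megapixels
```

So even at these enormous pixel sizes, a sheet-film-sized sensor would land in medium-format-back megapixel territory.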
« Last Edit: December 13, 2011, 10:40:25 PM by BJL »
Graham Mitchell
Sr. Member

Posts: 2282



« Reply #64 on: December 14, 2011, 02:59:24 AM »

As for the device in the original post, I emailed the owner again about a sample but no reply. Last time he replied that he was on holiday. Either he really doesn't want to share, or perhaps the whole thing is a hoax?

Graham Mitchell - www.graham-mitchell.com
hjulenissen
Sr. Member

Posts: 1615


« Reply #65 on: December 14, 2011, 03:37:30 AM »

Quote from: Dick Roadnight

If you are using a Bayer-interpolated sensor with an anti-aliasing filter, you need to down-sample from 30 or 40 MPx to get an optimum 2 MPx file.

An anti-aliasing filter spreads the light destined for each pixel over the adjacent 8 pixels, so you could argue that you need a 10-times down-sample to compensate, and Bayer interpolation interpolates one pixel from 4 real pixels, so, theoretically, if these two factors effectively multiplied, you would need 40 pixels to get one optimal pixel.

So, according to that theory, a 1 MPx crop from a 4-shot MF picture would be as good as a 40 MPx AA Bayer picture, which is clearly not the case.
What is "an optimum 2Mpx file", and why would I want it? I want "an optimum image", either hanging on my wall or shown on my computer display.

I think that in order to state "theoretically...", you should have a clearly stated, widely accepted theory. I don't think that you have one.

An AA filter acts to smooth/blur the image optically/continuously prior to sampling, not totally unlike diffraction blurring. The exact kernel (smoothing function) is somewhat different from that of diffraction, and it is (hopefully) not dependent on camera/lens settings.

If the scene had a flat spectrum, the AA filter convolved with sensel coverage were a "perfect" sin(x)/x function (with the sensel itself a point-sampler), and the sensor had no CFA, I believe that we could apply Shannon-Nyquist theory rather easily. In that case, an AA-filtered sensor could accurately capture any pattern of light that was bandlimited to N/2 maxima and N/2 minima either vertically or horizontally, if the sensor had N sensels in that dimension. Any light pattern that changed more quickly than that (such as a stepped edge) would have to be band-limited first, or it would alias.

What happens if we, say, replace that sin(x)/x function with a rectangular integration corresponding to the sensel spacing (i.e. simulating an idealized sensor without an AA filter)? The Fourier transform of a rectangular function is a sin(x)/x function, so you would get some attenuation of the "passband" (desired signal) and bleed-through of aliasing-causing frequencies. This can easily be seen by letting those rectangular integrators slide past an image of hard edges/impulses: the output can change by a relatively large amount for small changes in sensel/image alignment. At other spots, the expected output image can change by exactly zero even though the camera/scene alignment has shifted by half a sensel. In other words, it is not possible to accurately recreate the original scene (not even a bandlimited version of it). That is not to say that an inaccurate representation cannot be visually pleasing (or even more pleasing than the accurate version).
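The sliding-integrator thought experiment can be demonstrated directly (an illustrative 1-D toy, not any camera's actual pipeline; the `box_sample` helper is made up for this demo):

```python
import numpy as np

def box_sample(scene, pitch, phase):
    """Integrate a finely sampled 1-D scene over adjacent windows of `pitch`
    sub-samples, starting at an offset of `phase` sensels. This models an
    AA-filter-less sensel as a rectangular integrator."""
    start = int(phase * pitch)
    n = (len(scene) - start) // pitch
    return np.array([scene[start + i * pitch : start + (i + 1) * pitch].mean()
                     for i in range(n)])

fine = np.zeros(1000)
fine[500:] = 1.0                    # a hard step edge in the scene

a = box_sample(fine, 100, 0.0)      # sensel grid aligned with the edge
b = box_sample(fine, 100, 0.5)      # grid shifted by half a sensel
print(a)  # [0 0 0 0 0 1 1 1 1 1]: edge falls exactly on a sensel boundary
print(b)  # one sensel now reads 0.5: the sampled image changed with alignment
```

The same scene yields different sampled images depending on sub-sensel alignment, which is exactly the aliasing/phase-dependence described above.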

So what happens if we replace the ideal filter with a realistic AA filter like those Canon and Nikon use? I don't know. Does anyone know their spatial function?

So what happens if we allow the scene to actually have colours, and the sensor to have a CFA, demanding the use of demosaicing? Demosaicing is application-specific, and usually proprietary, so I won't comment on that. But scene colours and the CFA are interesting. If we assume an improbably spectrally narrow scene that is sensed by only one of the CFA primaries, I believe we can use the same analysis as for the colourless case, except that the sensel count is reduced to 1/4 (R, B) or 1/2 (G) while the AA filter stays the same. Clearly, this would make the (up until now) perfect AA filter less perfect (its cutoff frequency would be too high), and we would have more spatial aliasing.

So what happens if we allow the scene to be spectrally realistic (with most of the information/variation in the luminance)? I believe this reduces the influence of the CFA's spectral selectivity on spatial capture, and that the "monochrome" analysis turns out to be quite relevant. Quite, but not perfectly. There will always be corner cases or nitty-gritty details where the trade-offs present in Bayer-type sensors become visible. I think those trade-offs tend to be good ones for most applications.

-h

edit: most of my post considers 1-D versions of the problem.
« Last Edit: December 14, 2011, 03:42:46 AM by hjulenissen »
Dick Roadnight
Sr. Member

Posts: 1730


« Reply #66 on: December 14, 2011, 02:26:46 PM »

Quote from: hjulenissen

What is "an optimum 2Mpx file", and why would I want it? I want "an optimum image", either hanging on my wall or shown on my computer display.

I think that in order to state "theoretically....", you should have a clearly stated, widely accepted theory.

An optimal file is one that contains the maximum data per pixel, with minimum sampling- or interpolation-created noise... to a point: the more you downsample, the better the per-pixel quality of the file. When further downsizing yields no more increase in quality, the per-pixel quality is as good as you can get... or optimal. This does, of course, depend on the down-sampling software. And how do non-Bayer-interpolated files from cameras without AA filters compare (pixel to pixel) to down-sampled CaNikon files?
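The "downsample until quality stops improving" idea can be sketched numerically (my own toy experiment on synthetic data, using a simple neighbour-difference statistic I made up as a stand-in for per-pixel quality, not an accepted metric):

```python
import numpy as np

def block_mean(img, f):
    """Downsample a 2-D image by a factor f using block averaging."""
    h, w = img.shape[0] // f * f, img.shape[1] // f * f
    return img[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def mean_local_contrast(img):
    """Average absolute difference between horizontal neighbours:
    a crude proxy for per-pixel detail."""
    return np.abs(np.diff(img, axis=1)).mean()

rng = np.random.default_rng(0)
# Soft detail plus per-sensel noise stands in for a slightly blurred capture.
img = rng.normal(0, 0.05, (256, 256)) + np.sin(np.linspace(0, 20, 256))[None, :]

for f in (1, 2, 4, 8):
    print(f, round(mean_local_contrast(block_mean(img, f)), 4))
```

On data like this, per-pixel contrast climbs as the noise averages out, then the gains flatten; the knee of that curve is what the post above calls the "optimal" size.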

Optimal per-pixel quality is particularly important for web images, where the pixel dimensions are the limitation. Where the limitation is file size or download time, the problem is different, and the solution can be compression.

My theory uses simple arithmetic, which I have clearly explained. I explained the (false) assumptions I made, and these have been clarified above by Ray.
« Last Edit: December 14, 2011, 02:40:05 PM by Dick Roadnight »

Hasselblad H4, Sinar P3 monorail view camera, Schneider Apo-digitar lenses
theguywitha645d
Sr. Member

Posts: 970


« Reply #67 on: December 14, 2011, 03:31:41 PM »

Hi. Just editing my post.
« Last Edit: December 14, 2011, 03:55:16 PM by theguywitha645d »