Author Topic: 16 bit dslr  (Read 9050 times)
david distefano
« on: January 17, 2014, 09:59:48 PM »

As I was reading old posts I came across a Nov. 2007 post about the possibility of a 16-bit DSLR. Now, seven years later, does anyone see it as a possibility on cameras in the near future? And with today's technology, would it be more advantageous than an increase in megapixels?
Telecaster
« Reply #1 on: January 18, 2014, 12:33:34 AM »

I guess it's all down to the sensors. If one comes on the market with sufficient tonal gradation to merit 16-bit A/D conversion then we'll likely get 16-bit ADCs for it. It would make for good PR if nothing else.

-Dave-
ErikKaffehr
« Reply #2 on: January 18, 2014, 01:00:57 AM »

Hi,

I very much doubt the usefulness of 16 bits. Right now 16 bits means pretty much 13 bits of signal plus three bits of noise. Lens flare is also a factor limiting density range: there is always some light bouncing around inside the lens.

What I think we may see is something similar to the extended-range feature on some Fujifilm sensors, combining high-sensitivity and low-sensitivity pixels. This could perhaps be implemented using electronic shutters. The technique would be useful for handling specular highlights, and it could still be implemented with 14-bit technology.
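A rough sketch of how such a dual-sensitivity scheme could combine the two readings (purely hypothetical numbers and names, not Fujifilm's actual design):

```python
# Hypothetical sketch: each site pairs a sensitive "S" photodiode with a
# smaller, less sensitive "R" photodiode that clips much later.

def merge_sr(s_value, r_value, s_clip=16383, r_gain=16.0):
    """Combine S and R readings into one extended-range value.

    s_value: high-sensitivity reading, clipping at s_clip (14-bit here)
    r_value: low-sensitivity reading, r_gain times less sensitive
    """
    if s_value < s_clip:
        return float(s_value)      # S pixel still linear: use it directly
    return r_value * r_gain        # S clipped: fall back to the scaled R pixel

# A specular highlight: S clips at 16383 while R still reads 3000,
# recovering detail well above the normal clipping point.
print(merge_sr(16383, 3000))   # 48000.0
print(merge_sr(8000, 500))     # 8000.0
```

A real implementation would blend the two readings near the transition point to avoid a visible seam; that detail is omitted here.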

One thing to consider is that increasing dynamic range essentially means we get HDR on a chip. To make use of it, we need HDR tone mapping or selective processing. Ugliness is around the corner. Consider this: with linear coding each bit represents 1 EV of DR, so a high-DR sensor will be able to reproduce both more specular highlights and deeper shadows.

12 bits is a 1:4096 contrast range
14 bits is a 1:16384 contrast range
16 bits is a 1:65536 contrast range
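Those ratios are just 2^N for an N-bit linear encoding:

```python
# Contrast range of an N-bit linear encoding is 2**N : 1.
for bits in (12, 14, 16):
    print(f"{bits} bits is a 1:{2 ** bits} contrast range")
```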

A high-quality screen in a reasonably dark room probably has a contrast range of about 1:400, and a good photographic print is probably about 1:140 (best technology on glossy paper; for matte paper, cut that in half).

So taming a wide contrast range takes some tricks; that is the reason many people hate HDR.

These two articles may offer some insight:

http://echophoto.dnsalias.net/ekr/index.php/photoarticles/63-lot-of-info-in-a-digital-image
http://echophoto.dnsalias.net/ekr/index.php/photoarticles/61-hdr-tone-mapping-on-ordinary-image

Best regards
Erik




digitaldog
« Reply #3 on: January 18, 2014, 10:56:09 AM »

It would make for good PR if nothing else.
Agreed, at least for the kinds of work done in these parts. For other capture (scientific?) it may be useful.

Andrew Rodney
Author “Color Management for Photographers”
http://digitaldog.net/
Telecaster
« Reply #4 on: January 18, 2014, 04:16:24 PM »

After getting the Sony A7r and setting it up with my usual (in-camera and RAW processor) flat defaults, I was taken aback by just how flat the files were. IMO ending up with a more pleasing and less HDR-ish look means throwing away tonal info. Sometimes lots of it.

I agree with Erik's comments. I'm all for having 16 bits of genuine image data if the tech can support it...but it would mean, among other things, another level of post work to contend with and more data to ultimately discard. Hmmm...

-Dave-
Torbjörn Tapani
« Reply #5 on: January 18, 2014, 04:36:49 PM »

Lightroom started as a HDR tone mapper. It handles 32 bit files. I think we will be alright with 16 bit RAW.
LKaven
« Reply #6 on: January 20, 2014, 11:30:30 AM »

Lightroom started as a HDR tone mapper. It handles 32 bit files. I think we will be alright with 16 bit RAW.

The use of 32 bits is for many things outside of traditional photography. It allows the allocation of additional bits to low-level signals (when those low-level signals are captured with sufficient fidelity to merit it). It allows decisions about lighting to be made after the fact. However, these ideas arose in the CGI world, where some practical considerations of physics did not intrude.

The first place we actually see sensors with 16-bits or more of dynamic range is in expanded /highlight/ headroom, and not in a lowered noise-floor.  Cinematographers need to be able to shoot in very high DR conditions with a graceful shoulder in highlight transition, something that film used to provide pretty nicely.

bjanes
« Reply #7 on: January 20, 2014, 02:21:06 PM »

The use of 32 bits is for many things outside of traditional photography.  It allows the allocation of additional bits to low level signals (when those low-level signals are captured with sufficient fidelity to merit it).  It allows decisions about lighting to be made after the fact.  However these ideas did arise in the CGI world, where some practical considerations of physics did not intrude.

The first place we actually see sensors with 16-bits or more of dynamic range is in expanded /highlight/ headroom, and not in a lowered noise-floor.  Cinematographers need to be able to shoot in very high DR conditions with a graceful shoulder in highlight transition, something that film used to provide pretty nicely.

Digital capture is linear, and one exposes so that the highlights are just short of clipping. The shoulder can be created in post-processing, but I don't think it makes sense to separate out the highlights as a separate entity. With linear capture all tones are equal (except for signal-to-noise, which becomes a problem in the shadows), and highlight headroom is determined by exposure.
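As a side note, here is a quick sketch (my numbers) of how a linear encoding spends its code values: each stop down from clipping gets half the levels of the stop above, which is why shadows are where signal-to-noise suffers:

```python
# In linear raw data, each stop below clipping contains half as many
# code values as the stop above it.
def levels_in_stop(bits, stops_below_clipping):
    top_of_stop = 2 ** bits >> stops_below_clipping   # top code value of the stop
    return top_of_stop - (top_of_stop >> 1)           # code values within that stop

for stop in range(4):
    print(f"{stop} stops below clipping: {levels_in_stop(14, stop)} code values")
# The top stop holds half of all 14-bit code values (8192 of 16384).
```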

Bill
LKaven
« Reply #8 on: January 20, 2014, 03:01:26 PM »

Digital capture is linear, and one exposes so that the highlights are just short of clipping. The shoulder can be created in post-processing, but I don't think it makes sense to separate out the highlights as a separate entity. With linear capture all tones are equal (except for signal-to-noise, which becomes a problem in the shadows), and highlight headroom is determined by exposure.

Bill, I'm thinking of sensors that have been designed to be able to capture exposures that exceed a certain reference level for maximum exposure.  If the clipping point on an ISO 100 surface would be 0dB, the sensors in question would be able to capture +3-6dB without a corresponding change in reference levels.  In other words, dynamic range is extended only at the high end without being extended at the low end. 
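For reference, those dB figures convert into stops like this (assuming voltage-referred dB, i.e. 20·log10, which is my assumption about the convention being used):

```python
import math

DB_PER_STOP = 20 * math.log10(2)   # ~6.02 dB per stop in voltage-referred dB

def db_to_stops(db):
    """Convert extra headroom in dB into photographic stops."""
    return db / DB_PER_STOP

print(f"+3 dB ~ {db_to_stops(3):.2f} stops of extra highlight headroom")
print(f"+6 dB ~ {db_to_stops(6):.2f} stops of extra highlight headroom")
```

If the figures were power-referred (10·log10) instead, +3 to +6 dB would correspond to roughly one to two stops.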

Fine_Art
« Reply #9 on: January 20, 2014, 03:09:19 PM »

The biggest limitation is screens, which are almost all 8-bit electronic devices. Higher bit depths are simulated by dithering. Some pro-grade screens for medical imaging or graphics professionals are 10-bit. Does anyone know of any 14-bit screens that can be bought?

So for archiving historical works 16 bits would be useful; for most anything else you will never see a difference. Prints have much less DR than screens.
Telecaster
« Reply #10 on: January 20, 2014, 03:42:13 PM »

Bill, I'm thinking of sensors that have been designed to be able to capture exposures that exceed a certain reference level for maximum exposure.  If the clipping point on an ISO 100 surface would be 0dB, the sensors in question would be able to capture +3-6dB without a corresponding change in reference levels.  In other words, dynamic range is extended only at the high end without being extended at the low end. 

People often forget that sensors themselves are not digital devices. Makes sense to me to design in a sensor-level highlight shoulder. I also like the idea of photosites that can fill to a certain level, then read out while continuing to capture (maybe with multiple readouts) during the course of a single exposure.
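A toy model of that fill-read-and-keep-capturing idea (entirely hypothetical numbers; as far as I know no shipping sensor works exactly this way):

```python
# Toy photosite that drains itself whenever the well reaches capacity,
# then sums all partial readouts into one extended-range value.

def expose(photon_flux, exposure_steps, well_capacity=10000):
    """Simulate one exposure; returns (total signal, mid-exposure readout count)."""
    total, well, readouts = 0, 0, 0
    for _ in range(exposure_steps):
        well += photon_flux                # light keeps arriving
        if well >= well_capacity:          # well full: read out and reset
            total += well
            well = 0
            readouts += 1
    return total + well, readouts          # final readout added at the end

print(expose(photon_flux=3000, exposure_steps=10))   # (30000, 2)
print(expose(photon_flux=500, exposure_steps=10))    # (5000, 0)
```

The bright site ends up recording three times its well capacity without clipping, while the dim site behaves like a conventional pixel.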

-Dave-
Vladimirovich
« Reply #11 on: January 20, 2014, 03:42:43 PM »

The biggest limitation is screens, which are almost all 8-bit electronic devices. Higher bit depths are simulated by dithering. Some pro-grade screens for medical imaging or graphics professionals are 10-bit.
High-end greyscale medical displays are more than 10 bits, are they not... like 12 bits (4096 shades)?
bjanes
« Reply #12 on: January 20, 2014, 06:17:12 PM »

The biggest limitation is screens, which are almost all 8-bit electronic devices. Higher bit depths are simulated by dithering. Some pro-grade screens for medical imaging or graphics professionals are 10-bit. Does anyone know of any 14-bit screens that can be bought?

So for archiving historical works 16 bits would be useful; for most anything else you will never see a difference. Prints have much less DR than screens.

If 16-bit capture were available, it could be used in printing, as the art of printing is in tone mapping a higher dynamic range down to what can be printed, as Karl Lang explains here. Why else would one resort to HDR imaging for difficult scenes?

Bill
allegretto
« Reply #13 on: January 20, 2014, 06:37:05 PM »

People often forget that sensors themselves are not digital devices...

really? Do you have some reference for this? Light is digital, a photon is or is not. Sensors are buckets that fill with light and the recruitment (or lack of recruitment) of photons determines the output at that site.  So I'm not sure why you say this

However I would like to learn something here...
Telecaster
« Reply #14 on: January 20, 2014, 09:28:09 PM »

really? Do you have some reference for this? Light is digital, a photon is or is not. Sensors are buckets that fill with light and the recruitment (or lack of recruitment) of photons determines the output at that site.  So I'm not sure why you say this

However I would like to learn something here...

By your definition film is also digital. It too captures photons.

Photosites on a sensor capture light and then read out voltages corresponding to the amount of light captured. It's important to note they do not read out photon counts. The voltages are then quantized from continuous values into discrete ones by an analog-to-digital converter. It's at this point that the captured data becomes digital, not before.
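A minimal sketch of that final quantization step (made-up full-scale voltage and bit depth):

```python
# Quantize a continuous photosite voltage into an N-bit code:
# the step at which the captured signal actually becomes digital.

def adc(voltage, full_scale=1.0, bits=14):
    clipped = min(max(voltage, 0.0), full_scale)           # limit to ADC input range
    return round(clipped / full_scale * (2 ** bits - 1))   # discrete code value

print(adc(0.25))   # a quarter-scale voltage becomes one of 16384 codes
print(adc(1.2))    # an over-range input clips to 16383
```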

Now in the end everything is digital. Quantum mechanics implies that space itself is like a grid, with an absolute minimum distance between any two points, rather than the smooth fabric of general relativity. But when it comes to "digital" cameras the digital part refers specifically to post-capture data quantization.

-Dave-
jrsforums
« Reply #15 on: January 20, 2014, 11:05:24 PM »

really? Do you have some reference for this? Light is digital, a photon is or is not. Sensors are buckets that fill with light and the recruitment (or lack of recruitment) of photons determines the output at that site.  So I'm not sure why you say this

However I would like to learn something here...

Why do you think they need ADCs... analog-to-digital converters?

John
allegretto
« Reply #16 on: January 20, 2014, 11:28:46 PM »

By your definition film is also digital. It too captures photons.

Photosites on a sensor capture light and then read out voltages corresponding to the amount of light captured. It's important to note they do not read out photon counts. The voltages are then quantized from continuous values into discrete ones by an analog-to-digital converter. It's at this point that the captured data becomes digital, not before.

Now in the end everything is digital. Quantum mechanics implies that space itself is like a grid, with an absolute minimum distance between any two points, rather than the smooth fabric of general relativity. But when it comes to "digital" cameras the digital part refers specifically to post-capture data quantization.

-Dave-

Yes, the quantization of light is always digital, and that's my point. Even in your retina.

We may read voltages due to the way the circuits are designed, but to me this is simply a first-pass conversion of light from its native digital form into an analog expression that requires an A-to-D conversion downstream. But that's simply a transformation that is later transformed back. The collection of photons, which is what film or a sensor does, is quite digital. The analog output is a product of how the circuit is designed, not of how sensors or Nature work.

If one wishes to argue that film is more analog, since it relies on graded absorption in the emulsion, I guess that might fly; but the sensor has a specific digital "count" for each and every photon it collects. Voltages are simply a secondary conversion, not the process of absorption within the pixel. It's just the way engineers have (so far) designed the process, not the actual event. It is quite possible that in the future the process will become digital all the way through, and A-to-D converters will go the way of... emulsions.

More properly, the sensor is certainly a digital device that reads out in an analog fashion by design (so it does a D-to-A conversion that later needs an A-to-D conversion). Let's take your point: that the capture is analog until the A-to-D conversion. This does not follow for me, since a photon hitting the sensor is a digital event if ever there was one, and the information is purely digital at that point. Not sure how to see it any other way.

BTW, quantum mechanics does "respect" the Planck length. This should not be confused with thinking that no other interval or system is at play; it's just the theory most consistent with what we think we know... for now. But just as surely as Newtonian thinking had to be refined by Einsteinian physics, QM may be just a rest stop too.

Yes, in the end, all information appears to be digital (who knew?), and thus it survives transformation. At least that's a current theory too. For a beautiful analysis of this, you may have already read Wolfram's "A New Kind of Science". If you have not seen this amazing work, it is well worth the time of anyone who considers themselves a scientist.
« Last Edit: January 21, 2014, 12:26:51 AM by allegretto »
hjulenissen
« Reply #17 on: January 21, 2014, 04:04:23 AM »

The biggest limitation is screens, which are almost all 8-bit electronic devices. Higher bit depths are simulated by dithering. Some pro-grade screens for medical imaging or graphics professionals are 10-bit. Does anyone know of any 14-bit screens that can be bought?

So for archiving historical works 16 bits would be useful; for most anything else you will never see a difference. Prints have much less DR than screens.
My several-years-old Dell screen is (AFAIK) capable of reproducing 10 bits through the use of so-called "FRC". Whether that is achieved via temporal or spatial dithering is less important than whether it gives a real and relevant benefit.
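FRC can be sketched roughly like this (a simplified temporal-dithering model; not Dell's actual algorithm):

```python
# Temporal dithering (FRC): an 8-bit panel alternates between two adjacent
# 8-bit levels so the average over a few frames approximates a 10-bit level.

def frc_frames(level10, n_frames=4):
    """8-bit levels to display over n_frames to approximate a 10-bit level."""
    base, frac = divmod(level10, 4)   # 10-bit value = 8-bit value * 4 + remainder
    return [min(base + (1 if f < frac else 0), 255) for f in range(n_frames)]

frames = frc_frames(513)                   # a level between 8-bit 128 and 129
print(frames, sum(frames) / len(frames))   # [129, 128, 128, 128] averages 128.25
```

Whether the eye fuses those frames into a genuinely finer gradation is exactly the "real and relevant benefit" question.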

I think that the real limit is software, operating systems and content. Same as with color management, I guess ("8-bit sRGB works, so why fix it?").

For capture, there is no limit to what a Photoshop operator might want to do to her image file. Thus, any increase in the information about the original scene might be of relevance to some.


For 16 bits (per channel) to have any relevance to normal photography beyond powering marketing, there has to be a real, measurable, visible benefit over other (presumably cheaper) alternatives like 14 bits, at least in some scenarios. I don't think still-image sensors are quite there yet?

-h
« Last Edit: January 21, 2014, 04:06:19 AM by hjulenissen »
01af
« Reply #18 on: January 21, 2014, 07:58:50 AM »

For many years, 12 bits/channel was the state of the art. For a few years now, sensors with 14 bits/channel have become increasingly prevalent; among high-end cameras they are fairly common by now. I am not aware of any current DSLR with 15 or 16 bits/channel. However, technical progress is inevitable, so I guess they will appear eventually... albeit not anytime soon.
Fine_Art
« Reply #19 on: January 21, 2014, 12:02:31 PM »

High-end greyscale medical displays are more than 10 bits, are they not... like 12 bits (4096 shades)?

No idea; I never looked into greyscale screens. BTW, I looked into sourcing screens several years ago, so maybe things have improved recently.