Author Topic: Dynamic range test  (Read 7789 times)
Doug Peterson
Sr. Member

Posts: 2838


« Reply #20 on: October 07, 2012, 10:08:56 AM »

IMO DR is a camera sensor feature. What a RAW processor can or cannot do is somewhat arbitrary and depends on how the software was implemented. My camera sensor's DR won't be worse just because a particular RAW processor can't extract the most from my RAW files. DR is a hardware feature; how we manage and output it to the final image is a software process.

If an image detail is rendered in the woods and no one is around to view it - does it count as DR?

There is no inherent photographic truth in a raw file. Unless you plan on displaying the original red-only, blue-only, green-only pixels, 1:1, on a monitor tuned to specifically match the sensor's original pixels, everything you do to view a raw file is an interpretation. That interpretation may be geared toward neutral (in the same way Astia was geared toward neutral), but it is an interpretation nonetheless.

Moreover, the improvement offered by better raw processors does not benefit all cameras equally. Some cameras are on relatively equal footing when you look only at very basic raw processing of their files, but become radically different in quality when each raw file is handled as well as possible (with sophisticated math tailored to get the most out of each).

You can't judge any chain by an individual link.  It's neither the sensor alone, nor the software alone which defines photographically useful DR: it is only the entire image-chain (filter>lens>sensor>software and a dozen other factors) that produces a viewable image in which you can judge useful DR.

If two cars have the exact same engine, tires, chassis, etc., but different algorithms controlling their automatic transmissions, they will post different 0-60 times. Is the one that finishes earlier not faster? You don't drive an engine; you drive a car. You don't produce images with sensors; you produce them with camera systems.

Camera systems should always be judged and compared as systems. Because that's what you produce images with.
« Last Edit: October 07, 2012, 10:22:13 AM by Doug Peterson »

DOUG PETERSON (dep@digitaltransitions.com), Digital Transitions
Dealer for Phase One, Mamiya Leaf, Arca-Swiss, Cambo, Profoto
Office: 877.367.8537
Cell: 740.707.2183
Phase One IQ250 FAQ
Guillermo Luijk
Sr. Member

Posts: 1291



« Reply #21 on: October 07, 2012, 10:10:39 AM »

If an image detail is rendered in the woods and no one is around to view it - does it count as DR?

Of course, things don't need to be viewed to exist (e.g. air).

Doug Peterson
Sr. Member

Posts: 2838


« Reply #22 on: October 07, 2012, 10:27:12 AM »

Of course, things don't need to be viewed to exist (e.g. air).

Ah! Then we can agree to disagree.

Because in photography I really do not care about sensors, microlens design, signal-to-noise ratios, gamut coverage, spectral responses, diffraction, or any of the other highly technical things I post frequently about. Knowing, testing, and discussing these things simply helps me provide better advice to customers about what tools will help them best achieve their photographic goals.

So from that point of view I think we are both right. You want to know about, compare, and debate the engineering characteristics of various sensors and I want to know about, compare, and debate how far into the shadow or highlight I (or a customer) can place a subject and still get beautiful result.

An avid reader might thoroughly study grammar and punctuation but judge books on whether the ideas were effectively conveyed. The grammar, punctuation, and other technical-writing elements do matter; they can enhance or muddle the ideas. In fact, just as in photography, the experience of reading a book can be enhanced by "poor" grammar or punctuation (e.g. a rambling stream of consciousness may be better received by skipping periods, semicolons, or capitalization); likewise, a gritty, emotionally wrenching image might be better shown with crushed, featureless shadows and heavy grain. It helps if you understand grammar and punctuation so you know where and how you might leverage them appropriately. But I do not find much value in debating the Harvard comma as an academic question of grammatical semantics; I only care to debate it in the context of the readability and enjoyability of the writing it is used in.

I do believe I've gone off on a tangent. Oh well; off to take some pictures!
« Last Edit: October 07, 2012, 10:39:15 AM by Doug Peterson »

Dick Roadnight
Sr. Member

Posts: 1730


« Reply #23 on: October 07, 2012, 12:00:42 PM »

Hi,

I don't think that scene was very challenging. Of course, you need to expose for the highlights, and that may leave some noise in the shadows. With LR, PV2012 seems to achieve some magic in highlight restoration.

Best regards
Erik
I do not want to start a war, but is LR a patch on Phocus? ...and has anyone done any real-world testing, including using the raw converter to cope with real-world contrasts in landscapes?

You can use Lightroom on Hasselblad raw files, so it would be possible to do a hardware-independent test?

Hasselblad H4, Sinar P3 monorail view camera, Schneider Apo-digitar lenses
FredBGG
Sr. Member

Posts: 1651


« Reply #24 on: October 07, 2012, 12:30:09 PM »

You don't produce images with sensors, you produce them with camera systems.

You produce images in front of the camera. This is done through composition, directing the subject, or being in the right place at the right time. Once the image is there to be photographed, THE LENS creates the image. The sensor or film records that image. The recording is probably the most critical point of the process, because it is what preserves the image in time. The camera (the box) itself is a facilitator, making it easier or more functional to make this recording correctly. In fluid situations the camera can dramatically change the recording possibilities.

The raw processor is simply the developing stage of the process; it cannot, however, produce dynamic range from thin air. It can only work with the recorded data.

On the other hand, there are methods of chemically developing film that alter dynamic range, such as the water bath process for developing black and white film. But even that process still has to work with the "recording" made during exposure.

While the lens is what creates the optical image (the projection of what is in front of the camera), the recording device is critical to a good result.

Many try to reduce the importance of dynamic range to how much you can pull out of the shadows, but it is more than that.
The more range you have the more recording capability you have. The more that is recorded the more flexibility you have later.

Another very significant improvement in sensor design is live view. Proper live view that is high quality and real-time allows the photographer to see things directly that would otherwise have to be predicted. It is also very effective, if not essential, for critical focus. Another example of how useful it can be is shooting black and white: on many cameras you can set the camera to record black and white JPEGs while still recording color raw files. At the same time the camera can show you PROCESSED black and white in live view. This is of particular importance if you intend to do some heavy color filtration in the black and white conversion process. Sometimes certain skin tones can react strangely to strong color filtration, and it can be even worse with makeup.

While I still shoot a lot of film, I will often use a digital camera as an advanced light meter, a "scene analyzer" of sorts, and a Polaroid replacement.

We have come a long way from the simple ground glass screen, but at the same time there is a certain magic in looking at the image on a ground glass screen... even if it is backwards and upside down. (grin)
ErikKaffehr
Sr. Member

Posts: 7655


« Reply #25 on: October 07, 2012, 02:30:16 PM »

Hi,

PV 2012 is much better at restoring highlights than previous versions. If you send me the raw image, I will gladly check it out in Lightroom.

Yes, of course I have tested Lightroom on real-life landscapes, but with my own images. Lightroom is what I use, so I have tested it with something like 50,000 images, about half of them landscapes.

Yes, LR works on Hasselblad files. Alex Koskolov made some raw images available for testing and I have looked at those, in this topic: http://www.luminous-landscape.com/forum/index.php?topic=69391.0

Best regards
Erik


I do not want to start a war, but is LR a patch on Phocus? ...and has anyone done any real-world testing, including using the raw converter to cope with real-world contrasts in landscapes?

You can use Lightroom on Hasselblad raw files, so it would be possible to do a hardware-independent test?

MrSmith
Sr. Member

Posts: 909



« Reply #26 on: October 07, 2012, 02:42:25 PM »

If an image detail is rendered in the woods and no one is around to view it - does it count as DR?



Either you have got a handle on the subtle slightly obtuse British sense of humour or I LOL'd at something that wasn't intentionally funny.
deejjjaaaa
Sr. Member

Posts: 965



« Reply #27 on: October 07, 2012, 03:36:22 PM »

DCRAW applies them as well, in fact using Adobe's matrices. But this doesn't affect DR calculations since they are linear.


But ACR/LR might apply more than just a matrix - you actually need to check which profile is selected for a particular camera model, and you should not assume it will be just a matrix.
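The quoted point - that a purely linear transform does not change DR figures - can be sketched numerically. The signal levels and multiplier below are illustrative, not taken from any real camera:

```python
import numpy as np

def stops(signal_max, noise_floor):
    """Engineering dynamic range in stops: log2 of max signal over noise floor."""
    return np.log2(signal_max / noise_floor)

# Illustrative 14-bit sensor: clipping point and read-noise floor in DN
dr_native = stops(16383.0, 4.0)

# Any per-channel linear gain (white balance, a diagonal color-matrix term)
# multiplies the clipping point and the noise floor by the same factor,
# so the ratio - and therefore the DR in stops - is unchanged.
k = 2.1  # arbitrary white-balance multiplier
dr_scaled = stops(k * 16383.0, k * 4.0)

print(round(dr_native, 3), round(dr_scaled, 3))
```

A full 3x3 matrix mixes channels, so per-channel noise can shift somewhat; the invariance holds exactly for the pure gains (exposure scaling, WB) that dominate DR measurement.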
deejjjaaaa
Sr. Member

Posts: 965



« Reply #28 on: October 07, 2012, 03:42:14 PM »

There are no true "zeros" in a raw processor/viewer. Every "zero" or "default slider location" is an active choice made by an image-processing engineer.

And in some raw converters "zero" is a true zero, by exactly such an active choice made by the developer... and if the code is open source you can actually see that... and there are converters that, by choice, let you apply no WB, no curves, no color transforms to the camera's RGB values, and even no demosaicking...
Wayne Fox
Sr. Member

Posts: 2885



« Reply #29 on: October 08, 2012, 12:59:01 PM »

And in some raw converters "zero" is a true zero, by exactly such an active choice made by the developer... and if the code is open source you can actually see that... and there are converters that, by choice, let you apply no WB, no curves, no color transforms to the camera's RGB values, and even no demosaicking...
Just curious, which converters?  I'm trying to figure out a way to get a straight image without anything other than demosaicking and haven't had much success.  They all seem to apply at least some gamma adjustment.

Guillermo Luijk
Sr. Member

Posts: 1291



« Reply #30 on: October 08, 2012, 02:09:01 PM »

Just curious, which converters? I'm trying to figure out a way to get a straight image without anything other than demosaicking and haven't had much success. They all seem to apply at least some gamma adjustment.

Commercial developers usually do, because integer TIFF files are not well suited to post-processing in a linear state (shadows quickly posterize due to the lack of levels).

But there is actually no reason not to develop a RAW file linearly, as long as you visualize the resulting image in a program that takes that linearity into account. For instance, I built linear versions of sRGB, AdobeRGB, and ProPhoto RGB (to do that you just need to set gamma to 1.0 in PS's native profiles) to properly render linear images (obtained with DCRAW) in PS. Their histograms are strongly shifted to the left, but they look the same as the gamma-developed versions.

I am no expert, but I guess floating-point HDR formats are all linear as well, in this case to make tone-mapping calculations easier and faster.
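The histogram shift Guillermo describes follows directly from the transfer curve. A small sketch - the sample values are arbitrary; only the standard sRGB encoding curve is assumed:

```python
import numpy as np

def srgb_encode(v):
    """Standard sRGB transfer curve: linear light in [0, 1] -> encoded value."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, 12.92 * v, 1.055 * v ** (1 / 2.4) - 0.055)

linear = np.linspace(0.0, 1.0, 5)   # [0, 0.25, 0.5, 0.75, 1.0] linear values
encoded = srgb_encode(linear)

# Linear 0.25 encodes to roughly 0.54: a midtone sits far to the right after
# encoding, which is why a gamma-1.0 (linear) file's histogram looks crushed
# into the left side even though, viewed through a matching linear profile,
# the image renders identically.
print(np.round(encoded, 3))
```

The same math explains the posterization he mentions: in an 8-bit integer linear file the deepest stops of shadow get only a handful of levels each, whereas gamma encoding spreads them across many more.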
« Last Edit: October 08, 2012, 02:15:00 PM by Guillermo Luijk »

deejjjaaaa
Sr. Member

Posts: 965



« Reply #31 on: October 08, 2012, 04:43:02 PM »

Just curious, which converters? I'm trying to figure out a way to get a straight image without anything other than demosaicking and haven't had much success. They all seem to apply at least some gamma adjustment.

rpp ( http://www.raw-photo-processor.com/RPP/Overview.html ) ...

something like /Applications/Raw Photo Processor 64.app/Contents/MacOS/Raw Photo Processor 64 -NoCFA YES will run without demosaicking (useful, for example, for cameras with the CFA removed or true mono cameras... but I sometimes run it on regular CFA raws when I need a particular look).

in addition you can

1) convert without applying any color transforms, staying in the camera's RGB (i.e. without any camera "profile")
2) select gamma = 1
3) directly enter WB multipliers (and hence use UniWB during conversion, if you want) - it does not operate with the K/tint paradigm at all

For example (here still with demosaicking applied): no color transforms, gamma 1, no WB, output to a 16-bit camera-RGB TIFF



If you want, you can also dump 32-bit floating point with gamma 1.
« Last Edit: October 08, 2012, 04:50:46 PM by deejjjaaaa »
Wayne Fox
Sr. Member

Posts: 2885



« Reply #32 on: October 08, 2012, 08:04:04 PM »

Thanks.

deejjjaaaa
Sr. Member

Posts: 965



« Reply #33 on: October 10, 2012, 08:33:30 AM »

Thanks.

I forgot to account for spaces in the command line example, sorry  = "/Applications/Raw\ Photo\ Processor\ 64.app/Contents/MacOS/Raw\ Photo\ Processor\ 64 -NoCFA YES"
Wayne Fox
Sr. Member

Posts: 2885



« Reply #34 on: October 11, 2012, 02:09:31 PM »

I forgot to account for spaces in the command line example, sorry  = "/Applications/Raw\ Photo\ Processor\ 64.app/Contents/MacOS/Raw\ Photo\ Processor\ 64 -NoCFA YES"
Thanks.  Works great.  I was hoping the file would actually show each pixel as seen through its corresponding filter (so basically a bunch of RGGB dots). Showing each pixel as basically its density should work well enough to demonstrate to the class what a raw file really is and why all raw files need "processing". The results are a really bad, dark B&W with a wide contrast range but very few midtones, showing why a linear capture device needs correction to look normal to us. I still have students who believe the conversion on the back of the camera is what the camera saw, and anything other than that is manipulation.

Doug Peterson
Sr. Member

Posts: 2838


« Reply #35 on: October 11, 2012, 02:51:00 PM »

The results are a really bad, dark B&W with a wide contrast range but very few midtones, showing why a linear capture device needs correction to look normal to us.  I still have students who believe the conversion on the back of the camera is what the camera saw, and anything other than that is manipulation.

Don't feel bad; we have plenty of professional photographers as clients who believe the same thing.

Hence my earlier rant.

Understanding that raw processing is an essential link in the creation of a digital image is vital to gaining the best control over that creation.

deejjjaaaa
Sr. Member

Posts: 965



« Reply #36 on: October 11, 2012, 03:53:35 PM »

Thanks.  Works great.  I was hoping the file would actually show each pixel as seen through its corresponding filter (so basically a bunch of RGGB dots).

If you want that (colorized dots), then you need RawDigger - http://www.rawdigger.com - which is not exactly a raw converter, but good enough for illustration purposes... it will show you the undemosaicked RGGB dots, colorized... that rpp mode was intended specifically for B&W cameras (like Bayer cameras with the CFA removed, or monochrome MF/Leica) where you naturally do not need any demosaicking and the image is supposed to be b/w/greyscale.



« Last Edit: October 11, 2012, 03:57:16 PM by deejjjaaaa »
deejjjaaaa
Sr. Member

Posts: 965



« Reply #37 on: October 11, 2012, 04:01:51 PM »

Don't feel bad; we have plenty of professional photographers as clients who believe the same thing.
But it is still an image... in that sense a raw file is no different from any JPG or PNG or BMP... all of them are just bits and bytes, and you need software with certain math coded in to display the content on your monitor... I'd rather say you actually need more math to display something from a JPG than to display something from raw data.
Wayne Fox
Sr. Member

Posts: 2885



« Reply #38 on: October 12, 2012, 04:22:45 PM »

But it is still an image... in that sense a raw file is no different from any JPG or PNG or BMP... all of them are just bits and bytes, and you need software with certain math coded in to display the content on your monitor... I'd rather say you actually need more math to display something from a JPG than to display something from raw data.

Rawdigger is perfect for what I'm trying to do!  Thank you very much.

The point I'm trying to make to the students in the class is that the sensor just records levels of brightness, so a raw file in a straight conversion would just show a bunch of grey dots. Since we know what color of filter was over each sensel, we can add that information and get a mosaic, but that's still not very useful. Add the fact that it's linear, and basically, without some type of post-processing, a digital capture is useless. So every image captured goes through a complex routine that makes it possible for us humans to visualize, and the process is very interpretive; only the person taking the photograph knows what they saw and what they were trying to capture. The point being: do you let the engineers who designed the firmware make those decisions, or do you take over that process? Often the comeback is that this isn't natural like it is with film, which launches a second discussion about how all of these properties are built into the film - but they are still there.
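The brightness-then-colorize sequence described above can be sketched in a few lines. Everything here is a toy model - random values standing in for sensel data, and an assumed RGGB layout:

```python
import numpy as np

# Toy 4x4 "raw" frame: each sensel stores a single linear brightness value,
# with no color information of its own - the "bunch of grey dots".
rng = np.random.default_rng(0)
raw = rng.uniform(0.0, 1.0, size=(4, 4))

# Knowing which filter sat over each sensel (assumed RGGB here) lets us
# place every value into its channel: a colorized mosaic, still only one
# sample per pixel, not yet a demosaicked full-color image.
mosaic = np.zeros((4, 4, 3))
mosaic[0::2, 0::2, 0] = raw[0::2, 0::2]   # red sites
mosaic[0::2, 1::2, 1] = raw[0::2, 1::2]   # green sites, even rows
mosaic[1::2, 0::2, 1] = raw[1::2, 0::2]   # green sites, odd rows
mosaic[1::2, 1::2, 2] = raw[1::2, 1::2]   # blue sites

# Each pixel carries at most one channel; demosaicking would estimate the
# other two from neighbors, and a gamma curve would then lift the midtones.
assert (np.count_nonzero(mosaic, axis=2) <= 1).all()
```

This is essentially the view RawDigger gives of a real file, just simulated.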

My challenge is getting students to understand the significance and importance of shooting raw when they want maximum quality. The issue is exacerbated because the college offers this course at two different times (I'm substituting for a good friend who teaches the night class), and the day class is taught by the school's "computer" guy... someone who teaches Word, Excel, Windows, etc. He has stated that photography is not an art, and some of the questions he has come up with for the tests show he isn't at all qualified (we had a really good laugh at some of his questions and answers, but sadly it's not a funny situation, because his students really aren't getting what they are paying for). Because he's on staff and Photoshop is a computer program, the school seems to think he knows what he's talking about. He tells the students that JPEGs are more than enough and not to waste the storage space on raw files.

This is a new course of study at a local business college, and my friend, who is a very respected and knowledgeable photographer, just met with the administrator, and I think they now better understand what is going on. They've asked him, and not the computer guy, to plan the curriculum for the Intermediate Photography courses they want to introduce next semester. Unfortunately, even though my friend is eminently qualified to teach the material, they can't hire him as full-time staff because he doesn't have a master's degree.

