Author Topic: "Nikon D800 / D800E First Comparison"  (Read 56515 times)
RobertCubit
« Reply #140 on: April 29, 2012, 02:53:36 PM »

Oops... hold the presses! Ah shucks, too late.

I got a PM from Bart this morning that there is an error in my Imatest analysis. He reminded me that the DNG files provided for download were converted into the AdobeRGB color space with gamma 2.2 already applied. Therefore, the data must be re-linearized by applying the reciprocal gamma value 1/2.2, or 0.4545, in Imatest to correct the data and plots. When I screw up, it has always been my policy to offer an immediate retraction and re-publish the corrected results, which are shown below. I've also edited the previous posts to point to this one. Thanks Bart, I should have caught that one! Sorry for the confusion.
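Bart's correction can be sketched numerically. Here is a minimal illustration (not Imatest's internal code) of undoing a gamma-2.2 encoding on 8-bit AdobeRGB-style values:

```python
import numpy as np

def linearize(encoded, gamma=2.2):
    """Undo a gamma-2.2 encoding: normalize, then raise to the power
    gamma. Entering the reciprocal, 1/2.2 = 0.4545, as the gamma value
    in Imatest has the same linearizing effect on its analysis."""
    x = np.asarray(encoded, dtype=float) / 255.0  # 8-bit -> 0..1
    return x ** gamma  # linear light

# A gamma-2.2 encoded value of 186 is roughly 50% linear intensity:
print(round(float(linearize(186)), 3))
```

Running the analysis on data that is still gamma-encoded, as in the retracted plots, misrepresents the edge contrast, which is why the corrected charts differ.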

Note that this has made a significant difference in the previously posted results. In absolute terms, the response from both cameras has improved considerably, but the relative differences, while still significant, were not affected as much. One exception is the CA chart (possible demosaic failure), where the deviation is still present for the D800E, but is much improved. I'm sure there will be additional changes in the results when the original files are processed with other raw converters and when capture sharpening is applied.

Hopefully these are now all correct. Note that I've expanded the Picture Height scale on the Acutance chart to include print heights up to 100 cm (40 inches). 

D800E SFR cycles/mm
D800E SFR LP/PH
D800E Chromatic Aberration
D800E Acutance

D800 SFR cycles/mm
D800 SFR LP/PH
D800 Chromatic Aberration
D800 Acutance

Imatest Settings


I offer my favorite circular-logic quote from an unknown author: “I thought I made a mistake once, but I was wrong!”

Kind regards,
Bob

marcmccalmont
« Reply #141 on: April 29, 2012, 06:14:07 PM »

I'm trying to wrap my brain around this but don't have any background with Imatest. In regard to acutance: if at 50 cm picture height the E is 56% and the 800 is 45%, is 11% a significant difference? As a comparison, what would the acutance difference be between a high-end lens and a kit lens? Much more or less?

In regard to the edge profile: is an infinite slope perfect, so the closer to 0 the better? Again, just to get a gut feeling, what difference would one expect between a high-end lens and a kit lens?



Thanks
Marc
BartvanderWolf
« Reply #142 on: April 29, 2012, 07:18:04 PM »

I'm trying to wrap my brain around this but don't have any background with Imatest. In regard to acutance: if at 50 cm picture height the E is 56% and the 800 is 45%, is 11% a significant difference? As a comparison, what would the acutance difference be between a high-end lens and a kit lens? Much more or less?

Acutance or SQF is expressed on a quality scale.



A blurry picture viewed from a large distance looks sharp and scores high (dotted line in the acutance plot). That same blurry picture shown at a close distance, but at a small size, also scores high (solid line in the acutance plot). So, to reach a better-than-'very good' score of 80, the D800 image should be viewed from a distance of 71 cm or more, while the D800E image can be viewed from 66 cm or more for the same subjective quality. The same quality can also be achieved by printing the image 2.5 cm (approx. 1 inch) tall or less (landscape orientation) and viewing it from some 13-15 cm (half the normal reading distance).

Quote
In regard to the edge profile: is an infinite slope perfect, so the closer to 0 the better? Again, just to get a gut feeling, what difference would one expect between a high-end lens and a kit lens?

Yes, steeper is better, but steeper than a 1-pixel transition is not really possible. An edge profile (edge spread function, or ESF) shows how the abrupt transition from the dark to the light side of a sharp edge is captured. A perfect edge could be represented by going from dark to light from one pixel to the next if, and only if, the edge is positioned exactly between the sensels. Of course, if the edge falls exactly halfway across a row or column of pixels, the sensel response would be 50% of the luminance difference, but Imatest measures with sub-pixel accuracy.
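The edge-placement point can be shown with a toy model: hypothetical unit-width sensels that box-integrate an ideal step edge (lens blur and the AA filter ignored):

```python
def sample_edge(edge_pos, n_pix=6):
    """Box-sample an ideal dark(0)-to-light(1) step edge with unit-wide
    sensels; each sensel reports the fraction of its width lying on the
    bright side of the edge."""
    out = []
    for i in range(n_pix):
        right = i + 1.0
        bright = min(max(right - edge_pos, 0.0), 1.0)
        out.append(bright)
    return out

print(sample_edge(3.0))  # edge exactly between sensels: 0 -> 1 in one pixel step
print(sample_edge(3.5))  # edge halfway across a sensel: that sensel reads 0.5
```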

The D800 takes 2.3 pixels to transition the edge, and the D800E takes 1.91 pixels, both unsharpened and with Capture One v6.4 as the Raw converter. That means the D800 needs to be sharpened with a sharpening radius of 0.9, and the D800E requires a sharpening radius of 0.75, to achieve the same sharpening effect in that Raw converter.
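Bart doesn't state how he maps edge width to radius, but the two quoted pairs (2.3 px giving 0.9, and 1.91 px giving 0.75) are consistent with a simple proportionality of roughly radius = edge width / 2.55. Purely as an illustration of that inferred ratio (not a documented Capture One or Imatest formula):

```python
def suggested_radius(edge_width_px, k=2.55):
    """Hypothetical rule of thumb inferred from the two quoted data
    points; k = 2.55 is a fit to those points, not a published constant."""
    return edge_width_px / k

print(round(suggested_radius(2.30), 2))  # ~0.9
print(round(suggested_radius(1.91), 2))  # ~0.75
```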

Cheers,
Bart
RobertCubit
« Reply #143 on: April 29, 2012, 08:23:34 PM »

I'm trying to wrap my brain around this but don't have any background with Imatest. In regard to acutance: if at 50 cm picture height the E is 56% and the 800 is 45%, is 11% a significant difference? As a comparison, what would the acutance difference be between a high-end lens and a kit lens? Much more or less?

In regard to the edge profile: is an infinite slope perfect, so the closer to 0 the better? Again, just to get a gut feeling, what difference would one expect between a high-end lens and a kit lens?

Hi Marc,

Yes, an 11% acutance increase is significant, but probably only marginally perceptible in a 50 cm print. Keep in mind that these charts were generated from the DNG files provided by Bart, and they have not been sharpened. My goal was to get a feeling for the base difference in MTF and acutance of the images as output from the cameras. Acutance values are highly affected by sharpening---in fact, sharpening is required on both D800 and D800E images to get optimum acutance in the final output (especially true of the D800 images, due to the AA filter). Proper capture sharpening requires access to the raw files, so I will refer to a newer thread started by Erik Kaffehr here:

http://www.luminous-landscape.com/forum/index.php?topic=66477.0

He apparently now has access to Michael’s raw files, so his Imatest results are more realistic than mine for comparing the acutance of the D800 and D800E images after sharpening is applied.

Yes, in general the steeper the edge profile or Edge Spread Function (ESF), the higher the MTF---but the edge width can never be smaller than 1.0 pixel. Sharpening can increase the steepness of the ESF but will not increase the resolution.

Kit lens vs. high-end lens is a hard one to answer, since kit lenses vary so widely in image quality. Many consider the $100 Nikon 50mm f/1.8 a “kit” lens, but used at f/4 to f/5.6, I would not expect a huge difference in MTF compared to Michael’s 85mm (f/1.4G?) used for the tests in this thread (at least not at the center of the frame). The corners and edges would probably be a different story. For whatever lenses you have in mind, a good place to look strictly for SQF/Acutance comparisons at various print sizes is Popular Photography (though I find Pop Photo lens testing a bit light on other important details).

Looks like Bart beat me to it. But at least we both answered different parts of the questions.

I look forward to Bart and Erik's ongoing sharpening and (hopefully) noise tests.

Kind regards,
Bob

marcmccalmont
« Reply #144 on: April 29, 2012, 08:56:31 PM »

I guess I was trying to gauge whether the difference in acutance/SQF between the 800 and the E is in the same ballpark as the difference between a high-end lens and an average kit lens, or is it an order of magnitude less or more?

Also, what jumps out is that for a given print sharpness (personal taste) the E needs less sharpening. In my past experience I always preferred the look of a less-sharpened AA-less print to a more-sharpened AA print. It will be interesting to compare real-world prints of the 800 sharpened at 0.9 and the E sharpened at 0.75.
Marc
Fine_Art
« Reply #145 on: April 30, 2012, 12:33:14 AM »

Well, I guess that's self-evident, Bart. If the sharpening were to increase noise to unacceptable levels, it couldn't be considered proper.

What occurs to me is: if the D800 shot is always sharpened so that it looks as sharp as the D800E shot, could there be visibly more noise at least somewhere in the full-size image, despite one's best efforts?

It may well be the case that at base ISO this is less likely to be a problem. But what about ISO 3200? What about small crops that one wants to print largish?


I agree that featureless areas can be masked, but detail in the deep shadows tends to have an unfavourable SNR. When I use Smart Sharpen, I usually fade the amount of sharpening applied to the shadows.

I also wonder what will happen at high ISO when noise may be a problem without any sharpening at all.

Maybe, but I don't have the cameras to try that. I have to make a decision as to which model to order. I'm currently favouring the D800E.

I was very impressed with Focus Magic, but I'm disappointed they are taking so long to develop a version compatible with 64-bit Windows.

Cheers!

Ray



Ray,

Consider Images Plus. It is a 64-bit, multi-threaded program written by a math professor for astronomy shots. I have been using it for years for regular photography. It has a very nice adaptive RL (Richardson-Lucy) deconvolution that limits noise buildup.
Fine_Art
« Reply #146 on: April 30, 2012, 01:57:41 AM »

Probably because it keeps the distance between the mount and the sensor exactly the same between the D800 and D800E?

Cheers,
Bernard

The distance from my eyes to my screen won't change if I take off my glasses. What will change is the angle of light entering my eye. In the same way, the AA filter may have an impact that is part of the design of the microlenses in front of the sensor. Without it, the light might miss the microlenses.
BernardLanguillier
« Reply #147 on: April 30, 2012, 07:50:54 AM »

The distance from my eyes to my screen won't change if I take off my glasses. What will change is the angle of light entering my eye. In the same way, the AA filter may have an impact that is part of the design of the microlenses in front of the sensor. Without it, the light might miss the microlenses.

I guess that it depends on the set of phenomena you are trying to deal with.

I would think that complex reflections can occur between the sensor and the rear elements of lenses, and that complex coatings are applied to sensors to reduce these. I wouldn't be surprised if the distance were accurately preserved. Just guessing, though.

Cheers,
Bernard
hjulenissen
« Reply #148 on: April 30, 2012, 08:13:40 AM »

The distance from my eyes to my screen won't change if I take off my glasses. What will change is the angle of light entering my eye. In the same way, the AA filter may have an impact that is part of the design of the microlenses in front of the sensor. Without it, the light might miss the microlenses.
1. If your head is pressed up against the display using fixed-force screws, then adding any material in-between (such as glasses) would change the physical distance.

Not that I know how sensors are aligned to the camera frame/lens.

2. If a flat (neither concave nor convex) object is inserted whose refractive index is very different from that of air, could it not alter the "optical distance"?

-h
« Last Edit: April 30, 2012, 08:17:47 AM by hjulenissen »
pedro.silva
« Reply #149 on: April 30, 2012, 09:58:28 AM »

"One thing to bear in mind is that the whole camera is only as fast as the fastest card installed. "

I wonder whether you possibly meant "as fast as the slowest card installed"...

cheers,
pedro
bjanes
« Reply #150 on: April 30, 2012, 11:48:46 AM »

Oops... hold the presses! Ah shucks, too late.

I got a PM from Bart this morning that there is an error in my Imatest analysis. He reminded me that the DNG files provided for download were converted into the AdobeRGB color space with gamma 2.2 already applied. Therefore, the data must be re-linearized by applying the reciprocal gamma value 1/2.2, or 0.4545, in Imatest to correct the data and plots. When I screw up, it has always been my policy to offer an immediate retraction and re-publish the corrected results, which are shown below. I've also edited the previous posts to point to this one. Thanks Bart, I should have caught that one! Sorry for the confusion.

Bob,

That was an easy error to make, since Imatest uses a gamma of 1/2.0, or 0.5, as the default for SFR, as shown on Norman's doc page, and many users (myself included) do not change the default. Did you use 2.2 for your original tests? On re-reading the docs, I see that Norman suggests including a Q-14 chart along with the slanted-edge target so one can determine the actual gamma in use. With ACR using PV2010, one can use a linear tone curve (sliders on the main tab set to zero and the point curve set to linear) and obtain reasonable results. However, with PV2012 it is not so easy to obtain a linear tone curve, and the default is far from linear.

Shown below are my Imatest results for gamma using a Stouffer wedge and ACR 7, with a linear tone curve in PV2010 and the default tone curve in PV2012. The wedge was exposed with step 1 just short of clipping; PV2012 uses a bright tone curve, so the highlights appear washed out but are not actually blown. Bart's target does include 20 density steps ranging from 255 down to zero. In his OpenPhoto forum post, he states that the image has no attached color space, but on downloading it today, I see that it is in ProPhotoRGB and the steps have even pixel spacing. If one knew more about the encoding of the image, one could determine the gamma of an image of the target. Perhaps Bart will provide some additional information.

Regards,

Bill
BJL
« Reply #151 on: April 30, 2012, 02:00:18 PM »

The distance from my eyes to my screen won't change if I take off my glasses. What will change is the angle of light entering my eye. In the same way, the AA filter may have an impact that is part of the design of the microlenses in front of the sensor. Without it, the light might miss the microlenses.
The "optical" distance changes, meaning the distance and time of travel for the light, because of the different refractive index (lower speed of light) in the birefringent (lithium niobate) slabs compared to air, and the slightly longer path taken: light is turned a bit to the left or right by the first filter element, and then turned back by the second. This will slightly move the focus position, and the focusing mechanisms (auto or manual) would need to be adjusted slightly for that. But I agree that the angle of incidence on subsequent optical elements could also be an issue.
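The focus-shift part of this can be put into numbers with the standard paraxial result for a plane-parallel plate in a converging beam: delta = t(1 - 1/n). The thickness below is an illustrative assumption, not a Nikon specification (n of roughly 2.3 is typical for the ordinary ray in lithium niobate):

```python
def focus_shift_mm(thickness_mm, n):
    """Paraxial longitudinal focus shift from inserting a plane-parallel
    plate of index n and thickness t into a converging beam:
    delta = t * (1 - 1/n). A higher index pushes the focus further back."""
    return thickness_mm * (1.0 - 1.0 / n)

# Illustrative only: a 1 mm slab with n = 2.3 shifts focus by ~0.57 mm,
# while the same thickness of ordinary glass (n = 1.5) gives ~0.33 mm.
print(round(focus_shift_mm(1.0, 2.3), 3))
print(round(focus_shift_mm(1.0, 1.5), 3))
```

The index dependence is why simply swapping the birefringent slabs for plain glass of the same thickness would not leave the focus position untouched.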

Anyway, I am not sure why we are agonizing over this: rather clearly there is a reason; otherwise Nikon would have had an obvious way to reduce manufacturing cost by simply omitting those two not-inexpensive birefringent slabs.
RobertCubit
« Reply #152 on: April 30, 2012, 02:30:54 PM »

Bob,

That was an easy error to make, since Imatest uses a gamma of 1/2.0, or 0.5, as the default for SFR, as shown on Norman's doc page, and many users (myself included) do not change the default. Did you use 2.2 for your original tests? On re-reading the docs, I see that Norman suggests including a Q-14 chart along with the slanted-edge target so one can determine the actual gamma in use. With ACR using PV2010, one can use a linear tone curve (sliders on the main tab set to zero and the point curve set to linear) and obtain reasonable results. However, with PV2012 it is not so easy to obtain a linear tone curve, and the default is far from linear.

Shown below are my Imatest results for gamma using a Stouffer wedge and ACR 7, with a linear tone curve in PV2010 and the default tone curve in PV2012. The wedge was exposed with step 1 just short of clipping; PV2012 uses a bright tone curve, so the highlights appear washed out but are not actually blown. Bart's target does include 20 density steps ranging from 255 down to zero. In his OpenPhoto forum post, he states that the image has no attached color space, but on downloading it today, I see that it is in ProPhotoRGB and the steps have even pixel spacing. If one knew more about the encoding of the image, one could determine the gamma of an image of the target. Perhaps Bart will provide some additional information.


Thanks, Bill. Yes, unfortunately I did use gamma 2.2 for the first tests, so the results were way off the mark, a simple brain fart. I thought of using the density steps in Bart’s chart to determine the gamma but was also unsure how to interpret them. I do have an old Kodak Q-13 (equivalent to the Q-14) that I use for my own testing.

Thanks for pointing out the differences in PV2012 vs PV2010; I was not aware of this issue. That will save some head scratching when I start working with raw files from the D800/E.

This may be straying a bit too far off topic into technical complexity for this thread, so I’ll try to start a new one this week to further explore Imatest methods and results. This is one of the few forums that provide such in-depth discussions, with folks who have a deep understanding of camera/lens testing and software.

Kind regards,
Bob

Wayne Fox
« Reply #153 on: April 30, 2012, 02:32:05 PM »

Anyway, I am not sure why we are agonizing over this: rather clearly there is a reason; otherwise Nikon would have had an obvious way to reduce manufacturing cost by simply omitting those two not-inexpensive birefringent slabs.
That's sort of been my feeling about it, and why I'm waiting for the E. I would love to see some official word from Nikon (other than the marketing speak) as to why they opted for this instead of just omitting it.

Slobodan Blagojevic
« Reply #154 on: April 30, 2012, 02:33:10 PM »

A question for squints: in another thread, about sharpening and desaturation, I posted a little demonstration of how aggressive sharpening creates desaturation. So the question is: if the D800 requires more aggressive sharpening, wouldn't the collateral damage be desaturation?
BartvanderWolf
« Reply #155 on: April 30, 2012, 02:47:57 PM »

Bart's target does include 20 density steps ranging from 255 down to zero. On his OpenPhoto forum post, he states that the image has no attached color space, but on downloading it today, I see that it is in ProPhotoRGB and the steps have even pixel spacing. If one knew more about the encoding of the image, one could determine the gamma of an image of the target. Perhaps Bart will provide some additional information.

Hi Bill,

The target was created without a colorspace, and saved without a colorspace/profile as a PNG (which, IIRC, doesn't carry a colorspace but does have a 'gamma' chunk). I have no control over what Photoshop writes in the PNG metadata; I suppose the gamma chunk is set, but I don't know to which value. Where it gets the ProPhotoRGB colorspace from, I have no idea (the file was saved without a profile).

In my instructions for use I suggest assigning the printer profile to the target before printing, which should bypass gamma adjustments and such, since we're just sending R=G=B values to the printer. When the target is photographed, the data is also recorded as is, and upon Raw conversion there will be a gamma pre-compensation, so we should see and measure (more or less) the same linear step spacing as in the original. That means that, theoretically, the image has a gamma close to 2.2. A plot of the luminosities of the PNG version of the D800 shot I uploaded does indeed show an almost (but not perfectly) straight tone stepping (see attached ImageJ profile plot). Therefore the correct Imatest gamma setting to linearize the data would be approximately 0.4545.
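The procedure described here (read the step luminosities, solve for the exponent) can be sketched as a log-log fit. This is a generic illustration with synthetic data, not the ImageJ gamma-variate fit mentioned later in the post:

```python
import numpy as np

def estimate_gamma(linear_steps, measured_steps):
    """If measured = linear ** (1/gamma), then log(measured) is linear in
    log(linear) with slope 1/gamma; fit the slope and invert it."""
    x = np.log(np.asarray(linear_steps, dtype=float))
    y = np.log(np.asarray(measured_steps, dtype=float))
    slope, _ = np.polyfit(x, y, 1)
    return 1.0 / slope

# Synthetic check: steps encoded with gamma 2.2 should come back as 2.2,
# and the Imatest linearization setting is then the reciprocal, ~0.4545.
linear = np.linspace(0.05, 1.0, 20)
encoded = linear ** (1.0 / 2.2)
g = estimate_gamma(linear, encoded)
print(round(g, 2), round(1.0 / g, 4))
```

With real step-wedge data the points won't fall exactly on a line, so the fitted slope is an average gamma over the tone scale rather than an exact constant.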

Of course, the easiest way to deal with these things is to use Imatest on linear-gamma Raw data, but as it is, the method is also usable. We are not operating in a laboratory here, and the tone-scale patches can be used to calibrate the image before measuring (that's why the steps are included in my target). The gray background of the target can also be used to compensate for uneven lighting of the target (and the target was indeed not uniformly lit).

Cheers,
Bart

P.S. I used ImageJ's trend-curve plotting, and its gamma-variate fit (with the c parameter representing the gamma component) puts the gamma of the profile-plot data at 2.28352 (see attachment). So, besides some other non-linearities, the best gamma setting for linearization would be 1/2.28352, or about 0.438.
« Last Edit: April 30, 2012, 03:20:08 PM by BartvanderWolf »
BartvanderWolf
« Reply #156 on: April 30, 2012, 02:55:49 PM »

A question for squints: in another thread, about sharpening and desaturation, I posted a little demonstration of how aggressive sharpening creates desaturation. So the question is: if the D800 requires more aggressive sharpening, wouldn't the collateral damage be desaturation?

Hi Slobodan,

That nice demonstration also involved using a very (too) large sharpening radius. When the correct radius is used, the risk you mentioned is not an issue.

That is also part of what I am driving at: using the correct sharpening settings (and they differ between the two Nikons). I'm close to publishing a tool with which users can determine the best sharpening settings for their personal equipment. It does require some more squinting though ...

Cheers,
Bart
« Last Edit: April 30, 2012, 05:19:11 PM by BartvanderWolf »
RobertCubit
« Reply #157 on: April 30, 2012, 04:48:44 PM »

My speculation on why the Nikon D800E does not simply omit the two (somewhat expensive) layers that make up the OLPF is that
- this would slightly change the optical path and thus where precise focus occurs, but
- this change would not affect the optical path to the PD AF sensors or the OVF,
so there would be a discrepancy, leading to slight focusing errors with both AF and MF. To correct that could require a slight mechanical change to the mirror/focusing/VF assembly, and having two versions of that assembly (one for cameras with anti-aliasing, one for cameras with aliasing) would have cost more.


Hi BJL,

I believe this is precisely the reason Nikon took this path with the OLP filter implementation (pun noticed after writing). Thirty years in high-tech manufacturing has taught me that it’s all about production costs and (hopefully) continuous improvement: providing a better product with each generation while lowering production costs and the price to the end user.

I agree that the light path must be made effectively the same for both cameras, to maintain consistency and commonality on the production line and avoid mechanical differences or adjustments between the two cameras. However, this might also be accomplished by using appropriate thicknesses of less expensive optical glass plates in the D800E to replace the crossed birefringent elements of the D800 (similar to what companies like LifePixel do when they remove the OLPF).

But there is one possibility I haven’t seen in this discussion that may explain Nikon’s choice to re-use the D800 birefringent filter(s) on the D800E. A while back I read a technical white paper on trends in the manufacturing of digital camera sensors. It wasn’t specific to Nikon, but it did mention Sony. With the goal of simplifying the manufacture of both sensors and cameras, the number of optical components in the filter stack on the sensor is being reduced. One of those components is the optical glass cover normally installed on the sensor package during sensor fab (presumably done by Sony for the D800/E cameras).

The clear sensor cover glass is being replaced with one of the two birefringent filters, making it integral with the CMOS sensor package (and not economically feasible to remove). If this is the case with the sensor Sony delivers to Nikon for the D800/E (and I think this is highly likely), it would explain Nikon’s approach of simply rotating that second birefringent filter (along with the necessity of replacing the integral wave plate/IR filter set between the two birefringent filters to eliminate the unneeded wave plate). So I think Nikon’s published diagram of the D800/E filter stack is essentially correct:

http://www.nikonusa.com/en_US/IMG/Images/Learn-And-Explore/2012/Camera-Technology/D-SLR-Series/Moire-D800-D800E/Media/OLPF_schematic.pdf

Except that it doesn’t show whether one of the birefringent plates is permanently installed on the sensor.

The cost savings here are obvious---Sony supplies one identical part to Nikon for both cameras, simplifying its sensor fab line. And Nikon only has to change its line toward the end, when the filter stack is installed, to determine whether the camera will be a D800 or a D800E. Also, the reduction in components and air-glass interfaces should result in both lower production costs and higher final image quality. The savings should more than offset the possible extra cost of the birefringent filters in the D800E. A company like LifePixel could confirm whether this is indeed the case for the D800 and other newer cameras. It will make things a bit more difficult for them when converting such a camera to remove the OLPF. Their only choice when converting a D800 to eliminate the OLPF effect may be to do what Nikon does.

For future possibilities, there’s been a lot of progress over the past few years on electrically tunable birefringent filters. They are already being used successfully in high-end space and military imaging applications. The current cost to implement these in high-volume consumer cameras may still be prohibitive, but I wouldn’t be surprised to see them in future digital cameras. Imagine having a dial or menu item where you can select a variable amount of OLP filter effect, from say 0 to 10, set according to the subject!

EDIT:

And taking this concept further, since most tunable birefringent filters are based on LCD technology, with clear thin-film electrodes applied to the active LCD plates, there is no reason that the conductive layer couldn’t be patterned into a grid that would match the sensel grid (or the Bayer pattern). If placed directly over the sensel grid, the OLPF effect could be controlled for each individual sensel (or RGB group in the Bayer filter). If combined with an in-camera image processor smart enough to detect aliasing and moiré in the scene, it could dynamically, locally resolve any aliasing issues during capture for the affected areas of the image, while leaving the unaffected parts of the scene alone.

And I’ll offer one more trip into current science fiction land, based on the same concepts. It is also possible to construct electrically-tunable neutral density filters using LCD technology. Place such an ND filter grid over the sensor and it could be possible to individually control the sensitivity of each sensel. If combined with something like Tony Kuyper’s PS luminosity masks (but implemented in the camera’s image processor), the sensor response might be locally tuned to dramatically increase dynamic range during capture---operating as an adaptable, super-GND filter automatically reducing exposure in highlight areas.

Kind regards,
Bob

« Last Edit: April 30, 2012, 07:44:53 PM by RobertCubit »

bjanes
« Reply #158 on: April 30, 2012, 05:35:21 PM »

P.S. I used ImageJ's trend-curve plotting, and its gamma-variate fit (with the c parameter representing the gamma component) puts the gamma of the profile-plot data at 2.28352 (see attachment). So, besides some other non-linearities, the best gamma setting for linearization would be 1/2.28352, or about 0.438.

Bart,

Please correct me if I am wrong, but my interpretation is that the value to enter into Imatest is the actual gamma of the image taken of the target for testing, not the gamma of the file used to print the target. That is why Norman suggests including a Q-14 target along with the slanted-edge target. In that way, the actual gamma can be computed with the Stepchart module.

Regards,

Bill
BartvanderWolf
« Reply #159 on: April 30, 2012, 05:59:35 PM »

Please correct me if I am wrong, but my interpretation is that the value to enter into Imatest is the actual gamma of the image taken of the target for testing, not the gamma of the file used to print the target. That is why Norman suggests including a Q-14 target along with the slanted-edge target. In that way, the actual gamma can be computed with the Stepchart module.

That's correct, one uses the gamma of the image taken of the target (e.g. Adobe RGB = gamma 1/2.2, ProPhoto RGB = gamma 1/1.8). To achieve a linear gamma for those images, Imatest requires those gammas as input. Norman Koren suggests using 0.5 when we don't have a better calibration (such as from a stepchart).

The confusion comes from sloppy use of the term gamma (I'm also guilty of that, but I try to use the correct values to linearize). The images are brighter than their linear-gamma versions, so it's the reciprocal value that was applied, to counteract (pre-compensate for) the display gamma and adjust for human vision.

Cheers,
Bart