CMP software doesn't use the precise 14-bit (16-bit processed) 3D LUT of the NEC PA to calibrate the display. Instead it uses the 8-bit 1D LUT of the graphics card, so the calibration is less precise and it degrades image quality.
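To see why the bit depth of the LUT matters, here is a small sketch (my own illustration, not taken from any calibration software): it pushes an 8-bit input ramp through a gamma-correction curve quantized to 8 bits (as in a graphics-card LUT) versus 14 bits (as in the monitor's internal LUT), and counts how many distinct output levels survive. The gamma values are arbitrary example numbers.

```python
# Illustrative sketch: quantizing a calibration curve to 8 bits
# collapses some of the 256 input levels onto the same output
# level, which shows up as banding in smooth gradients. A 14-bit
# LUT keeps every level distinct.

def quantized_lut(gamma, bits):
    levels = 2 ** bits
    # 256-entry ramp pushed through x**gamma, then rounded to
    # the LUT's bit depth
    return [round((i / 255) ** gamma * (levels - 1)) / (levels - 1)
            for i in range(256)]

# example correction: native gamma 2.4 pulled toward target 2.2
unique_8bit = len(set(quantized_lut(2.2 / 2.4, bits=8)))
unique_14bit = len(set(quantized_lut(2.2 / 2.4, bits=14)))
print(unique_8bit, unique_14bit)
```

With the 8-bit LUT, fewer than 256 unique levels come out the other side; with 14 bits, all 256 survive. That lost tonal resolution is the "spoiled image quality" being described.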
The calibration keeps the gamut as wide as it is, but you can't limit the gamut during calibration to a smaller one (AdobeRGB or sRGB, for instance).
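The reason the graphics-card path can't shrink the gamut is structural: a 1D LUT applies one curve per channel independently, so pure red in always comes out as pure red. Mapping a wide-gamut primary toward a smaller space requires mixing channels, e.g. via a 3x3 matrix or a 3D LUT, which is what the monitor's internal processing can do. A quick sketch (the matrix values are made-up illustrative numbers, not real monitor primaries):

```python
# A 1D LUT cannot desaturate a primary: each channel is
# transformed in isolation, so (255, 0, 0) can never gain
# green or blue. Cross-channel mixing (matrix or 3D LUT) can.

def apply_1d_lut(rgb, lut):
    # one curve per channel, no interaction between channels
    return tuple(lut[c] for c in rgb)

def apply_matrix(rgb, m):
    # each output channel is a blend of all three inputs
    return tuple(
        min(255, max(0, round(sum(m[i][j] * rgb[j] for j in range(3)))))
        for i in range(3)
    )

identity_lut = list(range(256))
# hypothetical wide-gamut -> smaller-gamut matrix: desaturates
# red by leaking some of it into green and blue
wide_to_small = [[0.85, 0.15, 0.00],
                 [0.10, 0.90, 0.00],
                 [0.05, 0.00, 0.95]]

pure_red = (255, 0, 0)
print(apply_1d_lut(pure_red, identity_lut))  # stays pure red
print(apply_matrix(pure_red, wide_to_small))  # gains G and B
```

However you shape the three per-channel curves in the graphics-card LUT, the first result can never acquire green or blue components, which is exactly why gamut emulation needs the monitor-side processing.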
Can someone correct and bolster my flimsy understanding of the difference?
As I understand it, if you calibrate with the Munki device and the X-Rite software (each called "ColorMunki Photo"), the process uses your expensive NEC monitor as a light source only: the calculations are done by the software, then stored and applied by the OS via the graphics card. If you use SpectraView II, the Munki device is still used to measure the monitor's output, but the calculations are done by NEC's SpectraView II. The process is different (and presumably better; among other things, it communicates directly with the monitor), and the results are stored in the monitor and applied by the monitor itself. I'm also under the impression that SpectraView II provides wide-gamut calibration, and that ColorMunki does not.
Thanks in advance.