Found this on Eizo site:
"The ColorEdge CG220 incorporates a powerful new EIZO-developed ASIC (Application Specific Integrated Circuit) with 14-bit color processing capability (16 times more accurate than 10-bit). This allows a larger number of grayscale increments, for grayscale rendering that is on a par with high-end CRT monitors. The result is a much greater degree of color detail, especially in dark areas and shadows."
And further down in the specs: "Display colors: 16.7 million from palette of 1.06 billion". It also supports the Adobe RGB color space natively.
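A quick back-of-envelope check of those spec numbers (my own arithmetic, not Eizo's - their "1.06 billion" rounding comes out slightly differently than mine):

```python
# 8 bits per channel across R, G, B -> the "16.7 million" display colors.
display_colors = (2 ** 8) ** 3    # 16,777,216

# 10 bits per channel -> the "1.06 billion" palette (2**30 is ~1.07 billion,
# so Eizo appears to round a bit conservatively).
palette_colors = (2 ** 10) ** 3   # 1,073,741,824

# 14-bit internal processing vs 10-bit: 16x as many grayscale steps,
# which is where the "16 times more accurate" line comes from.
step_ratio = 2 ** 14 // 2 ** 10   # 16

print(display_colors, palette_colors, step_ratio)
```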
If you're able to part with the money necessary to get it, this LCD can probably fit your requirements.
I'm guessing here as I don't own one, but I presume the CG220 can apply the adjustments suggested by a hardware calibrator internally, instead of on the graphics card. To picture the advantage of this, imagine doing a few Curves and Levels manipulations on an 8-bit image, versus converting that image to 16 bits for the manipulations and then back to 8 bits after you're done. Both ways start and end with only 8 bits per channel, but the second should give you a smoother histogram - and hence smoother color gradients - at the end.
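You can see the effect in a toy version of that round trip (my own illustration, not anything Eizo publishes): darken a gray ramp by half, then brighten it back, once staying in 8 bits the whole way and once going through a 16-bit intermediate.

```python
ramp = list(range(256))  # a full 8-bit gray ramp, levels 0..255

# Path A: both adjustments are quantized back to 8 bits as they happen.
a = [round(x * 0.5) for x in ramp]       # darken, rounded to 8-bit steps
a = [min(v * 2, 255) for v in a]         # brighten back, clipped to 8-bit range

# Path B: promote to 16 bits (x * 257 maps 0..255 onto 0..65535),
# do both adjustments there, then quantize to 8 bits exactly once.
b = [round(x * 257 * 0.5) for x in ramp]            # darken at 16-bit precision
b = [round(min(v * 2, 65535) / 257) for v in b]     # brighten, one final 8-bit round

print(len(set(a)), len(set(b)))  # 129 vs 256 distinct levels survive
```

Path A's early rounding collapses neighboring levels for good, so the brighten step can only space out the survivors - that spacing is exactly the combing you'd see in the histogram. Path B loses essentially nothing until the single final rounding.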
The point of all that is that Eizo's LCD will not actually display the extra 2 bits' worth of shades from a Matrox Parhelia. Yes, it will (should?) show better gradients than other LCDs, but you're still only putting 8 bits per channel in and getting 8 bits out.
Yes, other cards now have 10-bit DACs. These help when the card applies the calibration adjustments for a CRT in hardware (versus an LCD, where the adjustments are made either on the LCD itself a la Eizo, or in software a la most others). However, Matrox is the only one I know of that offers a plugin actually allowing your software to take advantage of those 10 bits and display more than 256 distinct levels of intensity.