The hardware on most newer discrete Nvidia and ATI cards is capable of 10-bit output, but the companies may still block the capability in the driver, so you will need to confirm with their customer support. I believe you need a DisplayPort connection rather than HDMI to get 10-bit (someone can check me on that). With a monitor that large and that many pixels to drive, I wouldn't go with a lower-end card, but you won't need one of the super expensive cards either unless you want to do gaming.
This is from the nVidia site:
NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI.
This appears to imply that every element in the chain (application software, operating system, video card, video connection, and display) must support 10-bit color in order to achieve true 10-bit output. I use Lightroom, and I'm guessing it uses the Windows API and is therefore limited to 8-bit. Is this true?
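For what it's worth, here is a minimal sketch of what "an OpenGL 10-bit per color buffer" means at the code level on Windows: the application has to ask the driver for a pixel format with 10 bits per channel through the WGL_ARB_pixel_format extension, and the driver either offers one or it doesn't. This is only an illustration, not how Photoshop or Lightroom actually do it; it assumes a dummy OpenGL context is already current so the extension entry point can be loaded, and whether the call succeeds is exactly the Quadro/GeForce and DisplayPort question quoted above.

/* Sketch: ask WGL for a 10-bit-per-channel ("30-bit color") pixel format.
 * Tokens below come from the WGL_ARB_pixel_format extension (normally
 * provided by wglext.h). */
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

#define WGL_DRAW_TO_WINDOW_ARB  0x2001
#define WGL_SUPPORT_OPENGL_ARB  0x2010
#define WGL_DOUBLE_BUFFER_ARB   0x2011
#define WGL_PIXEL_TYPE_ARB      0x2013
#define WGL_RED_BITS_ARB        0x2015
#define WGL_GREEN_BITS_ARB      0x2017
#define WGL_BLUE_BITS_ARB       0x2019
#define WGL_ALPHA_BITS_ARB      0x201B
#define WGL_TYPE_RGBA_ARB       0x202B

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)
    (HDC, const int *, const FLOAT *, UINT, int *, UINT *);

/* Returns the pixel-format index of a 10-bit RGBA format for this device
 * context, or 0 if the driver only exposes 8-bit formats. */
int find_10bit_pixel_format(HDC hdc)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)
            wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return 0;                      /* extension not available */

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10,        /* 10 bits per color channel */
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB,  10,
        WGL_ALPHA_BITS_ARB, 2,         /* R10G10B10A2 layout */
        0                              /* terminator */
    };

    int format = 0;
    UINT count = 0;
    if (wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count)
        && count > 0)
    {
        printf("Driver offers a 10-bit pixel format: index %d\n", format);
        return format;
    }
    return 0;                          /* only 8-bit formats offered */
}

The bit-depth attributes are "minimum" criteria in that extension, so if the driver (or the GeForce/Quadro product split) caps the desktop at 8 bits per channel, the call simply returns no matching formats, which is why the application, driver, card, and connection all have to cooperate.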