Author Topic: >>>))) GRAPHICS CARD (((<<<  (Read 5345 times)
JohnKoerner
Guest
« on: January 12, 2009, 08:14:12 PM »

I am looking to get the NEC LCD2690WUXi + SV-II software (the older one, not the new WUXi2), but it seems like I should also get a top-shelf graphics card to go with it.

Most of the PC magazine reviews talk about "gaming" graphics cards, but I am not looking to get into gaming; I just want a top-shelf choice to help my new monitor display colors to its best potential.

What are some of the top choices for the specific needs of a photographer trying to get the most out of his colors? (I have a PC, not a Mac.)

Thanks for any input,

Jack
DarkPenguin
Guest
« Reply #1 on: January 12, 2009, 08:26:39 PM »

So far as I know it shouldn't matter. The inputs are digital these days; most cards can output 127,188,204 just as well as the next card. Back in the day, the output could differ quite a bit.

Edit: As I think about this, CS4 is using OpenGL for a bunch of stuff. So I don't know.
« Last Edit: January 12, 2009, 08:30:41 PM by DarkPenguin »
Arlen
Guest
« Reply #2 on: January 12, 2009, 08:36:39 PM »

If you are going to run Photoshop CS4, it can now utilize a 3D graphics card for some of its new effects. However, a lot of people are having trouble with incompatibilities whose origins are not clear. A number of threads on the Adobe Photoshop help forums have hundreds of posts about the problems and attempts to solve them. I myself experienced problems with a newly purchased 3D graphics card, so I took it back out of my computer and am sticking with my trusty 2D Matrox until the problems get sorted out.

I'm using my Matrox Millennium P650 PCIe 128 card with the NEC LCD2690WUXi + SV-II, and it seems to be working fine.
« Last Edit: January 12, 2009, 08:45:43 PM by Arlen »
Paul Sumi
Sr. Member
****
Offline

Posts: 1215

« Reply #3 on: January 12, 2009, 08:58:47 PM »

Quote from: Arlen
I'm using my Matrox Millennium P650 PCIe 128 card with the NEC LCD2690WUXi + SV-II, and it seems to be working fine.

Actually, unless you are running PS CS4 or something else that needs OpenGL, many video cards with at least 64 MB will drive the NEC 2690 (see the NEC site for minimum specs). While I was waiting for my new PC, I was using the 2690 with a six-year-old NVidia card.
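A quick back-of-the-envelope check of that 64 MB figure, assuming the 2690's native 1920x1200 resolution and a 32-bit desktop (a sketch, not a statement about any specific card):

```python
# Framebuffer memory needed to drive a 1920x1200 desktop for 2D work.
width, height, bytes_per_pixel = 1920, 1200, 4  # 8 bits each for R, G, B + padding
frame_bytes = width * height * bytes_per_pixel
frame_mb = frame_bytes / (1024 ** 2)
print(f"one frame: {frame_mb:.1f} MB")            # ~8.8 MB
print(f"double-buffered: {2 * frame_mb:.1f} MB")  # ~17.6 MB, well under 64 MB
```

Even double-buffered, a 2D desktop at this resolution uses a small fraction of 64 MB, which is why old cards drive the panel fine.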
« Last Edit: January 12, 2009, 08:59:56 PM by PaulS »

JohnKoerner
Guest
« Reply #4 on: January 13, 2009, 12:42:09 PM »

Oh, okay, I guess I will be fine then.
neil snape
Sr. Member
****
Offline

Posts: 1432

« Reply #5 on: January 15, 2009, 04:45:20 AM »

I'll bet that the majority of future image-editing and reference monitors will use DisplayPort rather than DVI.

I don't know where the PC cards are with this, but Macs are shipping with it, at least the portables and iMacs. The next-gen Mac Pro should have it too.
It seems that the video card processors on certain models can output 10-bit-per-channel data, but few implement passing 10-bit over DVI.
The newer monitors can run 10-bit through DisplayPort, so interaction between 10-or-more-bit data and 10-14-bit onboard LUTs is probably the way this will go.

I do know the HP DreamColor works in 10-bit, with 14-bit internal LUTs for calibration and control, if using DisplayPort, but not over DVI (unfortunately all there is on the aging Mac Pro line).
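For context on why those bit depths matter, the number of distinct levels per channel grows quickly with bit depth. This is plain arithmetic, not specific to any card or monitor:

```python
# Distinct levels per channel at the bit depths discussed above.
for bits in (8, 10, 12, 14):
    levels = 2 ** bits
    step_pct = 100 / (levels - 1)  # smallest step as a percentage of full scale
    print(f"{bits:2d}-bit: {levels:5d} levels, {step_pct:.4f}% per step")
```

An 8-bit path quantizes each channel to 256 levels, a common cause of visible banding in smooth gradients; a 10-bit path has four times finer steps, and the 12-14-bit internal LUTs leave headroom so calibration does not throw levels away.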
jerryrock
Sr. Member
****
Offline

Posts: 553

« Reply #6 on: January 15, 2009, 07:37:19 PM »

Quote from: neil snape
I'll bet that the majority of future image-editing and reference monitors will use DisplayPort rather than DVI.

I don't know where the PC cards are with this, but Macs are shipping with it, at least the portables and iMacs. The next-gen Mac Pro should have it too.
It seems that the video card processors on certain models can output 10-bit-per-channel data, but few implement passing 10-bit over DVI.
The newer monitors can run 10-bit through DisplayPort, so interaction between 10-or-more-bit data and 10-14-bit onboard LUTs is probably the way this will go.

I do know the HP DreamColor works in 10-bit, with 14-bit internal LUTs for calibration and control, if using DisplayPort, but not over DVI (unfortunately all there is on the aging Mac Pro line).

The Mac line is shipping with Mini DisplayPort, which is not compatible with the DisplayPort connectors currently deployed in some monitors (like the HP LP2480zx DreamColor).

I own a Mac Pro with an ATI X1900XT video card, which can pass 10-bit color output through DVI. I also own the DreamColor monitor, which can accept 10-bit input. It also accepts 8-bit color input and, through the built-in two-stage 12-bit LUT, can upconvert the signal to 10 bits per color.
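A minimal sketch of the bit-depth rescaling part of that idea: 8-bit input codes mapped to 10-bit output codes through a LUT evaluated in floating point, so intermediate precision exceeds the output depth. The names are illustrative only, not HP's actual firmware, and a real monitor LUT would also fold in the calibration curve:

```python
# Illustrative only: rescale 8-bit input codes (0-255) to 10-bit output
# codes (0-1023) via a lookup table computed at float precision.
def build_8_to_10_lut():
    return [round(code / 255 * 1023) for code in range(256)]

lut = build_8_to_10_lut()
print(lut[0], lut[255])  # 0 1023 -> the full 10-bit range is used
```

The point of doing this inside the monitor at 12-bit (or higher) precision is that the rounding happens once, after any calibration math, instead of accumulating at 8 bits.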

Jerry

Gerald J Skrocki
skrockidesign.com
Plekto
Sr. Member
****
Offline

Posts: 551

« Reply #7 on: January 15, 2009, 07:54:49 PM »

I have an ATI X1900XT myself, and it plays all the games as well as doing great things with graphics.

I refuse to touch NVidia because everything they make requires their all-in-one Forceware app. Basically it installs what it thinks is the right driver and that's it: no choices, no overrides, and on their site no option for manual drivers. I've had numerous clients' machines, as well as one of my older ones, hard crash (kernel panic/BSOD on boot), and my only recourse was to restore from the CD that came with the card. Catch-22.

ATI, on the other hand, lets me roll the driver back a version or two and life is good. NVidia used to be the second-best company after 3dfx; NVidia bought out 3dfx and things were great for a while, then it all went south. ATI isn't as good as NVidia when it comes to features or stability, but there are few system builders I know of that choose NVidia products.

EDIT:
Get the X1900XT 512 GDDR3 version. It's the best value for the dollar, and way overkill for anything most sane people will ever use it for. If you need hi-def and so on, the HD4850 is basically the same thing but with HD and more features. Both are well under $200 new. ASUS makes the best-quality ATI video cards that I know of, but they tend to cost 10-20% more.

If you want silent cooling, though, check this out:
http://www.silentpcreview.com/

I frequent the site myself, and it's a godsend for those who want less noise in their computing life. Yes, it is perfectly possible to make a full-bore PC as quiet as a Mac, usually for only about $150 more in modifications (or about $50 more if done when you build the machine yourself).

http://www.silentpcreview.com/article851-page1.html
A review of this card as well. If it sucked, this site would clearly say so (it doesn't, of course; it's a very good card).

http://www.arctic-cooling.com/vga2.php?idx=147
A 0 dB video card cooler. Yes, it's essentially an enormous radiator, but it really works. For the HD4850, though, get the nearly silent (for real!) accessory fans; they make a huge difference. I can't hear it at all over the hard drive, let alone the power supply fan in my machine.
« Last Edit: January 15, 2009, 08:16:33 PM by Plekto »
neil snape
Sr. Member
****
Offline

Posts: 1432

« Reply #8 on: January 16, 2009, 01:34:15 AM »

Quote from: jerryrock
The Mac line is shipping with Mini DisplayPort, which is not compatible with the DisplayPort connectors currently deployed in some monitors (like the HP LP2480zx DreamColor).

I own a Mac Pro with an ATI X1900XT video card, which can pass 10-bit color output through DVI. I also own the DreamColor monitor, which can accept 10-bit input. It also accepts 8-bit color input and, through the built-in two-stage 12-bit LUT, can upconvert the signal to 10 bits per color.

Jerry
The mini is an abstraction, but it will adopt DisplayPort configurations if Apple and others have their way.
As I said, DVI can carry 10 bits per channel, but that is not often employed. I have the same card in my Mac and also have the DreamColor display. There are PC cards, many top-line Matrox cards, that have been 10-bit for some years already. They are not, however, passing 10 bits per channel to the monitor for calibration, which is only possible with DisplayPort. There are some adapters for Mini DisplayPort that are not only expensive but not proving reliable. I don't have the latest MacBook / MacBook Pro, so I can't say whether the mini port works, or whether it is just a software bugfix away or is limited by hardware.
But I can say that, from what I see coming, I would want DisplayPort on a PCIe card for any future video card.

jerryrock
Sr. Member
****
Offline

Posts: 553

« Reply #9 on: January 16, 2009, 07:36:26 AM »

Quote from: neil snape
I have the same card in my Mac and also have the DreamColor display. There are PC cards, many top-line Matrox cards, that have been 10-bit for some years already. They are not, however, passing 10 bits per channel to the monitor for calibration, which is only possible with DisplayPort.

Can you explain why you claim the ATI X1900XT does not pass a 10-bit signal to the DreamColor display? You also left out the HDMI connection on the DreamColor display, which can also receive a 10-bit-per-channel signal.

Jerry

Gerald J Skrocki
skrockidesign.com
neil snape
Sr. Member
****
Offline

Posts: 1432

« Reply #10 on: January 16, 2009, 08:35:06 AM »

Quote from: jerryrock
Can you explain why you claim the ATI X1900XT does not pass a 10-bit signal to the DreamColor display? You also left out the HDMI connection on the DreamColor display, which can also receive a 10-bit-per-channel signal.

Jerry

I have no way to verify whether or not 10-bit output is possible for calibration. Since this is what is documented, unless someone like Graeme Gill, Dany Pascal, or Will Hollingworth can verify or confirm it, I'll have to believe the documentation. Speaking with Tom Lianza at X-Rite tells me that the benefits are real if you can indeed send 10-bit, but doing so isn't easy on most host computers at this time.
I clipped the text that says HDMI can do 10 bits, but I am not aware of video cards with HDMI; mind you, as I said before, I edit on Mac, so I have to wait for Apple to use cards that PCs may already have...
Farmer
Sr. Member
****
Offline

Posts: 1607

« Reply #11 on: January 16, 2009, 09:18:00 PM »

I have to disagree with Plekto here, at least to the extent that I have always found nVidia cards to be rock solid and their driver method easy and effective. This is across XP (SP1, 2, and 3), Vista 32, and Vista 64 (both SP1). I have ATI in my laptop and at work and they're fine. At home it's nVidia in several machines (from workstation to games machine to home-theatre PC, etc.). I don't mind ATI at all, but I've not had issues with nVidia either, and I tend to prefer their customisation options in the driver (a minor thing).

Plekto
Sr. Member
****
Offline

Posts: 551

« Reply #12 on: January 19, 2009, 08:59:13 PM »

Yes, but when NVidia's driver software goes south, you're left with no option to manually install specific older drivers. With ATI, I can roll it back a version or two, and usually that's all that is required.

That said, if NVidia's cards do work for you, they are generally slightly more reliable and robust. I just haven't had good luck with them being compatible with the latest DirectX upgrades.
Vautour
Newbie
*
Offline

Posts: 26

« Reply #13 on: January 22, 2009, 04:51:22 AM »

I also wouldn't worry too much about graphics card choice. Signal differences (at least with digital output) are no longer of any concern between the various models. I would give more attention to things like silent operation and sturdy build. Cards made by Asus are usually quite good; Gigabyte and MSI too.
As to Nvidia or ATI, well, that depends. ATI has a long history of not-so-reliable drivers; Nvidia has been better in that respect. Both had problems getting decent drivers out for Vista at the beginning, but now they're good (if you don't use SLI or Crossfire, which can still be a little tricky: some people have no problems, others are going mad). To be on the safe side, get the WHQL drivers rather than the latest uncertified drivers, unless they fix a problem you experience yourself.
Any card with more than 64MB of video RAM should be sufficient for 2D tasks, so it doesn't matter much which GPU you use for photo editing: those tasks are performed by the CPU, and the CPU needs system RAM. The graphics card is only there to display the output.
Of course, if you're doing things like video encoding or playing back HD video content, then you should look for cards with GPUs that support those features. ATI has a slight edge here at the moment, but that changes about every three to six months. It could be Nvidia next time.
Now, if you're also going to use Photoshop CS4, then it gets interesting. Some tasks can now be done on the GPU, and here it's true: faster is better. These effects are OpenGL-based, and here Nvidia has traditionally been somewhat faster than competing ATI (OK, AMD now; sorry, I can't quite get the hang of the new name) cards, but AMD has come close, sometimes even overtaking Nvidia (it depends on the application). You don't necessarily need top-of-the-line Quadro cards. Those cards are usually almost identical to consumer parts as far as the GPU goes; they have more video RAM and special drivers that can be used by applications like 3ds Max or AutoCAD, and some can drive more displays at once. But from a performance point of view (save in said applications) they're traditionally not much faster than their consumer counterparts, and you'll be paying a hefty premium over the consumer cards.

I don't know how much you want to spend, but I think a 1GB card with an Nvidia GTX 260 (the Core 216 is a bit faster) or an AMD 4870 GPU would be best from a price/performance standpoint. They're about equal in performance.

If you do not use PS CS4, an AMD 4670-based card (preferably passively cooled) would be a good alternative, since it has decent video acceleration. But it depends on how much such a card costs; faster models might get interesting (AMD 48xx, Nvidia 98xx, 9600GT). It depends on the current pricing.

Of course, if you can afford it and use PS CS4, THE card to buy would be one based on the Nvidia Quadro CX: basically the same as a GTX 260, but with more support for things such as video encoding in Premiere or accelerated RAW and DNG decoding. Here in Germany such a card starts at about €1700 (GTX 260 Core 216 cards begin at about €260).

So, your choice.
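Taking the street prices quoted above at face value, the premium for the Quadro CX over a GTX 260 Core 216 of similar raw speed is easy to put a number on:

```python
# Price ratio from the figures in the post above (Germany, early 2009).
quadro_cx_eur = 1700
gtx260_core216_eur = 260
ratio = quadro_cx_eur / gtx260_core216_eur
print(f"Quadro CX costs about {ratio:.1f}x a GTX 260 Core 216")  # about 6.5x
```

So the workstation card costs roughly six and a half times as much; the premium buys the Premiere/RAW acceleration and certified drivers, not raw GPU speed.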
NigelC
Sr. Member
****
Offline

Posts: 513

« Reply #14 on: January 22, 2009, 06:13:00 AM »

Quote from: Vautour
I also wouldn't worry too much about graphics card choice. Signal differences (at least with digital output) are no longer of any concern between the various models. I would give more attention to things like silent operation and sturdy build. Cards made by Asus are usually quite good; Gigabyte and MSI too.
As to Nvidia or ATI, well, that depends. ATI has a long history of not-so-reliable drivers; Nvidia has been better in that respect. Both had problems getting decent drivers out for Vista at the beginning, but now they're good (if you don't use SLI or Crossfire, which can still be a little tricky: some people have no problems, others are going mad). To be on the safe side, get the WHQL drivers rather than the latest uncertified drivers, unless they fix a problem you experience yourself.
Any card with more than 64MB of video RAM should be sufficient for 2D tasks, so it doesn't matter much which GPU you use for photo editing: those tasks are performed by the CPU, and the CPU needs system RAM. The graphics card is only there to display the output.
Of course, if you're doing things like video encoding or playing back HD video content, then you should look for cards with GPUs that support those features. ATI has a slight edge here at the moment, but that changes about every three to six months. It could be Nvidia next time.
Now, if you're also going to use Photoshop CS4, then it gets interesting. Some tasks can now be done on the GPU, and here it's true: faster is better. These effects are OpenGL-based, and here Nvidia has traditionally been somewhat faster than competing ATI (OK, AMD now; sorry, I can't quite get the hang of the new name) cards, but AMD has come close, sometimes even overtaking Nvidia (it depends on the application). You don't necessarily need top-of-the-line Quadro cards. Those cards are usually almost identical to consumer parts as far as the GPU goes; they have more video RAM and special drivers that can be used by applications like 3ds Max or AutoCAD, and some can drive more displays at once. But from a performance point of view (save in said applications) they're traditionally not much faster than their consumer counterparts, and you'll be paying a hefty premium over the consumer cards.

I don't know how much you want to spend, but I think a 1GB card with an Nvidia GTX 260 (the Core 216 is a bit faster) or an AMD 4870 GPU would be best from a price/performance standpoint. They're about equal in performance.

If you do not use PS CS4, an AMD 4670-based card (preferably passively cooled) would be a good alternative, since it has decent video acceleration. But it depends on how much such a card costs; faster models might get interesting (AMD 48xx, Nvidia 98xx, 9600GT). It depends on the current pricing.

Of course, if you can afford it and use PS CS4, THE card to buy would be one based on the Nvidia Quadro CX: basically the same as a GTX 260, but with more support for things such as video encoding in Premiere or accelerated RAW and DNG decoding. Here in Germany such a card starts at about €1700 (GTX 260 Core 216 cards begin at about €260).

So, your choice.
I've just got a new Vista 64 box that I haven't even connected up yet. Hope I haven't made the wrong choice of card! It's got an nVidia 9800GT with 512MB, because I didn't think I needed a high-end card. I plan to upgrade to CS4 soon; is my card not up to using all the features of CS4? (Rest of system: Q6600 overclocked to 4x3GHz, 8GB RAM, 2x 640GB in RAID0, ACD 23".)
Vautour
Newbie
*
Offline

Posts: 26

« Reply #15 on: January 22, 2009, 10:21:48 AM »

Quote from: NigelC
I've just got a new Vista 64 box that I haven't even connected up yet. Hope I haven't made the wrong choice of card! It's got an nVidia 9800GT with 512MB, because I didn't think I needed a high-end card. I plan to upgrade to CS4 soon; is my card not up to using all the features of CS4? (Rest of system: Q6600 overclocked to 4x3GHz, 8GB RAM, 2x 640GB in RAID0, ACD 23".)

You shouldn't be too concerned about that. That card is way faster than the minimum requirement stated by Adobe, so the accelerated effects will most likely run faster than they did in CS3. On Ars Technica the reviewer stated that even a 7300GT had a positive effect (it slowed the system down somewhat in some cases, but the 9800GT is far faster than the 7300GT, or the 8600M in the MacBook Pro, that he mentioned). But generally: faster is almost always better, as it is with CPUs.

Besides, you can test the acceleration by turning it off and seeing the effect for yourself, but I don't think your card will hold your system back. It remains to be seen how many functions Photoshop will be able to hand to the GPU in future versions, and by then there'll be a whole new generation of GPUs out there.
« Last Edit: January 22, 2009, 10:23:13 AM by Vautour »
erick.boileau
Sr. Member
****
Offline

Posts: 469

« Reply #16 on: February 05, 2009, 11:32:40 AM »

With the NEC LCD2690WUXi plus a second 22" screen (for tools), I am using an NVIDIA GeForce 8800 GTS (512 MB).