Author Topic: Does the GPU really matter for photography?  (Read 10762 times)
Dan Wells
Sr. Member, Posts: 376
« on: April 23, 2013, 01:38:57 PM »

    I've finally gotten tired of waiting for Apple to upgrade the Mac Pro, and am building a Hackintosh - even though it's my first Hackintosh, I have a lot of experience with Macs, and with putting hardware together, so I feel very confident about the project overall. The one question I can't find an answer to is what video card (GPU) to use...The Hackintosh forums are full of people using $400-$500 GeForces and Radeons (and both seem to work), but they are buying their expensive cards for gaming or 3D work... Nobody seems to have looked at GPUs in the context of photo editing only (along with some video editing, but no After Effects or 3D work). Will I regret just throwing a $100 GPU in an expensive machine, or would I regret wasting money on a more expensive GPU, because my applications don't benefit?

            -Dan
AFairley
Sr. Member, Posts: 1222
« Reply #1 on: April 23, 2013, 02:32:24 PM »

My personal experience running Lightroom on a Windows machine and using a 30" monitor is that a more powerful GPU reduces the lag when you make adjustments with the sliders.  But screen refresh lag of that kind is not a problem if you are using a smaller monitor (or zooming the image smaller when you edit on the large monitor).
Craig Lamson
Sr. Member, Posts: 776
« Reply #2 on: April 23, 2013, 03:42:51 PM »

Quote from: Dan Wells (original question above)

Be REALLY careful what parts you buy. Trying to get some hardware working on a hack can be downright brutal. TonyMac has the best site and a great parts list for you to follow. I've built two over the last few years, one with an ATI 5000-series GPU and the latest with a GTX 670. Newer ATI cards are now starting to be supported as of 10.8.3. If you want to talk hacks, send me a private message.

I just bought some low-end parts to help my 14-year-old nephew build his first computer. I test-assembled it to make sure nothing was DOA before I travel to them, and loaded 10.8.3 using the Intel HD 4000 built-in graphics. Worked a treat!
Craig Lamson Photo
www.craiglamson.com

Dan Wells
Sr. Member, Posts: 376
« Reply #3 on: April 23, 2013, 04:53:28 PM »


I've been reading Tonymac for quite a while, and have chosen highly compatible parts - there are several Hackintoshes VERY closely related to what I'm doing that are already running very well. This is going to be a powerful machine, easily equivalent to a midrange 2013 Mac Pro (if such a thing were to exist), and faster than any present Mac Pro except for the fastest version of the 12-core. The final decision is whether to throw in a $100 graphics card (it has no motherboard graphics, because it's an X79 / i7-3930K 6-core), or install a powerful card - obviously, I wouldn't skimp on the GPU in a machine of this price and power if there were a real benefit to be had by going higher-end, but I'm not sure that there is any real benefit for a pure photo/video machine (I am a landscape photographer and photography professor, and I may very well start doing some video work - I've had some ideas about the moving image in the landscape - but I have no interest in gaming beyond the occasional round of Angry Birds while riding the bus to work, and I can't see myself doing 3D or CAD, either).

In case anybody's interested, I've posted a description of the proposed machine below (all parts from Tonymac, and there are machines configured almost exactly this way in highly successful builds - one of them is what they call a Golden Build, which is the most stable group of computers people have put together). This will run Mountain Lion (either 10.8.3 or 10.8.4 if that's out by the time this sees the light of day).

CPU: Intel i7-3930K. To go any faster than this $600 CPU requires either a $1000 chip (and it's only 5% faster) or dual $1500+ chips for a speedup of only 15-20%. It's just not worth it. On the other hand, photographic applications DO tend to be multithreaded, and this 6-core CPU is quite a bit faster than the i7-3770 in the fastest iMac (plus these 6-core Socket 2011 parts support 64 GB of RAM). I'll overclock the CPU modestly (I'm thinking 4.2 GHz, which is easy on this chip and motherboard, and still runs cool and relatively energy-efficiently), while emphasizing stability over a huge overclock.

Motherboard: Asus Rampage IV Extreme. This has a reputation as the most stable Socket 2011 board around, largely due to extremely high-quality power regulation. Some Socket 2011 boards, including the Gigabyte that is the other popular board on Tonymac, have 8 RAM slots but aren't always totally stable with RAM in all 8 slots - this one will run all day with 64 GB!

RAM: Either 32 or 64 GB of DDR3 1600 MHz RAM. I would start with 32 GB, then go to 64 only if necessary, except that several articles say the most stable way to get to 64 GB is with all the RAM purchased as a single set of 8 DIMMs, and I'd hate to throw out 32 GB of perfectly good RAM in a year or two...

CPU cooling: Closed-loop water cooling (probably a Corsair H100i, or maybe a Swiftech). A nice compromise between noisy air cooling and the complexity of open loop water...

Case: Fractal Design Define XL R2 - look at the disk configuration, and that's why the big case with a million drive bays...

Power Supply: Seasonic Platinum (final wattage to be determined once the graphics card is chosen). The Seasonic Platinum series are the best, most stable, most efficient power supplies on the consumer market. To do better would mean a MUCH more expensive server or lab instrument power supply.

Boot disk: Samsung 840 Pro SSD (512 GB). Fast and reliable. There will also be a boot disk clone on a commodity hard disk, just in case some update messes the system up - important on any machine, but especially a Hackintosh...

Primary storage: 8 Western Digital Caviar Red 3 TB drives in RAID 6, driven off a RocketRAID 2720SGL. About 16 terabytes of usable space after formatting and RAID! Not cheap, but a lot less expensive than this kind of storage would have been until recently.


All in all, this aims to be the Mac Pro Apple should be making - fast, reliable and capacious storage... They used to make 'em like this, but then started getting so obsessed with iToys that they forgot to make Macs for the people who kept them going in their darkest days - photographers and filmmakers who stayed loyal to the Mac when EVERYONE used PCs...
   

                     -Dan
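For anyone checking the storage math in that last entry: RAID 6 gives up two drives' worth of capacity to parity, and drive makers' decimal terabytes shrink when the OS reports binary units. A quick sketch of the arithmetic (the drive count and size are from the build above; the unit conversion is the standard one):

```python
# RAID 6 usable capacity for the 8 x 3 TB array described above.
drives = 8
drive_tb = 3.0            # marketing terabytes (10^12 bytes)

# Two drives' worth of space goes to parity in RAID 6.
raid6_usable_tb = (drives - 2) * drive_tb
print(raid6_usable_tb)    # 18.0 decimal TB

# The OS reports binary units (1 TiB = 2^40 bytes), so the same
# array shows up as roughly:
raid6_usable_tib = raid6_usable_tb * 1e12 / 2**40
print(round(raid6_usable_tib, 1))  # ~16.4 TiB -- the "about 16 terabytes" above
```

So the "about 16 terabytes of usable space" figure checks out once parity and binary reporting are accounted for.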
Craig Lamson
Sr. Member, Posts: 776
« Reply #4 on: April 23, 2013, 07:23:06 PM »

Quote from: Dan Wells (full build description above)

Sounds like a very nice machine. I went Z77X instead, mostly because of some issues with the X79 builds. It has been strong and stable, and it started up from the first boot with zero problems.

Best of luck.
Dan Wells
Sr. Member, Posts: 376
« Reply #5 on: April 23, 2013, 09:43:30 PM »

I just discovered a VERY interesting difference from a photographer's standpoint between AMD and NVidia GPUs. All NVidia GPUs (except the Titan, which WILL drive a 4K display, but also costs $1000) are locked to no more than a present-day 30" display (2560x1600) over any single output. To drive a 4K display, or anything else with shockingly high resolution (Apple Retina Cinema Display?), you need to combine two or more outputs (probably the two DVI ports, because they are the only two similar ports). This hack works, but it also causes the operating system to treat the single display as two displays - menu bars won't work right, you can't maximize a window except by dragging, etc. This doesn't matter in games, because the game takes over the display fully, or in full-screen video playback (again, the display is taken over and out of the hands of the operating system), but it DOES matter for high-resolution Photoshop! The most practical way to run Photoshop above 2560x1600 on an NVidia card is to use two monitors, with the high-resolution one as a dedicated image-only secondary display and the menus and palettes on a lower-resolution screen that the OS recognizes as a single monitor - the cards DO have three outputs (actually four, but they're an odd combination), so it's possible.

There are two setups where this becomes problematic. Most obviously, any attempt to do photo editing with a single very high resolution display will be a problem - menus and such will display oddly (someone tried it, and it IS usable, but barely). Secondly, it's a problem for anyone who uses a Cintiq alongside a high-resolution display. The menu bar ends up on the Cintiq, which could very well not be where you want it, especially in the case of the smaller Cintiqs that are really meant as auxiliary displays showing a zoomed portion of the image. It takes an odd combination of cables indeed to get FOUR connections into a single card (Cintiq, palette monitor, and a high-res screen using two connectors), not to mention the desktop clutter! A standard GeForce has 2 DVI ports, one DisplayPort, and one HDMI. The high-res screen uses both DVI ports, so the palette monitor can't be DVI (neither can the Cintiq), and each HAS to be whatever the other isn't! A bit tricky, to say the least!

AMD, on the other hand, runs 4000+x3000+ off DisplayPort or HDMI on any card above their $150 Radeon 7770. It's possible that the capability exists even lower in their line - I didn't check, because I wouldn't want to use a $100 video card on a 4K display anyway! DVI can't go above 2560x1600, but that's a limit of DVI, not the card...
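A rough way to see where those single-link limits come from is the required pixel clock. The numbers below assume a 60 Hz refresh and a ~20% blanking allowance (approximations, not exact CVT timings), and a nominal ~330 MHz ceiling for dual-link DVI:

```python
# Approximate pixel clock needed to drive a mode at 60 Hz, with ~20%
# extra for blanking intervals (real timing standards vary; this is
# only a back-of-envelope estimate).
def pixel_clock_mhz(width, height, hz=60, blanking=1.20):
    return width * height * hz * blanking / 1e6

print(round(pixel_clock_mhz(2560, 1600)))  # ~295 MHz: near dual-link DVI's ~330 MHz cap
print(round(pixel_clock_mhz(3840, 2160)))  # ~597 MHz: far beyond DVI, hence DisplayPort/HDMI
```

Which is why 2560x1600 is the practical ceiling for a single dual-link DVI connection, while 4K needs either a higher-bandwidth link or the two-output hack described above.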

                                           -Dan

Tom Montgomery
Jr. Member, Posts: 78
« Reply #6 on: April 24, 2013, 07:26:36 PM »

If you're planning to use Photoshop, then the GPU becomes more important, as Adobe is moving more scrolling, zooming and filter processing over to the GPU with each release.
BernardLanguillier
Sr. Member, Posts: 8389
« Reply #7 on: April 25, 2013, 01:19:42 AM »

That's even more true if you ever venture into motion.

The Apple App Store would not even let me install Final Cut Pro X 10.0.2 on my Mac Pro until I replaced the original ATI X1900 with a 5770...

In my view, the ranking of components by performance impact is becoming:
1. Amount of memory
2. SSD
3. GPU
4. CPU

Cheers,
Bernard
A few images online here!

francois
Sr. Member, Posts: 7002
« Reply #8 on: April 25, 2013, 01:45:04 AM »

Quote from: BernardLanguillier (component ranking above)


I agree with that. GPU is useful but only if RAM and storage are already top notch.
Francois

Sheldon N
Sr. Member, Posts: 810
« Reply #9 on: April 25, 2013, 09:33:43 AM »

Quote from: AFairley (Reply #1 above)

My experience has been exactly the opposite. Testing multiple, significantly different video cards back to back on a Windows machine, I saw no difference in screen refresh/lag on a large monitor. Watching resource monitors at the same time confirms that Lightroom doesn't even use the GPU. The lag when moving sliders comes from the CPU load of recalculating the image from the RAW file, so your video card should not have a noticeable effect.

Quote from: BernardLanguillier (component ranking above)

Possibly for video editing, but if you are looking at just Photoshop/Lightroom the speed of your CPU is *much* more important than the video card.
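A minimal sketch of why core count matters for that CPU-bound work: each raw file is an independent, compute-heavy job, so develop/export work spreads naturally across CPU workers rather than the GPU. (The `develop` function below is a placeholder stand-in, not Adobe's code.)

```python
from concurrent.futures import ThreadPoolExecutor

def develop(raw_value):
    """Stand-in for a CPU-heavy raw-to-RGB conversion of one image."""
    return raw_value * 2 + 1  # placeholder arithmetic, not real demosaicing

raw_files = [1, 2, 3, 4, 5, 6]

# Independent images are trivially parallel across workers: a 6-core
# CPU can chew on six exports at once, which is why CPU speed and
# core count dominate raw-processing throughput in this era of Lr.
with ThreadPoolExecutor(max_workers=6) as pool:
    results = list(pool.map(develop, raw_files))

print(results)  # [3, 5, 7, 9, 11, 13] -- order preserved by map()
```

Real converters use process- or native-thread-level parallelism for the heavy math; the point is only that the workload partitions cleanly per image, so more (and faster) cores help directly.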
bill t.
Sr. Member, Posts: 2711
« Reply #10 on: May 14, 2013, 10:01:50 PM »

There are some third-party, computation-intensive programs, like PhotoZoom 5 and Resize 7.5, that make intensive use of GPUs to very good advantage. Both those versions run about 5 to 10 times faster with a GPU present. As GPU programming interfaces improve and more programmers learn to exploit them, I think we will see a lot more GPU use in the coming years. Right now Adobe Premiere is optimized for high-end Nvidia chips, and the difference in performance between software-only and GPU systems is night and day.
dreed
Sr. Member, Posts: 1291
« Reply #11 on: May 15, 2013, 08:53:51 AM »

Possibly for video editing, but if you are looking at just Photoshop/Lightroom the speed of your CPU is *much* more important than the video card.

Yup. There have been many threads on this topic in the Lightroom forum on this website and time and again, Schewe or someone else points out that LR doesn't use the GPU and probably won't ever.
BartvanderWolf
Sr. Member, Posts: 3911
« Reply #12 on: May 15, 2013, 09:03:33 AM »

Quote from: bill t. (Reply #10 above)

I agree with Bill. Just because some current applications do not utilize the potential doesn't mean that there are no great speed benefits to be had. Both complex real-time preview updating, which is great for creative interaction with controls, and final rendering can benefit tremendously.

Cheers,
Bart
Schewe
Sr. Member, Posts: 5544
« Reply #13 on: May 15, 2013, 08:31:35 PM »

...Schewe or someone else points out that LR doesn't use the GPU and probably won't ever.

Well, I said Lightroom doesn't (and LR5 doesn't) but I never said never!
Alan Goldhammer
Sr. Member, Posts: 1739
« Reply #14 on: May 16, 2013, 02:56:25 PM »

Well, I said Lightroom doesn't (and LR5 doesn't) but I never said never!
Well, it does use it passively, at any rate, if your monitor is plugged into the video card with the GPU. *grin*
madmanchan
Sr. Member, Posts: 2110
« Reply #15 on: May 16, 2013, 06:58:17 PM »

Yes, the card is essential if you want to see something on your screen. *wink* But as Jeff said, Lr doesn't yet use the GPU for rendering acceleration.
Schewe
Sr. Member, Posts: 5544
« Reply #16 on: May 16, 2013, 08:44:51 PM »

But as Jeff said, Lr doesn't yet use the GPU for rendering acceleration.

Notice Eric said "yet"...which is why I never said ACR/LR would never use the GPU. It's just a tough problem to solve cross-platform, and it takes special code to put the processing onto the GPU instead of the main CPU. It's been a problem for Photoshop ever since GPU support was added, and it's only now getting useful in certain functions. Capture One also added GPU support, but using it tends to make C1 a lot less stable...so it's an evolving situation. I have hope that the GPU could be used to make ACR/LR a bit faster (or a lot faster). So, hopefully, the elves on ACR/LR are working on it!
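A schematic illustration of the rewrite being described here: serial CPU code walks the pixels in a loop, while GPU APIs like OpenCL and CUDA require the same operation expressed as an independent per-pixel kernel. (These are plain-Python stand-ins to show the shape of the code, not real GPU code.)

```python
# CPU-style: one thread walks every pixel in sequence.
def brighten_cpu(pixels, gain):
    out = []
    for p in pixels:
        out.append(min(255, int(p * gain)))
    return out

# GPU-style: the same work as a "kernel" applied independently to each
# pixel -- the data-parallel form OpenCL/CUDA require, and the rewrite
# that makes porting an existing image pipeline to the GPU non-trivial.
def brighten_kernel(p, gain):
    return min(255, int(p * gain))

pixels = [10, 100, 200, 250]
assert brighten_cpu(pixels, 1.5) == [brighten_kernel(p, 1.5) for p in pixels]
print([brighten_kernel(p, 1.5) for p in pixels])  # [15, 150, 255, 255]
```

The two forms compute identical results; the engineering cost Jeff describes is in restructuring every routine into the kernel form (and keeping it correct on both CPU and GPU code paths, on both platforms).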
bill t.
Sr. Member, Posts: 2711
« Reply #17 on: May 17, 2013, 04:00:07 AM »

This highly visual program has exploited GPUs for almost a decade. Shows you what's possible without getting bogged down in what's desirable. Way cheaper than an IQ180, or even a D800E. You may never go back to photography.
Schewe
Sr. Member, Posts: 5544
« Reply #18 on: May 17, 2013, 04:23:13 AM »

Shows you what's possible without getting bogged down in what's desirable. 

There's a fundamental reason why using the GPU for 3D is a totally different story from using the GPU for dedicated image processing routines that take over from the CPU. Routines sent to the GPU must be optimized for specific and special image processing, which may be different from CPU processing. Which means stuff has to be written to run on the GPU...that's the problem. GPUs were not designed to run image processing routines; they were designed for 3D and gaming...
Rhossydd
Sr. Member, Posts: 2008
« Reply #19 on: May 17, 2013, 07:11:30 AM »

GPUs ... they were designed for 3D and gaming...
Adobe have been using GPU acceleration VERY effectively in Premiere Pro since version 5 (2010), and that's only a 2D application.
Maybe the different departments need to exchange some ideas?