Video card advice, please!

The good: I just picked up a Dell U2711 monitor.

The bad: It's a 2560x1440 monitor, but my antiquated video card only supports 1920x1080.

I suppose I should come out of the closet as a PC guy ;D That said, what video card would you fine folks recommend? Even after calibrating the monitor with my Spyder, the colors and brightness are WAY off. I'm hoping the video card has something to do with it.
 
Quick answer is no, there shouldn't be any difference in color output between video cards. The color settings are all handled in software, and you should be able to tinker with them in the video card utility, so I would start there and make sure you didn't set something weird.

Another thing: if you're using VGA for the connection, you should switch to DVI/HDMI/DisplayPort instead. VGA can't output that resolution.

As for video card suggestions, I need to know your price range and what you expect to do with it before I can help.
 
Upvote 0
Video cards are tough to recommend. There are many interfaces, power requirements, etc. All of those need to be known in order to recommend a card.
Many of the newer mid level cards want a 500 watt power supply, for example, and if your computer has a 350 watt supply, it might overload it and cause other strange problems.
I tend to stick with the type of card supplied with my PC and upgrade the whole PC as a unit. Then everything plays well together. I've built a lot of computers and had a lot of high-end power supply failures, learning the hard way.
 
Upvote 0
BruinBear said:
Quick answer is no, there shouldn't be any difference in color output between video cards. The color settings are all handled in software, and you should be able to tinker with them in the video card utility, so I would start there and make sure you didn't set something weird.

I'll try that. All I did was unplug the old monitor, and plug in the new one with an HDMI cable. I didn't mess with the video card utility at all, but the calibration is all f'd up now.

As for video card suggestions, I need to know your price range and what you expect to do with it before I can help.

Photo editing is the only thing I'll need a good video card for. I'm not a gamer, nor do I watch movies on my computer. I don't have a firm price range, but if a decent card is going to cost more than $250, I'll probably just buy a new computer with a good video card already in it.
 
Upvote 0
V8Beast said:
BruinBear said:
Quick answer is no, there shouldn't be any difference in color output between video cards. The color settings are all handled in software, and you should be able to tinker with them in the video card utility, so I would start there and make sure you didn't set something weird.

I'll try that. All I did was unplug the old monitor, and plug in the new one with an HDMI cable. I didn't mess with the video card utility at all, but the calibration is all f'd up now.

As for video card suggestions, I need to know your price range and what you expect to do with it before I can help.

Photo editing is the only thing I'll need a good video card for. I'm not a gamer, nor do I watch movies on my computer. I don't have a firm price range, but if a decent card is going to cost more than $250, I'll probably just buy a new computer with a good video card already in it.

Photo editing is not very video card intensive; it relies more on your processor. So really, any midrange card ($150 max) should be more than enough for your purposes. For ATI, something like a 6770 or 7770; for Nvidia, something like a 460 or 550 (sorry, I'm not too familiar with the new 6XX series of cards, so I can't really comment on those). The one thing you have to be sure of, as the last guy mentioned, is the power draw of these cards: make sure the card's requirement is within what your power supply can deliver. Another thing to watch out for: some video cards require two PCIe power connectors, so make sure your power supply has a spare if you choose one of those cards.
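To put that in concrete terms, here's a rough sanity check you could run with numbers taken from your own PSU label and the card's spec sheet (the figures below are made-up placeholders, not real measurements):

```python
# Very rough PSU sanity check. All numbers are illustrative placeholders:
# read the card's rated draw (TDP) off its spec sheet and estimate the
# rest of your system yourself.

psu_watts = 350          # from the label on your power supply
card_tdp = 108           # example rated draw for a midrange card
rest_of_system = 150     # CPU, drives, fans, motherboard (rough guess)
headroom = 0.8           # don't plan on running a PSU at 100% of its rating

total_draw = card_tdp + rest_of_system
budget = psu_watts * headroom
print(f"Estimated draw: {total_draw} W, safe budget: {budget:.0f} W")
print("Looks OK" if total_draw <= budget else "Cutting it too close")
```

If the estimate lands anywhere near the budget, also double-check that the PSU actually has the PCIe power connectors the card asks for.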
 
Upvote 0
The U2711 is a wide-gamut monitor, right? That means you need a video card capable of 10-bit color if you want to get the most out of it. You'll have to get a professional card (like the ones made for CAD), either an Nvidia Quadro or an ATI FirePro. I've also read that the consumer Nvidia cards will do it, but I don't have any personal experience with them, and I've only seen it mentioned in a couple of places. I know the consumer ATI cards won't do 10-bit color. It's stupid, because they're more than capable; they're just crippled so that they can't.
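For reference, the difference between 8-bit and 10-bit is just the number of steps per colour channel; quick arithmetic, nothing card-specific:

```python
# Levels per channel and total colours for 8-bit vs 10-bit output.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, "
          f"{levels ** 3:,} colours total")
```

That works out to 256 levels (about 16.7 million colours) at 8-bit versus 1024 levels (about 1.07 billion colours) at 10-bit.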
 
Upvote 0
atvinyard said:
The U2711 is a wide-gamut monitor, right? That means you need a video card capable of 10-bit color if you want to get the most out of it. You'll have to get a professional card (like the ones made for CAD), either an Nvidia Quadro or an ATI FirePro. I've also read that the consumer Nvidia cards will do it, but I don't have any personal experience with them, and I've only seen it mentioned in a couple of places. I know the consumer ATI cards won't do 10-bit color. It's stupid, because they're more than capable; they're just crippled so that they can't.

As far as I know, the IPS panel in that monitor is only 8-bit.

Edit: it is 8-bit with FRC (frame rate control) for emulated 10-bit, but either way, the pro workstation cards are well out of the price range.
 
Upvote 0
BruinBear said:
V8Beast said:
BruinBear said:
Quick answer is no, there shouldn't be any difference in color output between video cards. The color settings are all handled in software, and you should be able to tinker with them in the video card utility, so I would start there and make sure you didn't set something weird.

I'll try that. All I did was unplug the old monitor, and plug in the new one with an HDMI cable. I didn't mess with the video card utility at all, but the calibration is all f'd up now.

As for video card suggestions, I need to know your price range and what you expect to do with it before I can help.

Photo editing is the only thing I'll need a good video card for. I'm not a gamer, nor do I watch movies on my computer. I don't have a firm price range, but if a decent card is going to cost more than $250, I'll probably just buy a new computer with a good video card already in it.

Photo editing is not very video card intensive; it relies more on your processor. So really, any midrange card ($150 max) should be more than enough for your purposes. For ATI, something like a 6770 or 7770; for Nvidia, something like a 460 or 550 (sorry, I'm not too familiar with the new 6XX series of cards, so I can't really comment on those). The one thing you have to be sure of, as the last guy mentioned, is the power draw of these cards: make sure the card's requirement is within what your power supply can deliver. Another thing to watch out for: some video cards require two PCIe power connectors, so make sure your power supply has a spare if you choose one of those cards.

BruinBear - You are spot on with your comment, just about any video card with a DVI/HDMI output is going to be good enough for straight photo editing. You are MUCH better off spending extra money on RAM and an SSD for photo editing to speed up how fast your photos load and how fast edits can be accomplished.

The question you have to ask is whether you want to run dual monitors (dual DVI) or edit video. If either of those is the case, you will have to pony up for a better card.

Just for reference, I'm running a quad-core i7 with 16GB RAM, using the i7's integrated graphics. A 1080p IPS monitor works great, and with color calibration you would never know the difference from a more expensive graphics card.
 
Upvote 0
Thanks to everyone who responded. The models suggested thus far are:

- ATI 6770 and 7770
- Nvidia 460 and 550

I assume that any of these will suffice for my needs? I'll have to crack the computer case open to see if the power supply is up to snuff. Either way, a new computer isn't out of the question. I have one of the very early quad-core Intel processors (not sure which one exactly), but my RAM is maxed out at 8GB. That worked fine with the 5DC's files, but the 5D3's raw files are pushing this system pretty hard.
 
Upvote 0
V8Beast said:
BruinBear said:
Quick answer is no, there shouldn't be any difference in color output between video cards. The color settings are all handled in software, and you should be able to tinker with them in the video card utility, so I would start there and make sure you didn't set something weird.

I'll try that. All I did was unplug the old monitor, and plug in the new one with an HDMI cable. I didn't mess with the video card utility at all, but the calibration is all f'd up now.

As for video card suggestions, I need to know your price range and what you expect to do with it before I can help.

Photo editing is the only thing I'll need a good video card for. I'm not a gamer, nor do I watch movies on my computer. I don't have a firm price range, but if a decent card is going to cost more than $250, I'll probably just buy a new computer with a good video card already in it.

It's the HDMI that's limiting your resolution. You have to use a DisplayPort connection to get that resolution, so if your existing video card has a DisplayPort output, try that; it will probably get full res.

I have two of these Dell U2711s and have been searching for a way to drive both off one MacBook Pro at full res. The best USB video adapters can't reach that resolution :( so they are out.

Also, the colours are strange. I have found that once calibrated with the ColorMunki, Photoshop and Lightroom have good colour, but the web and everything else is way off and oversaturated. Basically, I have found these screens are only really good for editing.
 
Upvote 0
If you're going to get an Nvidia graphics card, get one of the new 600 series. I built a PC from scratch in 2011 with a quad-core i7, 16GB of 1600MHz RAM, etc., plus an Nvidia GTX 570 with 1.28GB of GDDR5 VRAM. I specifically chose that card because it was one of the few Adobe approved for the Mercury Playback Engine in Premiere Pro, plus it works great with Photoshop and other GPU-accelerated 64-bit software.

The downside: it uses too much electricity (it's rated at 217 watts but it draws more than that), and I'm using a 550W PSU. So I underclock it by 100MHz (and drop the voltage too), and that stops my PC from restarting.

The new GTX 660 Ti and GTX 670 cards are both more powerful and have more dedicated memory, but they use a lot less electrical power. So get one of the latest versions that use less juice.

My GTX 570 is driving two monitors (one 40-inch and one 25-inch), both at 1920x1080. The new cards, however, will easily run four monitors, and you can push the resolution up to 2560x1600 with a digital connection (VGA tops out a little lower) at 32-bit TrueColor with a solid 60Hz refresh rate.
 
Upvote 0
Update: I got me an Nvidia GeForce 610 and got the resolution issue under control. Thanks to everyone for your help.

wickidwombat said:
It's the HDMI that's limiting your resolution. You have to use a DisplayPort connection to get that resolution, so if your existing video card has a DisplayPort output, try that; it will probably get full res.

I have two of these Dell U2711s and have been searching for a way to drive both off one MacBook Pro at full res. The best USB video adapters can't reach that resolution :( so they are out.

Also, the colours are strange. I have found that once calibrated with the ColorMunki, Photoshop and Lightroom have good colour, but the web and everything else is way off and oversaturated. Basically, I have found these screens are only really good for editing.

My old card was so ghetto that it didn't even have a DVI output. I got the DVI hooked up with the new card now, but the old Spyder2 doesn't seem to play nice with the new card. It supposedly calibrates the screen, but the profile never loads onto the card. I'm going to borrow my friend's Spyder4 and see what happens.

And you're right about the colors. With the factory calibration, the colors are WAY oversaturated, straight Ken Rockwell style ;D It's ridiculously bright, too. Does your screen make a strange humming noise?
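If you want to rule out the profile simply not being installed before blaming the card, Windows keeps ICC profiles in one standard folder; here's a minimal sketch that just lists what's there (assuming a default Windows install on the C: drive):

```python
# List the colour profiles installed on a default Windows install.
# The path below is the standard Windows colour-profile directory;
# adjust the drive letter if your system drive isn't C:.
import os

color_dir = r"C:\Windows\System32\spool\drivers\color"
profiles = [f for f in os.listdir(color_dir)
            if f.lower().endswith((".icm", ".icc"))]
print("\n".join(sorted(profiles)))
```

This only tells you the profile file exists; whether it actually gets loaded into the card at startup is handled by Windows Color Management and whatever loader the Spyder software installs, so that's the next thing to check.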
 
Upvote 0
The problem is that, at first glance, HDMI only supports 1920x1080. You need a modified driver, or a graphics card with DVI (dual-link must be supported!) or a DisplayPort connection.

AND you don't need a card like an NVIDIA 660 Ti or 670: they are way too expensive and, as already mentioned, the card has almost nothing to do with photo editing.

The price for a DisplayPort card starts at around 50 bucks.
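If you're curious why the type of connection matters at all, here's a rough back-of-the-envelope pixel-clock check (a minimal sketch; the ~20% blanking overhead and the link limits are approximate figures, not official specs):

```python
# Rough check: can a given link drive a resolution at 60 Hz?
# Pixel clock ~ width * height * refresh * blanking overhead.
# Approximate limits: single-link DVI ~165 MHz, dual-link DVI ~330 MHz;
# older (pre-1.3) HDMI behaves roughly like single-link DVI.

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.2):
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1920, 1080), (2560, 1440)]:
    clk = pixel_clock_mhz(w, h, 60)
    print(f"{w}x{h}@60Hz needs ~{clk:.0f} MHz "
          f"(single-link OK: {clk <= 165}, dual-link OK: {clk <= 330})")
```

That's why 1920x1080 works over just about anything, while 2560x1440 wants dual-link DVI, DisplayPort, or a newer HDMI revision.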
 
Upvote 0
wickidwombat said:
It's the HDMI that's limiting your resolution. You have to use a DisplayPort connection to get that resolution, so if your existing video card has a DisplayPort output, try that; it will probably get full res.

I am looking at buying a laptop with an Nvidia 600M or 670M. The one with the 670M ($300 more) has DisplayPort and DVI outputs, and both units have HDMI 1.4a. I just Googled around and found that the newer HDMI spec appears to support higher resolutions; can anyone confirm that?

http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx#1

Here is a link to the two notebooks (desktop replacements, really)
http://www.reflexnotebook.ca/laptops-notebooks/screen-size/15-display/sager-np6350.html
http://www.reflexnotebook.ca/laptops-notebooks/screen-size/15-display/sager-np9150.html

I figured I would be fine with the 6350 model for my needs, and I think the NP6350's HDMI port will support monitors beyond 1920x1080.
 
Upvote 0
wickidwombat said:
I just got a 30" apple cinema display and damn that screen has some nice natural colour

Did you have to calibrate the Apple monitor much? It sounds like it has nice color right out of the box. This Dell was nowhere close, but it looks good now. Stuff on the web looks great after tuning it up with the Spyder as well. I've always liked the Apple monitors, but I'm not a fan of the glossy screen. Not sure if they work on a PC, either?
 
Upvote 0
BruinBear said:
As far as I know, the IPS panel in that monitor is only 8-bit.

Edit: it is 8-bit with FRC (frame rate control) for emulated 10-bit, but either way, the pro workstation cards are well out of the price range.

Where did you find that this is an 8-bit panel?

The ATI FirePro V4800 is a workstation card for $160 that outputs 10-bit color over DisplayPort: http://www.newegg.com/Product/Product.aspx?Item=N82E16814195096
 
Upvote 0
V8Beast said:
wickidwombat said:
I just got a 30" apple cinema display and damn that screen has some nice natural colour

Did you have to calibrate the Apple monitor much? It sounds like it has nice color right out of the box. This Dell was nowhere close, but it looks good now. Stuff on the web looks great after tuning it up with the Spyder as well. I've always liked the Apple monitors, but I'm not a fan of the glossy screen. Not sure if they work on a PC, either?

The Apple is pretty sweet straight up; I haven't run a calibration on it yet either. I will, and see how it looks after. This is the older one with the anti-glare screen, not the new glossy one.
 
Upvote 0