What's a good computer monitor?

cayenne

EOS 5D Mark IV
Mar 28, 2012
2,307
312
Hi all,

Ok, I saved my nickels and dimes for a couple of years and got a new workstation...a Mac Pro.

So far I'm loving it...and next I'm looking for a good computer monitor.
I can't afford the Apple offering any time soon, so I'm wondering what other options are out there.

I'll be doing mostly stills, but some video too.....

My current monitor, a Dell U2711, was pretty decent bang for the buck in its day, but I think it is starting to fade with age....

I like 27" at least, maybe larger....something that will hold calibration, etc.

Any recommendations out there what to look at?


Thank you in advance,

cayenne
 

Codebunny

EOS RP
Sep 5, 2018
479
427
The Dell P2415Q is a fave of mine at the moment, at least as a screen to read text on until you get a colour-accurate screen, though it isn't bad. I run it in Retina mode, so it appears as 1080p but uses 4:1 pixel scaling. This is great for text, and of course images are still shown 1:1 on it when editing.
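For anyone unfamiliar with how Retina/HiDPI scaling works, here's a rough sketch of the arithmetic (the panel numbers are just the P2415Q's 4K UHD resolution; the helper function name is mine, not any real API):

```python
# HiDPI ("Retina") scaling: the OS draws the interface at a logical
# resolution, then maps each logical pixel onto a block of physical
# panel pixels, so text stays sharp while UI elements keep their size.
def hidpi_logical(panel_w, panel_h, scale=2):
    # At 2x scale, each logical pixel uses a 2x2 block: 4 physical pixels.
    return panel_w // scale, panel_h // scale, scale * scale

w, h, px_per_logical = hidpi_logical(3840, 2160)  # 4K UHD panel
print(w, h, px_per_logical)  # 1920 1080 4 -> "looks like" 1080p at 4:1 pixels
```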
 

Jack Douglas

CR for the Humour
Apr 10, 2013
6,609
1,970
Alberta, Canada
I bought a 32" 4K BenQ PD3200U and am really pleased with it for the price. I owned a BenQ 20" from way back and was always impressed with it, again for the price. It compared well to my Samsung 24", which I wasn't too impressed with.

Jack
 

YuengLinger

EOS R6
Dec 20, 2012
2,932
1,214
Southeastern USA
Thumbs up for 32" 4K. ViewSonic also makes some great monitors: very reliable, high-quality displays. Mine is almost three years old now, and I wouldn't want to go back to a smaller size or lower resolution.
 

privatebydesign

I don't preorder, I'm not a paid beta tester!
Jan 29, 2011
8,860
2,791
120
BenQ SW321C: for stills-oriented work plus some video, you have to spend a lot more money to beat it. Native USB-C for the Mac Pro, hardware calibration, 16-bit LUTs, 10-bit output, test certificates of standardization and performance to Delta E ≤ 2, a customizable puck for easy control; heck, you can even do fancy stuff like split screen in different colour spaces.
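For context on the "Delta E ≤ 2" spec: Delta E measures the distance between the colour a panel was asked to show and what it actually shows, computed in Lab space. A minimal sketch of the original CIE76 formula (factory certificates like BenQ's typically use the more involved CIEDE2000 variant, so treat this as an illustration, and the measurement values below are made up):

```python
import math

def delta_e76(lab1, lab2):
    # CIE76 colour difference: Euclidean distance in L*a*b* space.
    # A Delta E around 1 is roughly the smallest difference a trained
    # eye can notice; <= 2 across the gamut is considered very accurate.
    return math.dist(lab1, lab2)

# Hypothetical reading: target mid-grey vs. what the panel displayed.
target = (50.0, 0.0, 0.0)
measured = (50.5, 1.0, -0.8)
print(round(delta_e76(target, measured), 2))  # 1.37 -> within spec
```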

 

Maximilian

The dark side - I've been there
Nov 7, 2013
2,980
1,014
Germany
Hi Guys!

I'd like to jump in on cayenne's question.

What would you recommend if I added the following constraints/limitations:
  • 27" is big enough
  • full HD is good enough
  • a calibration tool like the Spyder Pro is already there
  • budget is limited to 500-600 bucks, and cheaper would be nice, too
Thanks in advance for answers on this.
 

mkamelg

EOS 5DS R
Feb 1, 2015
32
12
I'm NOT from the United States
Hi all,

Ok, I saved my nickels and dimes for a couple of years and got a new workstation...a Mac Pro.

So far I'm loving it...and next I'm looking for a good computer monitor.
I can't afford the Apple offering any time soon, so I'm wondering what other options are out there.

I'll be doing mostly stills, but also video too.....

My current monitor, a Dell U2711, was pretty decent bang for the buck in its day, but I think it is starting to fade with age....

I like 27" at least, maybe larger....something that will hold calibration, etc.

Any recommendations out there what to look at?


Thank you in advance,

cayenne
Best 4K and 5K Displays for Mac – Cost-Effective Alternatives to the Apple Thunderbolt Display

The 5K Dell UP2715K and Philips 275P4VYKEB are discontinued. As for the other models, check opinions about them (e.g. on Amazon) before buying.

If you are looking for monitors for general use, photography, and graphics, then read (or just skim the monitor list in) this annually updated article, published on a Polish forum dedicated to creating, capturing, and processing images:


Monitors with the -CT and COLOR designations are versions available only on the Polish market. What is the difference between them and the factory models?


Personally I use a 27" Asus PA279Q monitor (since 2014, as far as I remember), connected to a Mac mini (Late 2012).
 

privatebydesign

I don't preorder, I'm not a paid beta tester!
Jan 29, 2011
8,860
2,791
120
Hi Guys!

I'd like to jump in on cayenne's question.

What would you recommend if I added the following constraints/limitations:
  • 27" is big enough
  • full HD is good enough
  • a calibration tool like the Spyder Pro is already there
  • budget is limited to 500-600 bucks, and cheaper would be nice, too
Thanks in advance for answers on this.
 

cayenne

EOS 5D Mark IV
Mar 28, 2012
2,307
312
BenQ SW321C: for stills-oriented work plus some video, you have to spend a lot more money to beat it. Native USB-C for the Mac Pro, hardware calibration, 16-bit LUTs, 10-bit output, test certificates of standardization and performance to Delta E ≤ 2, a customizable puck for easy control; heck, you can even do fancy stuff like split screen in different colour spaces.


Oh wow...that looks like a monster at first glance....right up my alley.

I'll start researching a bit...but wow, that looks good!!

Thank you,

C
 

Bert63

EOS RP
Dec 3, 2017
611
1,211
I bought a 32" 4K BenQ PD3200U and am really pleased with it for the price. I owned a BenQ 20" from way back and was always impressed with it, again for the price. It compared well to my Samsung 24", which I wasn't too impressed with.

Jack

Ha! I just bought this same monitor over the winter and love it...

I have it connected to both a Mac mini and my main desktop, and it's a fantastic piece of equipment.

I have 27" BenQs on either side of it.. :)
 

kten

EOS M6 Mark II
Oct 3, 2015
50
44
One thing to consider with 10-bit+ output, which sometimes gets overlooked because of how marketing presents things, is how you're feeding the monitor: most consumer-level GPUs aimed at gamers won't manage 10-bit output in non-full-screen applications (i.e. outside games), so you'll need a workstation card like an Nvidia Quadro or a FirePro/Radeon Pro. You may in theory get some slight benefit to gradients from higher-bit internal LUTs and a true 10-bit (or more) panel versus a regular 8-bit panel (or 6-bit + FRC), but it won't be the full benefit, and it's the weak link in an otherwise 10-bit end-to-end workflow.

I should clarify I'm talking about real-time high-bit-depth workflows. I know a few Blackmagic cinema cards allow it via exporting, but you can't work in real time that way, which is limiting; at least, I've never found a way, so I'd be happy to be proven wrong if someone can tell me a way around it. (There may be others, but I'm only familiar with BMC.) If you're not after colour-critical design and photo work, or don't need to work in non-sRGB spaces such as Adobe RGB, DCI-P3, etc., there are plenty of accurate-enough-for-most-use (including pro) panels out there. Just thought I'd mention it in case.
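To put numbers on why an end-to-end 10-bit path matters for gradients, a quick sketch of the per-channel maths (the function name is just for illustration):

```python
def tonal_levels(bits_per_channel):
    # Levels per channel, and total displayable colours for an RGB panel.
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

print(tonal_levels(8))   # (256, 16777216)   ~16.7 million colours
print(tonal_levels(10))  # (1024, 1073741824) ~1.07 billion colours

# A smooth grey ramp across a 4K-wide screen (3840 px) has only 256
# distinct steps at 8 bits, so banding can become visible; at 10 bits
# there are 1024 steps, finer than the pixel grid can reveal.
```

This is also why an 8-bit link anywhere in the chain (GPU, driver, or panel) caps the whole workflow at 256 steps per channel, regardless of what the other links support.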
 

gwooding

I'm New Here
Oct 2, 2014
19
23
Johannesburg - South Africa
One thing to consider with 10-bit+ output, which sometimes gets overlooked because of how marketing presents things, is how you're feeding the monitor: most consumer-level GPUs aimed at gamers won't manage 10-bit output in non-full-screen applications (i.e. outside games), so you'll need a workstation card like an Nvidia Quadro or a FirePro/Radeon Pro. You may in theory get some slight benefit to gradients from higher-bit internal LUTs and a true 10-bit (or more) panel versus a regular 8-bit panel (or 6-bit + FRC), but it won't be the full benefit, and it's the weak link in an otherwise 10-bit end-to-end workflow.

I should clarify I'm talking about real-time high-bit-depth workflows. I know a few Blackmagic cinema cards allow it via exporting, but you can't work in real time that way, which is limiting; at least, I've never found a way, so I'd be happy to be proven wrong if someone can tell me a way around it. (There may be others, but I'm only familiar with BMC.) If you're not after colour-critical design and photo work, or don't need to work in non-sRGB spaces such as Adobe RGB, DCI-P3, etc., there are plenty of accurate-enough-for-most-use (including pro) panels out there. Just thought I'd mention it in case.
I am not sure about AMD, but as far as I know Nvidia has supported 10-bit output on GeForce cards in non-full-screen applications for a while now. You just need to use their Studio driver instead of the Game Ready driver. I unfortunately don't have a 10-bit display to actually test this with, though.

 

kten

EOS M6 Mark II
Oct 3, 2015
50
44
I am not sure about AMD, but as far as I know Nvidia has supported 10-bit output on GeForce cards in non-full-screen applications for a while now. You just need to use their Studio driver instead of the Game Ready driver. I unfortunately don't have a 10-bit display to actually test this with, though.

Awesome and much appreciated, I didn't know about that and must have completely missed the change.
 
Feb 21, 2017
7
10
Hi Guys!

I'd like to jump in on cayenne's question.

What would you recommend if I added the following constraints/limitations:
  • 27" is big enough
  • full HD is good enough
  • a calibration tool like the Spyder Pro is already there
  • budget is limited to 500-600 bucks, and cheaper would be nice, too
Thanks in advance for answers on this.
Full HD is not good enough. On a 26" 1920x1200 monitor, I couldn't easily tell the difference between my Sigma Art 35mm and my Canon 35mm f/2, so I sold the Sigma and kept the Canon. Once I upgraded to a 5K display, I strongly regretted selling the Sigma; the difference was very noticeable. I can definitely recommend against the LG 5K on Apple.com, though. It is the worst purchase of my life: expensive, no audio output, and it can't work with any non-Thunderbolt GPU.
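The resolution gap here is easy to quantify. Pixel density (PPI) is the diagonal pixel count divided by the diagonal size in inches; a rough sketch (taking the 26" monitor above as 1920x1200):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal resolution over diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 26)))   # ~87  (26" WUXGA)
print(round(ppi(3840, 2160, 32)))   # ~138 (32" 4K)
print(round(ppi(5120, 2880, 27)))   # ~218 (27" 5K)
```

At 87 ppi a 20-megapixel image can only be judged zoomed in; at 218 ppi the monitor resolves enough fine detail that differences in lens sharpness show up even in fit-to-screen views, which matches the Sigma-vs-Canon experience above.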

I'd recommend a 32" 4K display. Make sure it supports multiple display inputs and audio out. USB-C is very much worth it because it can charge your laptop and allows one-cable access. I don't think 5K makes a noticeable difference in image quality, but it will noticeably impact your performance and limit your options. If you're a laptop user, you have far superior eGPU options if your monitor has non-Thunderbolt/USB-C inputs (DisplayPort or HDMI).

Also, keep in mind that a larger display means more strain on your computer. Lightroom Classic is painful to use at 5K on any MacBook (I have the 2019 now with the best GPU/CPU, and it's tolerable but not pleasant).

Windows desktops have superior hardware options, so this is not an issue there. Mac users, though, should save some money for a machine upgrade, for when you decide it's not worth waiting 10 seconds for Lightroom to switch from one photo to another (it can take that long on a top-of-the-line 2016 Thunderbolt 3 MacBook connected to the 5K display).
 

LookingThroughMyLens81

EOS M6 Mark II
Jun 12, 2013
55
2
Awesome and much appreciated, I didn't know about that and must have completely missed the change.
It was done because HDR testing for Windows 10 required it, and because creatives, who by and large no longer use Quadro cards and instead use high-end gaming cards for their superior performance-to-price ratio, needed it to drive the new high-end pro displays for video colour-correction and grading work.
 

kten

EOS M6 Mark II
Oct 3, 2015
50
44
It was done because HDR testing for Windows 10 required it, and because creatives, who by and large no longer use Quadro cards and instead use high-end gaming cards for their superior performance-to-price ratio, needed it to drive the new high-end pro displays for video colour-correction and grading work.
Yeah, I'm in that camp but totally missed the support. On price-to-performance they work better for me for workflow speed, with Premiere and After Effects in particular, versus the lower-end workstation cards. I use mine hooked to a 10-bit panel, albeit not a wide-gamut one, for semi-critical colour work; accuracy matters more to me than wider spaces. Getting the same performance from the Quadro line would cost more than it was worth to me, and there are other parts of my chain I'd rather spend on first.
 

Codebunny

EOS RP
Sep 5, 2018
479
427
I am not sure about AMD but as far as I know Nvidia has supported 10-bit output on geforce cards in non full screen applications for a while now. You just need to use their studio driver instead of the game ready driver.I unfortunately don't have a 10-bit display to actually test this with though.

AMD has supported 30-bit colour for quite a while. Nvidia long made it available only on the Quadro line; it's fantastic that it's now just a driver change.