geekpower said:
dak723 said:
scyrene said:
And the old 'more MP will harm image quality' - can we lay that one to rest once and for all? If you are viewing at the same size, whether on screen or in print, increasing MP will not harm image quality. Noise per pixel might go up, but image noise does not. Sharpness is not harmed, and may be helped. Diffraction and camera shake are again only affected viewed at 100%. The only objective downsides are lower frame rates (compared to a camera of lower resolution from the same era) and higher file sizes. They are not insignificant, but they don't affect image quality.
We cannot lay it to rest because physics will not let it lie. The 5Ds & R have IQ (noise, DR) closer to a crop body because they have the same smaller pixels and greater pixel density compared to a 20-something MP FF body. Camera shake is real, and Canon spent time and money redesigning their mirror system trying to reduce shake for the 5Ds and R exactly because it has high MPs. Numerous photographers have recommended using faster shutter speeds because they are necessary to avoid the blur. These things are real - not just at the pixel level. The 50 MP 5Ds and R have given us real-life examples of both the positive (higher resolution and sharpness) and the negative (more noise, lower DR, more camera shake blur). That's why some of us don't want more than 20-something MPs in our FF cameras.
Facepalm... You can't say you have physics on your side and then not use any physics.
Obviously a 50 Mpix image cropped to 20 Mpix and then printed at the same size as an original 20 Mpix image will look worse, because that in effect magnifies any problems seen at the pixel level. But a 50 Mpix image printed at high res, or downsampled and properly dithered, will look at least as good as, and probably better than, a 20 Mpix image printed at the same size, because it will in effect compress any pixel-level issues.
50 Mpix may not be practical or ideal for everyone, but the claim that more pixels = worse quality is counter to all logic.
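The downsampling argument is easy to check numerically. Below is a minimal sketch (toy numbers, not real sensor data): a high-res grid with per-pixel noise is block-averaged down to a quarter of the pixel count, and the per-pixel noise shrinks by roughly the square root of the block area, which is why image-level noise at a fixed output size doesn't get worse with more pixels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a high-res sensor: constant signal plus
# Gaussian read noise of sigma = 10 per pixel.
signal = 100.0
high = signal + rng.normal(0.0, 10.0, size=(1000, 1000))

# Downsample by averaging 2x2 blocks (4 input pixels -> 1 output pixel),
# i.e. view the same image at a quarter of the pixel count.
down = high.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(f"per-pixel noise, high-res:    {high.std():.2f}")  # ~10
print(f"per-pixel noise, downsampled: {down.std():.2f}")  # ~10/sqrt(4) = ~5
```

Averaging n independent noisy pixels divides the noise standard deviation by sqrt(n), so the 2x2 average lands near sigma = 5: the pixel-level noise penalty of the denser sensor washes out once both images are viewed at the same size.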
Well said, geekpower. Sorry, dak, but you're just wrong on this one. The *whole captured image*, i.e. the output from the whole sensor, viewed at the same size, will show the same amount of shake (the same applies to image-level noise, but that also depends on the sensors being from the same generation, using the same technology, the shots being taken at the same ISO, processed the same, etc). Think of it this way: let's say camera shake is 10 microns. That's the same distance relative to the sensor no matter how the sensor is subdivided into pixels. Higher resolution sensors show it more *at 100%* because they can resolve more detail. But as I clearly said, we're talking about images viewed at the same size, not at 100%.
Even two images taken with the same camera may be indistinguishable in terms of camera shake until you view them at a high enough magnification. When I upload a batch of images, I can discard ones with major shake-induced blur (or softness due to misfocusing) without zooming in. But for very fine differences, I have to view at full resolution, because the shake is below the level that is detectable when viewed at normalised scale (whatever it happens to be). And indeed, if you downsize an image with blurring below a certain threshold, the blurring is no longer detectable. The principle is the same.