DPReview of the 80D

dilbert said:
Monchoon said:
scyrene said:
dilbert said:
Using italics correctly means not using them for emphasis.

What on earth are you talking about?

It's Dilbert, what more can one say? Maybe he'll be able to understand this.

Italics are a way to emphasise key points in a printed text, or when quoting a speaker a way to show which words they stressed.

If you read "What on earth are you talking about?" aloud, would you say the word "earth" louder?
Or would you just use a different inflection in its pronunciation?
Do both represent emphasis?

In fact it can be used in the opposite way too.

But I'm not sure if anyone else has seen that style used.

Maybe I still read newspapers too much (or at least more than most others here) to be comfortable in using italics for emphasis.

Maybe. I think of it more as a literary form; I see it much more in books than on the internet - but then, italics still aren't possible on many communication platforms (texts, tweets, some instant messaging programs). I personally go for asterisks (what on *earth* are you talking about?) because it's quicker to type. But they all work for this purpose - including capitals. Although I'd still interpret the latter as more emphatic and shouty. But norms are still developing.

I rather like it when camera talk gets derailed like this :)
 
Upvote 0
rishi_sanyal said:
rrcphoto said:
rishi_sanyal said:
Mr. Low Notes said:
It looks like in the review the shooter doesn't get an initial lock...so the camera and lens are then struggling to re-adjust. This shouts of a poor technique from the reviewer...he's only a press hack I guess.


Leaving aside the fact that that's simply insulting to our reviewer (not me), we repeat every test at least 3x. We always initiate focus on the biker while he's static, then ask him to start coming towards the camera. If it didn't get an 'initial lock' on 3 tries on a static biker, you might say there's a problem. And that was with single-point 'One Shot'.


So you test AI Servo by being in One Shot mode? Or are you in single-point AI Servo?

Or do you acquire your initial AI Servo lock on a stationary object and then have that object move? For AI Servo, which is predictive distance and speed tracking, that's a test that is assured to fail, or at least one the camera (or lens) will struggle with, especially depending on your AF case selection.


Oops, it was late. I meant AI Servo single point.

For subject tracking tests we use the full AF array with 'Manual' initial point selection, where you pre-specify your subject by initiating focus on it.

Why would the scenario you described fail? Surely if an object is stationary and then starts moving, the camera's subject tracking system might be expected to follow it?


Here's an example of one of many fundamental problems with you guys. Good luck, I'm not reading this garbage anymore. My blood pressure just goes way up every time I do, so I suppose I'm the real idiot for even reading this crap.
 
Upvote 0
bdunbar79 said:
rishi_sanyal said:
rrcphoto said:
rishi_sanyal said:
Mr. Low Notes said:
It looks like in the review the shooter doesn't get an initial lock...so the camera and lens are then struggling to re-adjust. This shouts of a poor technique from the reviewer...he's only a press hack I guess.


Leaving aside the fact that that's simply insulting to our reviewer (not me), we repeat every test at least 3x. We always initiate focus on the biker while he's static, then ask him to start coming towards the camera. If it didn't get an 'initial lock' on 3 tries on a static biker, you might say there's a problem. And that was with single-point 'One Shot'.


So you test AI Servo by being in One Shot mode? Or are you in single-point AI Servo?

Or do you acquire your initial AI Servo lock on a stationary object and then have that object move? For AI Servo, which is predictive distance and speed tracking, that's a test that is assured to fail, or at least one the camera (or lens) will struggle with, especially depending on your AF case selection.


Oops, it was late. I meant AI Servo single point.

For subject tracking tests we use the full AF array with 'Manual' initial point selection, where you pre-specify your subject by initiating focus on it.

Why would the scenario you described fail? Surely if an object is stationary and then starts moving, the camera's subject tracking system might be expected to follow it?


Here's an example of one of many fundamental problems with you guys. Good luck, I'm not reading this garbage anymore. My blood pressure just goes way up every time I do, so I suppose I'm the real idiot for even reading this crap.

+1

I bought a 7D2......
less than 10 percent of pictures in focus..... crappiest AF system EVER!
Then I read the AF system booklet.....
I practiced and played with it.....
AF hit rate is now greater than 90% and I still don't know it as well as I should....

The problem with complex AF systems is that if you do not take the time to learn them well, you are better off with the camera left in auto mode......
 
Upvote 0
Don Haines said:
bdunbar79 said:
Here's an example of one of many fundamental problems with you guys. Good luck, I'm not reading this garbage anymore. My blood pressure just goes way up every time I do, so I suppose I'm the real idiot for even reading this crap.
+1

I bought a 7D2......
less than 10 percent of pictures in focus..... crappiest AF system EVER!
Then I read the AF system booklet.....
I practiced and played with it.....
AF hit rate is now greater than 90% and I still don't know it as well as I should....

The problem with complex AF systems is that if you do not take the time to learn them well, you are better off with the camera left in auto mode......

Well, that's what happens when you try to set your Canon dSLR to AF-C mode.....
 
Upvote 0
neuroanatomist said:
Don Haines said:
bdunbar79 said:
Here's an example of one of many fundamental problems with you guys. Good luck, I'm not reading this garbage anymore. My blood pressure just goes way up every time I do, so I suppose I'm the real idiot for even reading this crap.
+1

I bought a 7D2......
less than 10 percent of pictures in focus..... crappiest AF system EVER!
Then I read the AF system booklet.....
I practiced and played with it.....
AF hit rate is now greater than 90% and I still don't know it as well as I should....

The problem with complex AF systems is that if you do not take the time to learn them well, you are better off with the camera left in auto mode......

Well, that's what happens when you try to set your Canon dSLR to AF-C mode.....

Neuro and Don H. you guys are a breath of fresh air!
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:
 

Attachments

  • DPP_0818.JPG (1.7 MB)
  • DPP_0834.JPG (123.3 KB)
  • DPP_0931.JPG (2.1 MB)
Upvote 0
FramerMCB said:
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:

Oh, that was just dilbert. He gave reality a sidelong glance a few years back, didn't like what he saw, and has just ignored it ever since.

Nice shots!
 
Upvote 0
FramerMCB said:
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:
It's true!

One of the things that I have learned from the collective wisdumb is that you can only shoot under sunny skies and with a camera that has more than 14 stops of DR.....
 

Attachments

  • StormySunset.jpg (742.6 KB)
  • Storm2B.jpg (186.8 KB)
  • Storm2a.jpg (334.9 KB)
Upvote 0
neuroanatomist said:
FramerMCB said:
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:

Oh, that was just dilbert. He gave reality a sidelong glance a few years back, didn't like what he saw, and has just ignored it ever since.

Nice shots!

Thanks!
 
Upvote 0
Don Haines said:
FramerMCB said:
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:
It's true!

One of the things that I have learned from the collective wisdumb is that you can only shoot under sunny skies and with a camera that has more than 14 stops of DR.....

Love these! Especially #1 and #3. Mine were shot in June of 2014 using the venerable Canon 40D with a 70-200mm f/2.8L IS (first gen.), at Silverstar Mountain in SW Washington, not too far from Battleground.
 
Upvote 0
FramerMCB said:
Don Haines said:
FramerMCB said:
For the sunshine-only shooter posting earlier in this thread about gray skies being poor shooting conditions...he better stick to a smaller paintbrush in his statements. Here's why:
It's true!

One of the things that I have learned from the collective wisdumb is that you can only shoot under sunny skies and with a camera that has more than 14 stops of DR.....

Love these! Especially #1 and #3. Mine were shot in June of 2014 using the venerable Canon 40D with a 70-200mm f/2.8L IS (first gen.), at Silverstar Mountain in SW Washington, not too far from Battleground.
#1 was shot out back, 60D and the 17-55 lens, 8 shots stitched together in a panorama.... just before I got soaked.
#2 was also shot with the 60D and the Sigma 10-20mm wide angle zoom.... just before I got soaked.
#3 was shot with a 5D2 and 24-105..... and I didn't get soaked :)

I like number 2 of yours a lot...... there is a certain quality to the light when it is raining and things are wet.
 
Upvote 0
So in an attempt to get back to reviews.....

Much earlier I mentioned that I would like to see reviews not fixate so much on ISO 100 and to include testing and usage at more extreme settings. I thought the reason was clear, but let me state it again: The vast majority of camera owners shoot jpg images in automatic mode. The vast majority of camera owners shoot with entry level DSLRs, point/shoot cameras, and phones. The vast majority of these people are perfectly happy with their images, and will never go online to read review sites or visit places like Canon Rumours.

The target market for these review sites (and CR) is enthusiasts. Some of us are pros, some of us amateurs, but we are all enthusiasts. As enthusiasts, we like to "compare the numbers" and debate things. We have grown beyond the point where we leave our camera in AUTO and let it do all the work all the time. Most of us are using (or learning to use) our camera systems under more challenging conditions. As such, we go out and shoot in the cold and rain (snow?), we shoot under sub-optimal lighting like overcast skies, after sunset, at poorly lit performances, and at night, and we chase fast-moving creatures. We are going to push the ISO high, we are going to go past where IS helps us, we are going to go where 5 stops of DR is all we need, and we are going to go where 15 stops is not enough. We are going to go where AF systems can't keep up....

To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

Quite frankly, under perfect conditions, it really does not matter which camera you get.... they all work great and few can tell the difference. Go push things until they break! Report on it! That's where the story lies....
 
Upvote 0
bdunbar79 said:
rishi_sanyal said:
rrcphoto said:
rishi_sanyal said:
Mr. Low Notes said:
It looks like in the review the shooter doesn't get an initial lock...so the camera and lens are then struggling to re-adjust. This shouts of a poor technique from the reviewer...he's only a press hack I guess.


Leaving aside the fact that that's simply insulting to our reviewer (not me), we repeat every test at least 3x. We always initiate focus on the biker while he's static, then ask him to start coming towards the camera. If it didn't get an 'initial lock' on 3 tries on a static biker, you might say there's a problem. And that was with single-point 'One Shot'.


So you test AI Servo by being in One Shot mode? Or are you in single-point AI Servo?

Or do you acquire your initial AI Servo lock on a stationary object and then have that object move? For AI Servo, which is predictive distance and speed tracking, that's a test that is assured to fail, or at least one the camera (or lens) will struggle with, especially depending on your AF case selection.


Oops, it was late. I meant AI Servo single point.

For subject tracking tests we use the full AF array with 'Manual' initial point selection, where you pre-specify your subject by initiating focus on it.

Why would the scenario you described fail? Surely if an object is stationary and then starts moving, the camera's subject tracking system might be expected to follow it?


Here's an example of one of many fundamental problems with you guys. Good luck, I'm not reading this garbage anymore. My blood pressure just goes way up every time I do, so I suppose I'm the real idiot for even reading this crap.


I don't follow - what's the fundamental problem? That it was midnight and I accidentally typed 'One Shot' instead of 'AI Servo'? Or that we expect a Canon DSLR to be able to follow a subject that goes from stationary to moving, something most advanced cameras today can do quite successfully, including Canon's own 5DS (https://www.youtube.com/watch?v=E1eBgQt9sOU)?
 
Upvote 0
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

? But that's exactly why we have our studio and dynamic range tests that test both low and high ISO. And our bike and closer distance human face/mannequin AF tests that people love to mock: they actually do differentiate the various cameras and show when certain AF abilities break. We're continuing to work to develop more challenging, rigorous AF tests that are repeatable, that particularly stress things like initial AF acquisition, or subject tracking acquisition - things that are particularly important to sports/action photographers. But whatever we do, it needs to be a repeatable, controlled test, else you'll never be able to compare camera A released today to camera B from 3 years ago.

Every now and then we supplement with a real-world reality check - which is why we have our 'real-world dynamic range' shootouts from time to time. Or real-world sports shootout (coming for the 1DX II vs D5). But those real-world tests can only A/B test; they can never provide repeatable tests you can refer back to compare any camera to any other camera. For that you need controlled comparisons.

That's just a fundamental reality of testing.

I think we're exactly on the same page for what photographers need, perhaps just not seeing eye-to-eye on how to get there.
 
Upvote 0
rishi_sanyal said:
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

? But that's exactly why we have our studio and dynamic range tests that test both low and high ISO. And our bike and closer distance human face/mannequin AF tests that people love to mock: they actually do differentiate the various cameras and show when certain AF abilities break. We're continuing to work to develop more challenging, rigorous AF tests that are repeatable, that particularly stress things like initial AF acquisition, or subject tracking acquisition - things that are particularly important to sports/action photographers. But whatever we do, it needs to be a repeatable, controlled test, else you'll never be able to compare camera A released today to camera B from 3 years ago.

Every now and then we supplement with a real-world reality check - which is why we have our 'real-world dynamic range' shootouts from time to time. Or real-world sports shootout (coming for the 1DX II vs D5). But those real-world tests can only A/B test; they can never provide repeatable tests you can refer back to compare any camera to any other camera. For that you need controlled comparisons.

That's just a fundamental reality of testing.

I think we're exactly on the same page for what photographers need, perhaps just not seeing eye-to-eye on how to get there.
And if you go way back in this thread to an earlier comment, I think that of the review sites, you are doing the best job.....

Because camera systems are so complex, there is no fair way to do a quantitative comparison between brands, and sometimes even within brands. Feel and impression become very important. Things like your standard test scene and ISO comparisons provide great standardized comparisons......but moving beyond that is a nightmare if you wish consistency.....but is consistency really needed when you move beyond? After the standardized tests, could you have something extra that is specific to just that camera where you push things and play with it?

Just thinking out loud....
 
Upvote 0
Don Haines said:
rishi_sanyal said:
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

? But that's exactly why we have our studio and dynamic range tests that test both low and high ISO. And our bike and closer distance human face/mannequin AF tests that people love to mock: they actually do differentiate the various cameras and show when certain AF abilities break. We're continuing to work to develop more challenging, rigorous AF tests that are repeatable, that particularly stress things like initial AF acquisition, or subject tracking acquisition - things that are particularly important to sports/action photographers. But whatever we do, it needs to be a repeatable, controlled test, else you'll never be able to compare camera A released today to camera B from 3 years ago.

Every now and then we supplement with a real-world reality check - which is why we have our 'real-world dynamic range' shootouts from time to time. Or real-world sports shootout (coming for the 1DX II vs D5). But those real-world tests can only A/B test; they can never provide repeatable tests you can refer back to compare any camera to any other camera. For that you need controlled comparisons.

That's just a fundamental reality of testing.

I think we're exactly on the same page for what photographers need, perhaps just not seeing eye-to-eye on how to get there.
And if you go way back in this thread to an earlier comment, I think that of the review sites, you are doing the best job.....

Because camera systems are so complex, there is no fair way to do a quantitative comparison between brands, and sometimes even within brands. Feel and impression become very important. Things like your standard test scene and ISO comparisons provide great standardized comparisons......but moving beyond that is a nightmare if you wish consistency.....but is consistency really needed when you move beyond? After the standardized tests, could you have something extra that is specific to just that camera where you push things and play with it?

Just thinking out loud....

Thanks Don, and I totally hear you. It is a nightmare beyond a certain point, but perhaps still doable to a certain extent. Without it, quality is just so dependent on the reviewer and aggregate knowledge. I suppose what we're trying to do is quantify what we can quantify, then leave the rest to the reviewer and maybe even user opinions. I kind of wonder if we could have a portion of the reviews dedicated to user tests - where readers are encouraged to submit their tests, especially if there's some collective wisdom that can serve as guidance for how to do certain tests (I'm mostly thinking AF here - since that's the really tough one to tackle).

But at least things like AF acquisition times for different lenses with a master body, and different bodies with a master lens, under different controlled light - stuff like that we should standardize just like we do our studio scene.

Just thinking out loud here as well...
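Purely as an illustration of the kind of standardization being talked about here (this is a hypothetical sketch of my own, not DPReview's actual test harness, and the file name and column names are invented), repeated acquisition-time trials for each body/lens/light combination could be reduced to a comparable mean and spread something like this:

```python
# Hypothetical sketch: summarize repeated AF-acquisition trials per
# (body, lens, light level) combination. The CSV layout and column
# names are invented for illustration only.
import csv
import statistics
from collections import defaultdict

def summarize(path):
    trials = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["body"], row["lens"], row["light_ev"])
            trials[key].append(float(row["acquisition_ms"]))

    for (body, lens, ev), times in sorted(trials.items()):
        mean = statistics.mean(times)
        sd = statistics.stdev(times) if len(times) > 1 else 0.0
        print(f"{body} + {lens} @ EV {ev}: {mean:.0f} ms ± {sd:.0f} ms ({len(times)} runs)")

summarize("af_acquisition_trials.csv")
```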
 
Upvote 0
rishi_sanyal said:
Don Haines said:
rishi_sanyal said:
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

? But that's exactly why we have our studio and dynamic range tests that test both low and high ISO. And our bike and closer distance human face/mannequin AF tests that people love to mock: they actually do differentiate the various cameras and show when certain AF abilities break. We're continuing to work to develop more challenging, rigorous AF tests that are repeatable, that particularly stress things like initial AF acquisition, or subject tracking acquisition - things that are particularly important to sports/action photographers. But whatever we do, it needs to be a repeatable, controlled test, else you'll never be able to compare camera A released today to camera B from 3 years ago.

Every now and then we supplement with a real-world reality check - which is why we have our 'real-world dynamic range' shootouts from time to time. Or real-world sports shootout (coming for the 1DX II vs D5). But those real-world tests can only A/B test; they can never provide repeatable tests you can refer back to compare any camera to any other camera. For that you need controlled comparisons.

That's just a fundamental reality of testing.

I think we're exactly on the same page for what photographers need, perhaps just not seeing eye-to-eye on how to get there.
And if you go way back in this thread to an earlier comment, I think that of the review sites, you are doing the best job.....

Because camera systems are so complex, there is no fair way to do a quantitative comparison between brands, and sometimes even within brands. Feel and impression become very important. Things like your standard test scene and ISO comparisons provide great standardized comparisons......but moving beyond that is a nightmare if you wish consistency.....but is consistency really needed when you move beyond? After the standardized tests, could you have something extra that is specific to just that camera where you push things and play with it?

Just thinking out loud....

Thanks Don, and I totally hear you. It is a nightmare beyond a certain point, but perhaps still doable to a certain extent. Without it, quality is just so dependent on the reviewer and aggregate knowledge. I suppose what we're trying to do is quantify what we can quantify, then leave the rest to the reviewer and maybe even user opinions. I kind of wonder if we could have a portion of the reviews dedicated to user tests - where readers are encouraged to submit their tests, especially if there's some collective wisdom that can serve as guidance for how to do certain tests (I'm mostly thinking AF here - since that's the really tough one to tackle).

But at least things like AF acquisition times for different lenses with a master body, and different bodies with a master lens, under different controlled light - stuff like that we should standardize just like we do our studio scene.

Just thinking out loud here as well...

++1
Rishi...you are a brave soul. These waters are shark-infested, to say the least. But kudos to you sir! And to Mr. Don H. and a few others for being sane, rational, and logical. Oh, and civil!!! I'm old school, and too many putzes on social media think that they can be as rude and uncivil as they want.

I think you guys are both on to something with your prognostications concerning testing "outside the box", so to speak. The bottom line is, reviews and review sites are a mixed bag. So I typically rely on a few different ones I trust. For lenses I rely on Bryan Carnathan over at thedigitalpicture.com and Dustin Abbott at dustinabbott.com. And Lensrentals - I like it when they take lenses apart. I also like DPReview because I can read about a broad spectrum of different kinds of camera equipment and brands.

Anyhoo...my two cents.
 
Upvote 0
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

Well, DPReview showed high ISO studio comparison shots for the 1DX2: http://www.dpreview.com/news/8090146652/canon-eos-1d-x-mark-ii-studio-tests/1.

Their observation:
"Although the 1D-X II shows significant increase in dynamic range at low ISOs in our dynamic range tests, high ISO Raw performance remains fairly similar to its predecessor, falling behind the Nikon D5, and even slightly behind the 42MP Sony a7R II, at very high ISOs."

My observations based on the above test shots are consistent with what this poster said:
"Having looked at every square inch of the test scene at ISO 25600 (RAW) if anyone can show me a measurable difference between the 1DXII and D5 I'm a monkey's uncle. The detail resolved and noise levels are so close it's not funny. The Nikon jpg engine is better for sure, but in RAW it evens out. Also how they can say the A7RII shows less noise at high ISO is beyond a joke. If you downsample to 20MP it'll be close of course, but at 42MP the luma and chroma noise is much higher."
- http://www.fredmiranda.com/forum/topic/1428206/1#13543956

That's 'unbiased' DPReview for you.
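As a side note on the "downsample to 20MP and it'll be close" point: averaging pixels during downsampling reduces per-pixel noise roughly with the square root of the number of pixels averaged, which is why a 42MP file viewed at 100% looks noisier than the same scene resized to 20MP. A minimal synthetic sketch of the effect, with made-up noise levels rather than real sensor data (and not DPReview's methodology):

```python
# Synthetic illustration: downsampling by averaging reduces apparent
# per-pixel noise, so high-MP vs low-MP noise comparisons depend on
# whether you normalize the viewing size.
import numpy as np

rng = np.random.default_rng(0)
signal = 100.0                                            # uniform grey patch

hi_res = signal + rng.normal(0, 10, size=(2048, 2048))    # more pixels, noisier per pixel
lo_res = signal + rng.normal(0, 7, size=(1024, 1024))     # fewer pixels, cleaner per pixel

# Crude 2x downsample of the high-res patch by averaging 2x2 blocks
down = hi_res.reshape(1024, 2, 1024, 2).mean(axis=(1, 3))

print(f"hi-res per-pixel noise:           {hi_res.std():.2f}")
print(f"lo-res per-pixel noise:           {lo_res.std():.2f}")
print(f"hi-res noise after 2x downsample: {down.std():.2f}")  # ~10/2, now below lo-res
```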
 
Upvote 0
rishi_sanyal said:
Don Haines said:
rishi_sanyal said:
Don Haines said:
To do a camera review and not step out into more challenging conditions is to do a great disservice to your target audience. If all we wanted was a camera that took great pictures under perfect conditions, we would all be shooting with phones or kit cameras. We are looking at cameras like the 1DX 2, the 5D4, the 7D2, and yes, the 80D, because those cameras are more capable when we start pushing things.

? But that's exactly why we have our studio and dynamic range tests that test both low and high ISO. And our bike and closer distance human face/mannequin AF tests that people love to mock: they actually do differentiate the various cameras and show when certain AF abilities break. We're continuing to work to develop more challenging, rigorous AF tests that are repeatable, that particularly stress things like initial AF acquisition, or subject tracking acquisition - things that are particularly important to sports/action photographers. But whatever we do, it needs to be a repeatable, controlled test, else you'll never be able to compare camera A released today to camera B from 3 years ago.

Every now and then we supplement with a real-world reality check - which is why we have our 'real-world dynamic range' shootouts from time to time. Or real-world sports shootout (coming for the 1DX II vs D5). But those real-world tests can only A/B test; they can never provide repeatable tests you can refer back to compare any camera to any other camera. For that you need controlled comparisons.

That's just a fundamental reality of testing.

I think we're exactly on the same page for what photographers need, perhaps just not seeing eye-to-eye on how to get there.
And if you go way back in this thread to an earlier comment, I think that of the review sites, you are doing the best job.....

Because camera systems are so complex, there is no fair way to do a quantitative comparison between brands, and sometimes even within brands. Feel and impression become very important. Things like your standard test scene and ISO comparisons provide great standardized comparisons......but moving beyond that is a nightmare if you wish consistency.....but is consistency really needed when you move beyond? After the standardized tests, could you have something extra that is specific to just that camera where you push things and play with it?

Just thinking out loud....

Thanks Don, and I totally hear you. It is a nightmare beyond a certain point, but perhaps still doable to a certain extent. Without it, quality is just so dependent on the reviewer and aggregate knowledge. I suppose what we're trying to do is quantify what we can quantify, then leave the rest to the reviewer and maybe even user opinions. I kind of wonder if we could have a portion of the reviews dedicated to user tests - where readers are encouraged to submit their tests, especially if there's some collective wisdom that can serve as guidance for how to do certain tests (I'm mostly thinking AF here - since that's the really tough one to tackle).

But at least things like AF acquisition times for different lenses with a master body, and different bodies with a master lens, under different controlled light - stuff like that we should standardize just like we do our studio scene.

Just thinking out loud here as well...

AF performance means more to me than DR, and I'm glad to see AF being discussed. I'm not saying it isn't or hasn't been, but I would like to see it given a little more priority. An OOF photo is no good no matter how good the camera's DR, but that's a no-brainer of course.

Rishi, I want you to know that my being critical of DPR isn't and has never been solely directed at you or anyone in particular at DPR. I feel that the DR obsession in general takes away from other important things. And the trolls feed the flames....

You talking about AF performance put a smile on my face. :-) When I go to buy a new camera, that is something that will be more important to me than DR. And I will be going to DPR along with a few other sites to read reviews of cameras that I might be interested in buying. I think we are on the same page in general. I know we agreed in the past that all cameras have a target audience and their strengths and weaknesses. A camera is just a tool. But to some I think it's a status symbol.

Patrick aka- MLN
 
Upvote 0
dilbert said:
How do you do a meaningful comparison between two sets of results if there is no consistency?

I do agree with dilbert on this one :o.

But people are treating this as though DPR is the only review they should read.
I like to read DPR because it gives me an idea of the camera's technical capability, and I read other sites for the user experience; the two complement each other. So, for example, their noise tests show what the camera can do in studio conditions, but I have only ever bought a camera after the field tests have taken place, and I have a group of photographer sites whose opinions I trust. I can then see their comments on whether that difference in the studio makes a difference to me in the real world.

As an example from direct experience, one review of the 7D2 showed that the camera had only a 1-stop difference in noise over the original 7D (not too impressive), but a field review explained that, for that reviewer, the way the noise rendered on screen made it more visually acceptable and was effectively almost a 2-stop improvement when it came to usable photos. And that is what I found when I bought the 7D2.

I find that after reading 'benchmark' sites like DPR, subjective comments from very experienced photographers along the lines of 'I did not compare side by side but felt the AF hit rate was higher' or 'the raw files were nicer to work with' make much more sense.
 
Upvote 0