« on: May 22, 2013, 02:09:58 AM »
Yes, it is supposed to be as sharp. (Note that this compares Nikon vs. Canon versions, but results should be comparable because they're using the old D3s to evaluate the Nikon lens.)
At f/1.4 the Sigma ranges from 3.0K to 3.2K. At f/2.0 the Sigma ranges from 3.0K to 3.7K. At f/2.8 the Sigma ranges from 3.2K to almost 3.9K (line widths per picture height).
At f/2.8 the Canon 70-200 ranges from 3.1K to 3.5K. The all-around peak of the Sigma is 3,960 at f/4.0, while for the Canon it is 3,721 at f/5.6 at 70mm.
Ummmm....no. Ignore for the moment the fact that Klaus specifically states, "Please note that the tests results are not comparable across the different systems." The Sigma 35/1.4 was not tested on 'the old D3s' (a 12 MP FX camera) but on the D3x, a 24 MP FX camera. Those extra 3 MP directly translate to an increase in LW/PH compared to the 21 MP Canon 5DII, all else being equal.
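The resolution-ceiling point above can be sketched with a little arithmetic. This is a simplified model, not how Imatest actually scores lenses: it assumes the LW/PH ceiling is simply the sensor's vertical pixel count (one line width per pixel row), ignoring the AA filter and demosaicing losses. The pixel dimensions are the published specs for each body.

```python
def lwph_ceiling(height_px: int) -> int:
    """Theoretical max line widths per picture height:
    at best, one line width can be resolved per pixel row."""
    return height_px

# Published sensor dimensions for the two bodies being compared
d3x_height = 4032     # Nikon D3x: 6048 x 4032 (24.5 MP)
eos_5d2_height = 3744 # Canon 5D Mark II: 5616 x 3744 (21.1 MP)

ratio = lwph_ceiling(d3x_height) / lwph_ceiling(eos_5d2_height)
print(f"D3x ceiling:   {lwph_ceiling(d3x_height)} LW/PH")
print(f"5D II ceiling: {lwph_ceiling(eos_5d2_height)} LW/PH")
print(f"D3x resolution advantage: {(ratio - 1) * 100:.1f}%")
```

Under that assumption the D3x starts with roughly a 7–8% head start in LW/PH over the 5D Mark II, before the lens is even considered, which is why cross-system scores are not directly comparable.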
Comparing the TDP ISO 12233 crops (where the Canon version of the Sigma lens was tested on the same camera as the Canon lens), is the Sigma as sharp? Nope.
Your correction about the D3s vs. D3x is valuable. I meant to write D3x and was thinking of that model, but out of habit my fingers typed D3s without my noticing.
However, my respect went down (a little) for you, neuroanatomist, because you suggest ignoring Imatest results in favor of subjective pictures, which, no matter how carefully made, cannot be used for a valid comparison.
It makes no difference whether it was an ISO 12233 chart, whether it was originally shot in RAW, whether it was made on a concrete floor with a $25K setup, etc.
All those things may be true, and yet the apparent sharpness of the output has more to do with how sharp it looks to our eyes than with even a 33% difference in lens sharpness. Also, I know for a fact that some of the cheaper lenses were not properly focused on the point shown in the cropped image area (by accident, I'm sure; I know it is an immense time sacrifice, and I am not trying to criticize TDP in any way), while the more expensive lenses like the 200mm f/2.0 were properly focused. That alone makes a night-and-day difference when pixel peeping.
Additionally, some of the lenses shown by TDP have field curvature, which makes them look blurrier than they would if a focus point had actually been placed at that spot in the frame. This is just another reason why looking at a photo and letting our imperfect senses judge sharpness simply doesn't work.
As a case in point, the Canon 85mm f/1.8 images show terrible softness in the images at TDP (they never look anywhere near as sharp as some of the other lenses do, even when stopped down to f/8!), but I know for a fact that they are not so bad.
Once again, I know that it is an immense time sacrifice to make those visualizations of lens quality, and I am not trying to criticize TDP in any way. But I am urging that proper scientific metrics should be used when discussing lens sharpness, not looking at a subjective picture and saying, "Nope."
We have to use logic first before allowing ourselves to make a decision based on a picture. Case in point: some disreputable websites have for years used a sleight-of-hand trick to "prove" to people that RAW is sharper, by applying "default" sharpening to JPEG output that is actually much weaker than the "default" sharpening applied to RAW.

Saying that RAW is sharper makes no more sense than saying that RAW has better white balance. Yes, RAW has more data in it to be used for any purpose, but both sharpness and white balance are determined by processing after the fact. (You can never see RAW data; you can only see a visualization of RAW data.) So any comparison claiming RAW is sharper than JPEG is by definition a comparison of two different levels of sharpening applied to RAW data, and therefore makes no logical sense beyond saying that one amount of sharpening looks different from another.

The same is true for comparing the sharpness of lenses subjectively by looking at pictures on a website with our eyes. It just makes no sense.