...the MA value on the 7D was something like -92...
I ran into this value as well, but I can't really tell whether it actually overwrites the MFA value or if it's just a glitch. It was a bit of a surprise to find that value in the menu, though.
So far I'm quite happy with the software, but as an "early adopter" I've run into stability problems. Not while testing, which went fine, but crashes are still pretty frequent. Hopefully that will change with future releases.
I did at least two runs at each focal length; on my 24-70 I tested at 24mm, 35mm, 50mm and 70mm, again at least twice each. So even though the process is pretty much automated, it still took a fair amount of time. I also saved the analysis curves for each test, and I'm going to do a writeup about my findings on my blog. I'll post a link once it's done, but that's going to take a while since I'm quite busy with things other than photography at the moment.
Anyway, here are the values I got:
Canon EF-S 10-22mm f/3.5-4.5:
- @10mm: -7, -7
- @22mm: -3, -2
All in all, pretty consistent results. I had no MFA dialed in for this lens, so I went from 0 to -5. That's a close enough compromise between the two ends, since DOF isn't really an issue here. This may change once I try the new value in real life.
Canon EF 24-70mm f/2.8L:
- @24mm: -8, -6 (I wasn't sure about the -6 run, so I did a third one, which came in at -9)
- @35mm: -10, -5 (same uncertainty here, but I didn't run a third test)
- @50mm: -11, -11 (prediction was at -12, but -11 was the selection)
- @70mm: -14, -15 (I changed to the bigger target here, so I'm not sure about the values)
These results were again pretty inconsistent, which I expected from this lens. The good news is that they weren't all over the chart: the lens consistently needs a minus adjustment. I'll have to do more testing with this lens, but since I didn't have anything dialed in before, needless to say it's gotten way better. So this one went from 0 to -10. I'll fiddle with it in real situations and do more testing when I have the time.
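Since a zoom gives different results at each focal length but the camera only stores one adjustment per lens, the dialed-in value has to be a compromise. As a rough illustration of one way to pick it (this averaging scheme is my own back-of-the-envelope approach, not anything the software itself computes), averaging the runs at each focal length and then averaging those per-focal-length means happens to land on the values I chose:

```python
# Sketch of picking one compromise MFA value for a zoom lens.
# The averaging scheme is my own assumption, not part of the software.

def compromise_mfa(results):
    """results maps focal length (mm) -> list of measured MFA values."""
    # Average the repeated runs at each focal length first...
    per_focal = [sum(runs) / len(runs) for runs in results.values()]
    # ...then average across focal lengths and round to a whole MFA step.
    return round(sum(per_focal) / len(per_focal))

# The numbers from my tests above:
print(compromise_mfa({10: [-7, -7], 22: [-3, -2]}))              # -5
print(compromise_mfa({24: [-8, -6], 35: [-10, -5],
                      50: [-11, -11], 70: [-14, -15]}))          # -10
```

Of course a plain average ignores which focal lengths actually get used most, so it's only a starting point before real-world checking.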
Canon EF 70-200mm f/2.8L IS II:
- @70mm: -2, -2
- @200mm: -2, -2
As consistent as they come. I had this at -3 before, so I'll change it to -2 and see if that's better.
Canon EF 300mm f/4:
- @300mm (duh): -8 (the only good run)
This focal length combined with the 15m testing distance proved problematic, to say the least. It would have gone better with stronger lighting on the target. I'm guessing there was too much vibration to get decent readings, so I'll have to redo this in far better light. For the record, I was using daylight-balanced fluorescent lights on both sides of the target to get even lighting, but it wasn't enough to reach a decent shutter speed for the 300mm.
Sigma 30mm f/1.4:
- @30mm: -2, -2
This one is interesting: I had it at +3 before, and +1, +2 and +3 score higher on the chart than -2, but they are way off the predicted MFA curve. So I was "right" to set it at +3 since it's the sharper option, but that's probably down to inconsistency in the AF motor. Hopefully this setting of -2 gives consistent results, even though it's not as sharp as +3. I'll have to try it out in real life.
In conclusion, even though the software makes the testing easy and at least somewhat repeatable, there are still decisions to be made when setting the "optimal" MFA value. I'm pleased with the software: it does what it promises and at least gives us a point of reference and the means to make an educated guess about the right value (if there ever is one).
Stepping back from the individual tests, there's consistency in that every lens needed a minus adjustment, so there were no wild measurement errors or seriously out-of-whack lenses. I'll probably end up doing more testing with much brighter lighting on the target to see if that makes a difference, but so far I'm happy to have been pushed in the right direction.