I keep asking for an example where the software correction is a visible let-down even when pixel-peeping at 1:1, and I haven't seen one yet. Instead someone will say something like "oh gosh, no, we don't want our stars to be stretched, I'll never buy this lens," but I haven't actually seen what this stretched-star problem even is. The correction isn't stretching them OUT of shape, but rather simply stretching them back INTO shape. Granted, this kind of software correction will always cost you about a half-pixel of sharpness on average, but it also gives the designers far more freedom with the lens design, and with reduced constraints on geometry it's not clear to me that the optics couldn't be made enough sharper that the corrected result beats an uncorrected lens.
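For anyone curious where that "half-pixel of sharpness" figure comes from: a distortion correction is just a sub-pixel resampling, and resampling smears a point source across neighboring pixels. Here's a toy sketch (just NumPy, with a made-up one-pixel "star" -- not from any real lens profile) showing the worst case, a remap landing exactly halfway between pixel centers:

```python
import numpy as np

def bilinear_shift(img, dx, dy):
    """Resample img shifted by (dx, dy) pixels with bilinear interpolation --
    the same kind of sub-pixel remapping a distortion correction performs."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    sx, sy = xs - dx, ys - dy                 # source coordinates per output pixel
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    fx, fy = sx - x0, sy - y0                 # sub-pixel fractions
    x0, y0 = np.clip(x0, 0, w - 2), np.clip(y0, 0, h - 2)  # crude edge handling
    return ((1 - fx) * (1 - fy) * img[y0, x0]
            + fx * (1 - fy) * img[y0, x0 + 1]
            + (1 - fx) * fy * img[y0 + 1, x0]
            + fx * fy * img[y0 + 1, x0 + 1])

star = np.zeros((5, 5))
star[2, 2] = 1.0                              # a perfect one-pixel "star"
shifted = bilinear_shift(star, 0.5, 0.0)      # a half-pixel remap
# The star's energy is now split between two pixels: peak drops from 1.0 to 0.5.
```

The average cost over a whole frame is lower than this worst case, since most pixels don't land exactly on a half-pixel boundary, and real raw converters use fancier kernels than bilinear -- but the basic trade (a little softness for geometric freedom) is the same.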
To be clear I have no experience and would love to see a side-by-side where any uncorrected lens took better stars than a corrected one. It's certainly theoretically possible, but I just wonder if that's what actually happens with actual lenses in actual photos of actual subjects.
I was also mystified about whether software-corrected vignetting is ever noticeable in actual photos, and I can't remember how that conversation ended. If someone happens to have photos showing the vignetting correction causing a problem that is actually visible in an actual photo of an actual subject, I'd be really interested in seeing those too. (And apologies -- I know one or two members, maybe yourself koenkooi, were helping me understand that, but whether we got to a result or not is slipping my mind.)
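In case it helps frame what we'd be looking for in those photos: a vignetting correction is a per-pixel gain multiply, so whatever it does to the signal it also does to the noise. A toy sketch (all numbers hypothetical -- a corner two stops darker than center, a flat grey patch, Gaussian read noise) of the expected effect:

```python
import numpy as np

rng = np.random.default_rng(0)

gain_at_corner = 4.0        # hypothetical: corners 2 stops (4x) darker than center
signal = 0.25               # flat grey patch, in normalized sensor units
noise_sigma = 0.01          # hypothetical read-noise standard deviation

# What the sensor records at the corner: attenuated signal plus read noise.
corner_patch = signal / gain_at_corner + rng.normal(0.0, noise_sigma, 10_000)

# Software vignetting correction: multiply the corner back up to full brightness.
corrected = corner_patch * gain_at_corner

# Brightness is restored, but the noise floor is scaled up by the same factor,
# so corrected.std() is roughly 4x noise_sigma.
```

So if the correction were visible anywhere, it should show up as extra noise in the corners of high-ISO or deep-shadow shots, which is exactly the kind of side-by-side I'd love to see.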