The issue may be that on-chip phase detection is not inherently accurate either; it likely also has to be calibrated and adjusted. The tiny dimensions involved have manufacturing tolerances as well, and an error of a millionth of an inch could be significant.
Have you seen anything that indicates the on-chip phase detection is inherently accurate? Does Canon achieve perfect dimensioning with zero tolerances? I've wondered whether averaging over many sensor sites would mean that errors mostly cancel out, but it's also possible that if they all share the same dimensional error, those errors would be additive instead.
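To illustrate the distinction I'm wondering about, here is a toy simulation (all numbers are made up, nothing to do with Canon's actual tolerances): independent random errors shrink when you average many sensor sites, but a bias shared by every site survives averaging untouched.

```python
import random
import statistics

random.seed(0)

TRUE_OFFSET = 0.0   # the ideal reading (arbitrary units)
RANDOM_SD = 5.0     # hypothetical per-site random error
SYSTEMATIC = 3.0    # hypothetical error shared by every site
N_SITES = 1000

# Case 1: each site has an independent random error.
# Averaging over many sites mostly cancels it out.
independent = [TRUE_OFFSET + random.gauss(0, RANDOM_SD)
               for _ in range(N_SITES)]
print(statistics.mean(independent))   # lands close to 0

# Case 2: every site carries the same systematic error.
# Averaging cannot remove it; the bias is additive.
biased = [TRUE_OFFSET + SYSTEMATIC + random.gauss(0, RANDOM_SD)
          for _ in range(N_SITES)]
print(statistics.mean(biased))        # lands close to 3, not 0
```

So averaging only helps if the site-to-site errors are uncorrelated; a common manufacturing bias would need calibration, not statistics.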
Unless someone has found a white paper explaining the accuracy, I assume that on-chip AF has errors as well, but I hope not.
It's certainly possible for Canon to provide a utility that adjusts phase-detect AF by comparing it with contrast detect, so there is nothing new there. Of course, contrast detection can have errors as well, as some have discovered.
IIRC, the reason we need AFMA is that the optical path from the lens to the camera's image sensor does not exactly match the path from the lens, through the mirror, to the dedicated AF sensor, because of manufacturing tolerances. In other words, the camera's AF system correctly focuses the lens onto the AF sensor and then assumes that focus distance also applies to the image sensor. The reality is that sometimes fine-tuning is needed.
Since this new dual-pixel phase-detect AF uses the actual image sensor to detect correct focus, it is inherently free of the need for that kind of fine-tuning. Therefore it should be possible for the camera to self-microadjust the traditional phase-detect AF: first obtain what it thinks is correct focus the traditional way, then check that focus with the dual-pixel phase-detect AF. The difference between the two is the amount to micro-adjust. Even if neither the traditional nor the dual-pixel phase-detect AF is perfectly accurate, the correct micro-adjust value can still be determined by making multiple comparisons and applying some statistics.
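Here's a rough sketch of how that self-microadjust could work, purely hypothetical (this is not Canon's actual algorithm, and all the numbers are invented): treat the dedicated AF sensor's path as having an unknown fixed bias, take many paired readings against the on-sensor dual-pixel AF, and let the median difference estimate the micro-adjust value despite the shot-to-shot noise.

```python
import random
import statistics

random.seed(42)

# Hypothetical numbers: the mirror/AF-sensor path carries a fixed
# misalignment (the thing AFMA corrects) plus per-shot noise; the
# dual-pixel on-sensor AF is assumed unbiased but also noisy.
TRUE_AFMA = -7.0   # the unknown adjustment the camera should discover
NOISE_SD = 4.0     # per-measurement focus noise, arbitrary units

def traditional_af():
    """Focus reading via the mirror and dedicated AF sensor (biased)."""
    return TRUE_AFMA + random.gauss(0, NOISE_SD)

def dual_pixel_af():
    """Focus reading from the image sensor itself (assumed unbiased)."""
    return random.gauss(0, NOISE_SD)

# Focus repeatedly, compare the two readings each time, and take the
# median difference as the micro-adjust value; random noise cancels out.
samples = [traditional_af() - dual_pixel_af() for _ in range(50)]
estimate = statistics.median(samples)
print(estimate)   # lands near -7 despite the noise in each reading
```

The median rather than the mean keeps the occasional wild focus reading from skewing the result, which matters if the camera only takes a few dozen comparison shots.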