As for the double layer of microlenses...sure, you could read a full RGBG 2x2 pixel "quad" and have "full color resolution". Problem is, that LITERALLY halves your luminance spatial resolution in each dimension, leaving a quarter of the pixel count...
Thus you start with an 80MP sensor to get a nice 20MP image.
No, that is fundamentally incorrect. You start with a 20mp sensor, which has 40 million PHOTODIODES. The two are not the same. Pixels have photodiodes, but photodiodes are not pixels. Pixels are far more complex than photodiodes. DPAF simply splits the single photodiode for each pixel into two halves and adds the wiring to read both. That's it. It is not the same as increasing the megapixel count of the sensor.
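To make the pixel-vs-photodiode distinction concrete, here is a rough, hypothetical sketch of a dual-pixel readout: the two photodiode halves under each microlens are summed to produce the ordinary image pixel, and only compared against each other for the phase-detect signal. The function and array names are made up for illustration; this is not Canon's actual pipeline.

```python
import numpy as np

def dpaf_readout(left_halves: np.ndarray, right_halves: np.ndarray):
    """left_halves / right_halves: per-pixel signals from the two photodiode halves."""
    # The image pixel is simply the sum of both halves -- so a "20mp" DPAF
    # sensor with 40 million photodiodes still delivers a 20mp image.
    image = left_halves + right_halves

    # For AF, the left-half and right-half images are compared; the shift
    # that best aligns them indicates the focus error (phase difference).
    shifts = np.arange(-4, 5)
    scores = [np.sum(np.roll(left_halves, s, axis=1) * right_halves) for s in shifts]
    phase_shift = shifts[int(np.argmax(scores))]
    return image, phase_shift
```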
And, once again...I have to point out: there is no such thing as QPAF. The notion that Canon has QPAF is the result of someone seeing something they did not understand. Canon does not have QPAF. Their additional post-DPAF patents do not indicate they have QPAF technology yet...however, there have been improvements to DPAF.
BTW, what you're describing is called super-pixel debayering. That, too, is a common option in astrophotography image stacking...instead of basic or AHD debayering, you usually have the option to either super-pixel debayer, or "drizzle" (which, if you have enough subs...such as a couple hundred...is a means of achieving superresolution, and can increase your output image resolution two to three fold.) You don't even need another microlens layer to do super-pixel debayering...you could use a tool like Iris, or maybe even darktable/RawTherapee, to do it on any image you want.
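For what it's worth, the operation itself is trivial. Here is a minimal sketch, assuming an RGGB Bayer mosaic stored as a 2-D array (the layout and the simple averaging of the two greens are my assumptions; stacking tools expose this as the "super-pixel" option):

```python
import numpy as np

def superpixel_debayer(mosaic: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 RGGB quad into one full-color output pixel."""
    r  = mosaic[0::2, 0::2]          # top-left of each quad
    g1 = mosaic[0::2, 1::2]          # top-right
    g2 = mosaic[1::2, 0::2]          # bottom-left
    b  = mosaic[1::2, 1::2]          # bottom-right
    g  = (g1 + g2) / 2.0             # average the two green samples
    # Output is half the width and half the height (a quarter of the pixel
    # count), but every output pixel has measured, non-interpolated color.
    return np.dstack([r, g, b])
```

With that layout, an 80MP mosaic collapses to a 20MP RGB image, which is the arithmetic behind the numbers above.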
Finally, even if you do super-pixel debayering, you're not going to ever have "hard edges". Statistically speaking, the chance of a white/black line pattern you wish to photograph perfectly lining up with your pixels, regardless of how large or small they are, is so remote as to be effectively impossible. Not in any real-world situation. You might be able to build some kind of contraption and AI software to eventually achieve it, but that is well beyond the realm of practicality. If you remove the AA filter and use super-pixel debayering, you might have larger pixels with full color fidelity...but you're going to have a massive amount of aliasing. Those white and black lines would have some nasty stair-stepped edges; they would just look atrocious.
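If you want to see the stair-stepping for yourself, here is a rough illustration (not a real camera simulation, and the slope value is arbitrary) of what happens when an ideal slanted black/white edge is point-sampled on a pixel grid with no AA filter: every pixel lands fully black or fully white, so the edge turns into a staircase.

```python
import numpy as np

def sample_slanted_edge(width=16, height=16, slope=0.2):
    """Point-sample an ideal edge at x = slope * y + width/2 at pixel centers."""
    y, x = np.mgrid[0:height, 0:width]
    edge_position = slope * y + width / 2     # true, sub-pixel edge location
    # Hard 0/1 decision per pixel -- no prefiltering, so the edge aliases.
    return (x + 0.5 > edge_position).astype(float)

print(sample_slanted_edge())  # the 0->1 transition jumps column to column: a staircase
```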
Wow, it looks like superpixel debayering (http://pixinsight.com/doc/tools/Debayer/Debayer.html) is exactly what I'm after. Make a 128MP sensor and use superpixel debayering and you'll have a nice compact, super accurate 32MP image.
Again, really, I'm fine with just shooting on a 128MP sensor and dealing with 100MB+ RAW files; the trick is to get a similar result in a format that's going to be acceptable to the majority of photographers who refuse to deal with large file sizes.
As long as your final image is around 32MP I don't think people are going to notice the stair stepping, unless you're standing right next to something like a 40" high quality print.
Well, someday we may have 128mp sensors...but that is REALLY a LONG way off. DPAF technology, or any derivation thereof, isn't going to make that happen any sooner.