14-bit RAW on Canon 5D Mark III vs. factory default - Night Image Quality & Dynamic Range
The A/B test to do is the RAW version vs. the HDMI out using Cinestyle, recorded to the Ninja 2 in 220 Mbps ProRes HQ and graded in post with the proper LUT. Anyone who cares about getting the most IQ out of the camera will be using that setup (or something very similar) rather than recording internally... of course the RAW is going to kill the miserable internal codec.
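If you go that route, the grading step is easy to script. A minimal sketch in Python, assuming ffmpeg is installed and you have a Cinestyle-to-Rec.709 .cube LUT on hand (the file names here are placeholders):

import subprocess

# Apply a 3D LUT to the log footage captured off the HDMI out.
# ffmpeg's lut3d filter accepts standard .cube files.
subprocess.run([
    "ffmpeg",
    "-i", "ninja2_cinestyle.mov",            # ProRes HQ capture from the Ninja 2
    "-vf", "lut3d=cinestyle_to_rec709.cube", # placeholder LUT file name
    "-c:v", "prores_ks",                     # keep the output in ProRes
    "-profile:v", "3",                       # profile 3 = ProRes 422 HQ
    "graded.mov",
], check=True)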
And BTW, all my testing has repeatedly shown ALL-I and IPB produce 100% identical IQ from the 5D3's internal recording. I haven't seen anything credible to refute that... I think ALL-I is just Mbps marketing to counter the GH2 hack.
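For context on the Mbps marketing point, here's the back-of-the-envelope card math, assuming the commonly quoted figures of roughly 90 Mb/s for ALL-I and 30 Mb/s for IPB at 1080p (approximate numbers, used purely for illustration):

# Rough recording-time math for a 32 GB card at the two bitrates.
BITS_PER_GB = 8 * 1000**3  # decimal gigabyte, as card makers count

for name, mbps in (("ALL-I", 90), ("IPB", 30)):
    seconds = 32 * BITS_PER_GB / (mbps * 1_000_000)
    print(f"{name} (~{mbps} Mb/s): about {seconds / 60:.0f} min per 32 GB card")

# ALL-I: about 47 min; IPB: about 142 min. If the decoded frames really are
# identical, that 3x bitrate buys you nothing on typical scenes.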
That is a lot of misinfo.
1. The internal codec isn't what does most of the damage. Point at a static scene, record with the Ninja 2 in ProRes HQ and record internally (on firmware 1.2.1), and there is NOT much difference you can see at all. A little, but it's all very subtle to be honest. It's vastly smaller than the difference between what this ML RAW footage looks like and normally recorded internal video.
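If anyone wants to put a number on "subtle" rather than eyeballing it, here's a rough sketch using ffmpeg's ssim filter to score one recording against the other. It assumes the two clips are frame-aligned and share resolution and frame rate, and the file names are placeholders:

import subprocess

# Score the internal recording against the external ProRes HQ capture.
result = subprocess.run(
    [
        "ffmpeg",
        "-i", "internal.mov",        # internally recorded clip
        "-i", "ninja2_proreshq.mov", # Ninja 2 ProRes HQ clip
        "-lavfi", "ssim",            # frame-by-frame structural similarity
        "-f", "null", "-",           # no output file; we only want the stats
    ],
    capture_output=True, text=True, check=True,
)

# ffmpeg prints the SSIM summary on stderr when the run finishes.
for line in result.stderr.splitlines():
    if "SSIM" in line:
        print(line)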
2. For static scenes, or ones with just bits moving around in the frame, ALL-I and IPB are pretty much the same, and ALL-I is just a horrible waste of space, perhaps even slightly worse if anything. If you pan around, or the entire scene is changing frame to frame, then IPB totally falls apart and ALL-I holds up much better (as does, say, ProRes on the Ninja 2). That makes sense: IPB leans on inter-frame prediction, so when every frame changes there is little for it to predict from, while intra-only frames cost the same bits either way.
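Easy to check what a given file actually is, by the way. A small sketch using ffprobe to count coded frame types (the clip name is a placeholder; it decodes the whole file, so it's slow on long clips). An ALL-I file should come back 100% I-frames, while an IPB file will be mostly P- and B-frames:

import subprocess
from collections import Counter

# Count I/P/B picture types in the first video stream of a clip.
out = subprocess.run(
    [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "frame=pict_type",
        "-of", "csv=p=0",
        "clip.mov",  # placeholder file name
    ],
    capture_output=True, text=True, check=True,
).stdout

counts = Counter(tok for tok in out.split() if tok)
print(counts)  # e.g. Counter({'I': 1440}) for ALL-I; mostly 'P'/'B' for IPB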