I don't think people should make too much of the "ISO 12800 maximum without expansion" on the D4 versus the "ISO 51200 without expansion" on the 1D X; I think this is just a difference in the two companies' nomenclature. The term 'native ISO' is misleadingly used by many people; each sensor has only one native ISO (its base ISO): in Canon's case ISO 100 and in Nikon's ISO 200. Every other ISO is achieved either by on-chip amplification or by under/over-exposure with post-capture software signal amplification.

I had always thought the switchover point was where the ISO expansion (boost) settings kicked in, but recent forum posts by others have led me to believe I was wrong about that. Where this transition occurs seems to be a somewhat opaque subject that the manufacturers don't reveal. Thus, one cannot tell from the quoted ISO range specifications what to expect in terms of signal-to-noise ratio for each sensor. It seems each company has its own standard for what it considers acceptable signal-to-noise across its cameras' "normal" ISO range. The only way to determine the sensors' signal-to-noise performance will be to test the actual sensors post-release (i.e. we'll have to wait for DxOMark's analysis). Of course, some are suggesting that the camera manufacturers are starting to 'cook' their RAW files by applying a certain amount of noise reduction during the initial on-chip stages of image readout.
Something here also bothers me: if the native ISO of the D4 is one stop above the 1D X's, then naturally the stop-per-stop comparisons are a lot closer than the numbers suggest.
It's difficult to compare ISO 100 to ISO 200, but it's a lot easier to compare at the top end, where the noise is greatest.
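To make the stop arithmetic concrete, here's a quick sketch (assuming, per the above, a base ISO of 100 for the 1D X and 200 for the D4; the `gain_stops` helper is just my own illustration, not anything from either manufacturer):

```python
from math import log2

def gain_stops(iso, base_iso):
    """Stops of amplification needed to reach a given ISO from the sensor's base ISO."""
    return log2(iso / base_iso)

for iso in (12800, 51200):
    print(f"ISO {iso}: 1D X applies {gain_stops(iso, 100):.0f} stops of gain, "
          f"D4 applies {gain_stops(iso, 200):.0f}")
# At ISO 12800: 1D X applies log2(12800/100) = 7 stops, D4 only log2(12800/200) = 6.
```

So at any matched ISO setting the D4 would be applying one stop less amplification than the 1D X, which is why the "12800 vs 51200" headline gap overstates the difference.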
TBH, I don't care; I use the 1Ds3 for everything, and the 1DX is 4 stops higher in ISO. We're getting spoilt with these cameras.