Strange, because when people calculate MP/sec it is generally fps * MP per frame, and bit depth never comes into it.
You could calculate CM/w (cartons of milk per week), but if you want to compare milk consumption between households, you need to know the volume of the carton. Assuming that analogy is not too complex for you, are you suggesting that there's no difference in file size or data content between a 12-bit and a 14-bit RAW file?
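To put numbers on that, here's a minimal sketch of the uncompressed data content at each bit depth (Python; the 24 MP sensor size is a hypothetical figure for illustration, not one from this thread):

```python
def raw_data_mb(megapixels: float, bits_per_pixel: int) -> float:
    """Uncompressed raw data content in MB (1 MB = 10**6 bytes)."""
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

mp = 24  # hypothetical sensor resolution
for bits in (12, 14):
    print(f"{bits}-bit: {raw_data_mb(mp, bits):.1f} MB")
# 12-bit: 36.0 MB
# 14-bit: 42.0 MB
```

Same pixel count, but 14/12 means roughly 17% more data per pixel before any compression is applied.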
But given that Canon's 1DX doesn't have more than 12 stops of DR, and that stops of DR correspond to bits, it would even seem pointless for Canon to have 14 bits of raw, don't you agree?
Given that file size is generally proportional to megapixels, I don't see how that helps.
The file formats used remain the same. At a guess, it is likely that Samsung does something similar to what happens with conversion to MP3 and sacrifices two bits to make the compression faster. If it were more complex than that, they'd need two separate ADC paths, which is also more expensive. If they were chopping bits off the bottom, then it would be not much different in effect from Canon's 5D3 files, where the bottom 2 or 3 bits are useless due to noise.
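For what "chopping bits off the bottom" would look like, here's a sketch of truncating a 14-bit sample to 12 bits by dropping the two least significant bits. This is pure speculation about one possible mechanism, not Samsung's documented method:

```python
def truncate_14_to_12(sample14: int) -> int:
    """Drop the two least significant bits of a 14-bit ADC sample.

    Hypothetical illustration only: the bottom bits are the ones most
    dominated by noise, so discarding them loses little real signal.
    """
    assert 0 <= sample14 < 2**14
    return sample14 >> 2

print(truncate_14_to_12(16383))  # max 14-bit value -> 4095 (max 12-bit)
print(truncate_14_to_12(3))      # -> 0: noise-level detail is discarded
```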
That said, JPEG files are all 8 bits per channel, so file sizes on the Samsung would simply be proportionally bigger.
With respect to raw, see above.
I'm just using published web specs. Feel free to come up with your own calculations based on whatever other numbers you imagine to be relevant.