So, to summarise, what is happening is:
| Camera setting | Image EXIF reports | Actual sensor gain | Actual metering gain | Net effect |
|---|---|---|---|---|
| ISO 50 | ISO 50 | ISO 100 | ISO 50 | Brighter shadows; blown highlights |
| ISO 100 | ISO 100 | ISO 100 | ISO 100 | Normality |
| ISO 200 HTP | ISO 200 | ISO 100 | ISO 200 | Highlights preserved; shadows burned |
i.e. - ignoring any small shifts due to the possibility that the native sensor base ISO is not exactly ISO 100 - the only differences between the three modes are the metering and the subsequent processing to correct for the metering offset.
If you shoot JPEG, the camera automatically compensates for the difference between the sensor gain and the metering gain. If you shoot RAW, the file contains a flag that allows your RAW processor to do this. You can achieve a similar effect by simply using the exposure compensation setting and correcting later - though the in-camera options will give better quality if you shoot JPEG.
The only real difference is in the exposure metering and post-processing. The metering gain changes really do mean that the number of photons hitting the sensor is affected by the mode setting - relative to the actual sensor ISO gain setting [edited for clarity].
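The stop arithmetic above can be sketched in a few lines of Python. This is only an illustration of the relationships claimed in the table, assuming a native sensor base ISO of exactly 100; the function names are mine, not anything from a camera SDK:

```python
import math

SENSOR_ISO = 100  # assumed native base ISO (an assumption, per the text)

# Metering ISO per mode, as described in the table above
METERING_ISO = {"ISO 50": 50, "ISO 100": 100, "ISO 200 HTP": 200}

def stops_vs_sensor(metering_iso, sensor_iso=SENSOR_ISO):
    """Extra light reaching the sensor, in EV, relative to its real gain.

    Metering at a lower ISO than the sensor's true gain means a longer
    exposure or wider aperture, so more photons land on the sensor;
    metering at a higher ISO means fewer.
    """
    return math.log2(sensor_iso / metering_iso)

for mode, iso in METERING_ISO.items():
    shift = stops_vs_sensor(iso)
    # The JPEG engine (or the RAW-file flag) tells the processor to
    # apply the opposite shift digitally to restore the midtones.
    correction = math.log2(iso / SENSOR_ISO)
    print(f"{mode}: raw data {shift:+.0f} EV; digital correction {correction:+.0f} EV")
```

So ISO 50 overexposes the raw data by one stop (hence the blown highlights) and is pulled back down, while HTP underexposes it by one stop (hence the noisier shadows) and is pushed back up.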
Can anyone point out where this is incorrect?