Not looking for example images; I'm looking for the more technical reasoning and explicit knowledge of how the metering systems work, not images you've taken.
The images help with the explanation.
I'm attaching three of them.
The first is a linear UNIWB development of a half-second exposure of a ColorChecker Passport. That is, this is what the sensor actually recorded, dumped to a TIFF and scaled down for the Web. Because of the 1.0 gamma (you're used to seeing 2.2 gamma), it's very contrasty. And, of course, it's rather green, since there are as many green pixels on the sensor as red and blue combined. But, aside from those two caveats, you can see that it's properly exposed. You can also guess that the sky is quite blown.
The second is the same file but white balanced and with a 2.2 gamma. But no digital gain or tone curve is applied. You can see that the colors are a bit muted, which is because there's no ICC profile associated with the image and the camera has a wider gamut than your monitor. But the exposure is clearly correct.
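To make the linear-to-gamma step concrete, here's a minimal sketch in Python using a plain power-law gamma of 2.2 (the actual sRGB transfer function has a small linear toe near black, which is why the familiar figure is 118 rather than the 117 a pure power law gives):

```python
def encode_gamma(linear, gamma=2.2):
    """Map a linear-light value in [0, 1] to a gamma-encoded value in [0, 1]."""
    return linear ** (1.0 / gamma)

# An 18% gray patch as the sensor records it (linear):
linear_gray = 0.18
print(round(linear_gray * 255, 1))            # linear 8-bit value: 45.9
print(round(encode_gamma(linear_gray) * 255)) # gamma-2.2 8-bit value: 117
```

The same middle gray that sits at 45.9 in the linear file lands up near the middle of the 8-bit range once gamma-encoded, which is why the gamma-2.2 rendering looks "normal" and the linear one looks so dark and contrasty.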
Last is the 1/8 second exposure similarly developed. That's what the meter read, if I recall right. It's clearly underexposed, but the sky's no longer blown.
In other words, the half-second exposure caused the sensor to record the scene such that an 18% gray object was rendered as 18% gray in the digital file (that's 45.9 on a 0-255 scale, again remembering the 1.0 gamma; with 2.2 gamma, that works out to the much more familiar 118). The 1/8 second exposure instead rendered that same object as 4.5% gray, or 11.5 on a 0-255 scale with 1.0 gamma. That is, it's two stops underexposed.
And now you should also have an idea of why shadow noise has always been such a problem for digital cameras....