Ironically, one of the big features of digital cameras - changing ISO on the fly - is actually still an analog process.
The DxOMark article on this is great reading; you should look it up.
The essence of it is that the silicon sensor is indeed much more sensitive than film, but it has a "base sensitivity." There is a range (which seems to be getting even bigger these days) where the sensor is most sensitive. ISO increases don't change the physical sensitivity of the sensor, which is fixed; instead they increase the gain applied while reading the signal off the chip, before the analog-to-digital converter (ADC). The ADC may be integrated into the image processor (e.g. Canon's DIGIC), but it doesn't have to be, and it is a distinct step from the usual "image processing" stages.
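To make the "gain before the ADC" idea concrete, here is a toy single-pixel model - all the numbers (mean photon count, ADC noise level, gain values) are made up for illustration, not measured from any real sensor. Because the downstream ADC/read noise is added *after* amplification, the same amount of it matters less, relative to the signal, at higher analog gain:

```python
import random
import statistics

def capture(photons_mean, analog_gain, adc_noise_e=4.0, rng=None):
    """Toy readout of one pixel (hypothetical numbers):
    photon shot noise -> analog gain (the 'ISO' step) -> ADC noise,
    which is added after amplification."""
    rng = rng or random.Random(0)
    signal = rng.gauss(photons_mean, photons_mean ** 0.5)  # shot noise
    return signal * analog_gain + rng.gauss(0.0, adc_noise_e)

def input_referred_snr(photons_mean, analog_gain, trials=20000):
    """SNR referred back to the sensor, so different gains are comparable."""
    rng = random.Random(42)
    samples = [capture(photons_mean, analog_gain, rng=rng) / analog_gain
               for _ in range(trials)]
    return statistics.mean(samples) / statistics.stdev(samples)

# Amplifying before the ADC shrinks the ADC noise relative to the signal,
# so the input-referred SNR improves with analog gain (for a dim signal):
low = input_referred_snr(25, analog_gain=1)
high = input_referred_snr(25, analog_gain=8)
```

In this sketch the SNR at gain 8 comes out noticeably better than at gain 1 for the same dim scene, which is the whole reason the amplification happens in the analog domain rather than after digitization.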
Daniel Browning on Photography-On-The-Net has come up with some amusing terms for the various kinds of on-camera ISO settings. There are the "smart" ISO settings: the one that most closely matches the default sensitivity of the sensor (ISO 100 or 200 is usually the "cleanest" looking ISO) and the clean doublings of it. I had thought analog gain could be applied in essentially any amount, large or small, but apparently it doesn't really work that way (although it is also wrong to think the "real" sensitivity setting always exactly halves or doubles - the actual curve is slightly different, as DxOMark notes).
Then there are "faked" ISO settings, which are the non-even-step increments - e.g. an "ISO 640" step (between 400 and 800) could be ISO 400 pushed (not likely) or ISO 800 "held back" (more likely), which may keep good highlight detail but add noise to the shadows. Nevertheless, some shooters swear by the "faked" ISO settings for shooting video, at least on some cameras, so it helps to experiment and find out what works best for you.
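As a back-of-the-envelope check on the "held back" / "pushed" arithmetic (purely illustrative - the marketed third-stop ISO numbers are rounded to friendly values, so the math lands near but not exactly on 640):

```python
# Third-stop ISO steps are geometric, spaced by a factor of 2**(1/3).
pull_from_800 = 800 * 2 ** (-1 / 3)  # ISO 800 "held back" one third stop
push_from_400 = 400 * 2 ** (2 / 3)   # ISO 400 pushed two thirds of a stop

print(round(pull_from_800), round(push_from_400))  # 635 635
```

Both routes land at ~635, which the camera rounds and reports as "ISO 640" - the label alone can't tell you which way your camera actually gets there, hence the experimenting.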
For some clues to get you started, I find this site (Sensorgen.info) to be a good starting point. It takes the DxOMark data and mines it into a different kind of presentation. According to this data, my humble T1i (i.e. the 50D sensor) does pretty well up to ISO 800. In practice, I like to limit it to at most ISO 400 - ideally ISO 100 or 200 (there's a big jump in noise from 200 to 400), and the read noise plot bears this out.
One thing you can learn from the Sensorgen.info charts is that beyond a certain point, raising the in-camera ISO may actually be worse than "artificially" brightening the image later in post. You can also see that dynamic range and saturation suffer at higher ISO settings.
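Here's a tiny sketch of one reason high in-camera ISO costs dynamic range: analog gain is applied before the signal hits the ADC's ceiling, so highlights clip that a later digital brightening would have kept. The 12-bit full-scale value and the pixel numbers are just assumptions for illustration:

```python
FULL_SCALE = 4095  # hypothetical 12-bit ADC ceiling

def in_camera(signal, gain):
    """Analog gain first, then the ADC clips whatever exceeds full scale."""
    return min(signal * gain, FULL_SCALE)

def brighten_later(signal, gain):
    """Record at base gain (clipping only at the ADC), multiply in post."""
    return min(signal, FULL_SCALE) * gain

highlight = 1000  # a bright area, comfortably within range at base gain
print(in_camera(highlight, 8))       # 4095 - blown out, detail gone
print(brighten_later(highlight, 8))  # 8000 - recoverable in post
```

The shadows are a different story (that's where the analog gain helps, per the earlier discussion), which is why the real answer is a tradeoff rather than "always brighten later."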
To tie it all together with a neat list of some observations:
- The sensor has a set sensitivity
- ISO settings introduce tradeoffs to attempt to capture more data at higher ISOs
- The ISO scale is defined scientifically, but real camera settings are probably based at least partly on appearance and marketing - making the very edge of what's reasonable look like more of a gain than it really is
- When you map the actual sensitivity of the camera onto the real ISO scale, the plot will probably be skewed one way or another. It is in fact somewhat arbitrary, because manufacturers can define an on-camera ISO setting to be equivalent to something way off the mark - and apparently that has sometimes happened. It's probably a bigger issue with older cameras, although I wonder if newer sensor tech can "cheat" the other way (i.e. conservatively underreporting a setting's real sensitivity)
- Your camera might be "missing" low ISO modes because it makes no sense to throw away data hitting the sensor - just darken the image afterwards (since ISO is essentially just brightening the image).
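The last bullet boils down to simple arithmetic: applied to already-captured data, a lower "ISO" is just a smaller multiplier, which you can apply in post without discarding any of the light that hit the sensor (the function name and base ISO here are hypothetical):

```python
# 'ISO' after capture is just a per-pixel scale factor relative to base ISO;
# halving it darkens the image one stop without throwing away sensor data.
def rescale_iso(raw_values, iso, base_iso=100):
    scale = iso / base_iso
    return [v * scale for v in raw_values]

print(rescale_iso([100, 200, 400], 50))  # [50.0, 100.0, 200.0]
```

Which is exactly why a dedicated "ISO 50" mode adds little: the same darkening is a one-slider fix in any raw converter.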