Pedro's black and white shots bring another factor to light (sorry for the pun). I've been thinking about overall sensor efficiency so far, but we also have to factor in per-pixel efficiency. As long as we involve color in our photographs, there will be some rather significant limits on how much light we can really capture. Consider astrophotography CCDs, many of which cost a couple thousand dollars for what is basically a sensor and some readout electronics...far from the advanced machinery we get in a full DSLR. Such sensors are often already pushing 80% Q.E., and often require cooling. Part of the reason they achieve such high resolutions and sensitivities is that they are monochrome devices.
In a Bayer sensor, a color filter sits over each pixel, which sharply limits how much light each pixel can receive, and intrinsically limits the total light the sensor can record. We might achieve 50-60% Q.E. today on a per-color basis, but even 100% Q.E. for a red pixel...which would mean the pixel converts every single photon that strikes it into an electron...still captures only about 33% of the total light incident on that pixel, since the red filter blocks the rest of the spectrum. To really push both per-pixel and overall sensor Q.E. to their maximums, we would have to drop the color filter, and preferably drop the low-pass filter and as much other filtration above the sensor as possible.

Combine that with multi-layered microlenses, low-noise electronics, efficient readout electronics that introduce little noise of their own, unity gain to eliminate quantization errors, and a back-illuminated design, all in a monochrome sensor...well, now we are really talking. Not only do we maximize the surface area of the sensor for optimal light sensitivity, we also maximize resolution, or alternatively can use pixels with four times the surface area of a color sensor's. With pixels twice as large in pitch, we gather four times as much light per pixel at effectively the same resolution we had with a color sensor, so extremely high ISO at low signal levels becomes much more viable.

One could also design a camera that either alternates color filters over the sensor to record and blend a full-color image at full resolution, or uses some kind of prism to split the light to three sensors simultaneously, capturing full-color, full-resolution images in the same exact exposure.
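The back-of-the-envelope math above can be sketched in a few lines. The specific numbers here (55% per-color Q.E., an idealized 1/3 filter transmission, 80% monochrome Q.E.) are illustrative assumptions, not measured values for any particular sensor:

```python
# Rough comparison of effective light capture: Bayer vs. monochrome.
# All figures are illustrative assumptions, not measured sensor data.

per_color_qe = 0.55          # assumed ~50-60% Q.E. per color channel
filter_transmission = 1 / 3  # idealized: a color filter passes ~1/3 of the light

# Even a perfect (100% Q.E.) red pixel only sees ~33% of the incident
# light; a realistic per-color Q.E. drops the effective figure further.
effective_bayer_qe = per_color_qe * filter_transmission
print(f"Effective Bayer Q.E.: {effective_bayer_qe:.0%}")   # ~18% of total light

mono_qe = 0.80  # assumed filterless monochrome astro CCD
print(f"Monochrome Q.E.: {mono_qe:.0%}")
print(f"Light advantage: {mono_qe / effective_bayer_qe:.1f}x")

# Doubling the pixel pitch quadruples the pixel area, and hence the
# photon count per pixel at the same exposure.
pitch_ratio = 2
area_ratio = pitch_ratio ** 2
print(f"Pixels {pitch_ratio}x larger in pitch gather {area_ratio}x the light each")
```

Under these assumptions the monochrome sensor records roughly four times the light per unit area, before even considering larger pixels.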