I like the idea of the equipment getting smarter. In this case, can it avoid burning out highlights or unduly darkening shadows due to the limited DR of the sensor? While advances in lithography are the current roadblock, there needs to be a rethink of the way sensor data is recorded and handled.
Could they develop local dimming on the sensor (or local illumination)?
Let's say you are shooting a seascape with the horizon in the middle. If you expose for the water, the sky is overexposed; if you expose for the sky, the sea is underexposed. (Yes, there are grads, but I don't want to carry them everywhere.)
What if the sensor could locally dim the sky to bring it down to levels that do not clip (burn out)? This would definitely need a faster data processor, but similar things have been done in audio with dynamic range compression/expansion. The concept is the same: get the signal past the limitations of the equipment, and later, if need be, restore it to its original glory when the right displays (with high DR) are available.
So there are 2 concepts to think about:
1) Reduction of DR to fit the sensor's capabilities (sort of what HDR does, but without taking multiple exposures)
2) Restoration of DR under the right conditions (so that the exposure is not permanently captured in reduced DR but can be rendered however the user wishes).
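To make the compress-then-restore idea concrete, here is a minimal sketch of an invertible companding curve, borrowed from the audio world (a mu-law-style curve, much like what dbx or telephony codecs do). The function names and the mu value are my own illustration, not anything a camera maker actually uses:

```python
import numpy as np

MU = 255.0  # companding strength (hypothetical choice)

def compress_dr(linear, mu=MU):
    """Concept 1: squeeze linear scene values in [0, 1] through a
    mu-law-style log curve so bright areas fit the sensor's range."""
    return np.log1p(mu * linear) / np.log1p(mu)

def expand_dr(coded, mu=MU):
    """Concept 2: the exact inverse, recovering the original linear
    values for rendering on a high-DR display."""
    return np.expm1(coded * np.log1p(mu)) / mu

scene = np.array([0.001, 0.1, 0.5, 1.0])  # linear luminances, 1.0 = clip point
coded = compress_dr(scene)
restored = expand_dr(coded)
assert np.allclose(restored, scene)  # the round trip is lossless
```

The key property is that the curve is known and invertible, so nothing is permanently thrown away, only repacked.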
In some ways this is similar to the inability of LCD displays to show absolute blacks, so manufacturers came up with local dimming schemes. In the good old days of audio cassettes, limited dynamic range was also an issue, and along came dbx...
Once the information is captured in this compressed format, PP can be used to render it any which way the user desires. Current technology reads out each photosite anyway; it would just need a pre-layer to determine exposure levels before Bayer demosaicing etc. It would take a lot of redesigning, including perhaps a new RAW-like format to handle this variable DR, and decoders to process it.
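As a rough sketch of what such a pre-layer plus decoder pair might look like, the toy code below computes a coarse per-block gain map (dimming just the blocks that would clip, i.e. the sky), applies it to the raw mosaic, and stores the map so the decoder can undo it exactly. All names, the block size, and the clip target are hypothetical; a real variable-DR RAW format would obviously be far more involved:

```python
import numpy as np

def local_dim(raw, block=64, target=0.9):
    """Hypothetical pre-layer: per-block gains that pull bright regions
    below the clipping point before Bayer demosaicing. Returns the
    dimmed mosaic plus the gain map a decoder needs to restore it."""
    h, w = raw.shape
    gains = np.ones((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            peak = raw[i*block:(i+1)*block, j*block:(j+1)*block].max()
            if peak > target:
                gains[i, j] = target / peak  # dim just enough to avoid clipping
    gain_full = np.kron(gains, np.ones((block, block)))  # upsample to pixel grid
    return raw * gain_full, gains  # the gain map would ride along in the RAW file

def restore(dimmed, gains, block=64):
    """Hypothetical decoder: invert the stored gain map to get full-DR data."""
    gain_full = np.kron(gains, np.ones((block, block)))
    return dimmed / gain_full

# Toy seascape: bright sky in the top half, darker sea below
raw = np.full((128, 128), 0.5)
raw[:64, :] = 1.5                      # sky exceeds the sensor's clip point
dimmed, gains = local_dim(raw)
assert dimmed.max() <= 0.9             # nothing clips in the stored file
assert np.allclose(restore(dimmed, gains), raw)  # full DR comes back
```

The sea blocks get gain 1.0 and pass through untouched; only the sky is scaled, which is the "local" part of the idea.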
This could be revolutionary if implemented correctly.