The 30D is a 12-bit camera, so the later 14-bit processed images are going to produce better dynamic range. My pet theory is that too many people concentrate on the sensor and not nearly enough on the image processor, which is the real keystone of a digital camera. Manufacturers didn't use to name their sensors (well, OK, some did), but all of them name their image processors. Good image processing affects so many camera functions, and yet hardly anyone ever seems interested. Sad really.
BTW, the Cokin, Lee etc. graduated filter systems were made in the days of film exactly for this problem, and they're still usable with digital, so you should consider using them. Then there's HDR, which you could never do with film; this way you can manage ridiculous dynamic ranges.
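For anyone curious what HDR software is doing under the hood, here is a rough sketch in plain Python (the pixel values and the mid-tone weighting are my own illustration, not any particular program's method): each bracketed frame is divided by its exposure time to estimate scene radiance, then the frames are blended with weights that trust mid-tones and ignore clipped or near-black pixels.

```python
def merge_hdr(exposures):
    """Merge bracketed frames into one radiance estimate per pixel.

    exposures: list of (pixels, exposure_time) pairs, where pixels are
    linear sensor values normalised to 0..1 (1.0 = clipped).
    """
    n = len(exposures[0][0])
    radiance = []
    for i in range(n):
        num = den = 0.0
        for pixels, t in exposures:
            v = pixels[i]
            # Hat-shaped weight: trust mid-tones, distrust clipped and near-black values.
            w = max(min(v, 1.0 - v), 1e-6)
            num += w * (v / t)   # this frame's estimate of scene radiance
            den += w
        radiance.append(num / den)
    return radiance

# A pixel of true radiance 2.0: clipped in the 1 s frame,
# correctly exposed (0.5) in the 1/4 s frame. The merge recovers it
# from the shorter exposure.
print(merge_hdr([([1.0], 1.0), ([0.5], 0.25)]))
```

The clipped frame contributes almost nothing because its weight collapses, which is exactly why bracketing recovers highlights that a single exposure loses.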
Our eyes can see a huge dynamic range, and they achieve this by scanning a scene and adjusting the pupils; persistence of vision means we see the whole scene and don't even realise what our eyes have done. A camera cannot do this, but one interesting solution is adaptive ISO, where the sensor is read using different ISO settings.
If the sensor is saturated with highlights or drowned in noise, the processor cannot do anything about it. On paper, a 12-bit processor can give us at most 12 stops of dynamic range, and a 14-bit processor at most 14 stops. None of us can get that kind of dynamic range out of our DSLRs, which is good evidence that the sensor's dynamic range is the limitation, not the processor.
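The arithmetic behind those "on paper" figures: each extra bit doubles the number of recordable levels, and each doubling is one more stop between the darkest and brightest values the ADC can encode.

```python
import math

for bits in (12, 14):
    levels = 2 ** bits              # distinct output codes the ADC can produce
    stops = math.log2(levels)       # ratio of brightest to darkest code, in stops
    print(f"{bits}-bit ADC: {levels} levels, at most {stops:.0f} stops")
# -> 12-bit ADC: 4096 levels, at most 12 stops
# -> 14-bit ADC: 16384 levels, at most 14 stops
```

That is a ceiling on encoding, not a promise: real sensors deliver fewer usable stops because the bottom codes are buried in noise.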
A graduated filter is an excellent way to make the picture look better under certain conditions: it compresses the dynamic range of the scene down to a point the sensor can handle more easily.
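For intuition, the effect of a soft-edge graduated ND can be simulated in software (a real filter does this optically, before the sensor clips, which is why it works where post-processing fails). The strength and transition-zone positions below are made-up illustration values.

```python
def grad_nd(value, y, height, stops=2.0, transition=(0.3, 0.6)):
    """Attenuation a soft-edge graduated ND applies at row y (0 = top).

    Full strength above the transition zone, clear below it,
    linear blend in between. stops=2.0 means the top is 2 stops (4x) darker.
    """
    top, bottom = (int(height * t) for t in transition)
    if y <= top:
        frac = 1.0          # fully inside the dark half
    elif y >= bottom:
        frac = 0.0          # fully in the clear half
    else:
        frac = (bottom - y) / (bottom - top)   # soft edge
    return value * 2.0 ** (-stops * frac)

# Sky row knocked down 2 stops (0.8 -> 0.2); foreground row untouched.
print(grad_nd(0.8, 0, 100), grad_nd(0.8, 90, 100))
```

Pulling the sky down a couple of stops before capture is exactly the "compression" mentioned above: the bright half of the scene now fits inside the sensor's range alongside the shadows.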
As for human eyes, their dynamic range is much higher than the sensor's. Coupled with our brain's "cheating power", we have a dynamic range that no camera plus software correction can match. (We still see a white shirt as white under an ordinary light bulb, while the digital camera sees it as yellowish even with automatic white balance, and really yellow if the white balance is set to daylight.)
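The camera's automatic white balance is trying to imitate that brain trick. A minimal sketch of the standard diagonal (von Kries style) correction, using hypothetical patch values for a white shirt under a tungsten bulb: measure a patch known to be neutral, then scale each channel so the patch comes out gray.

```python
def white_balance_gains(neutral_rgb):
    """Per-channel gains that map a known-neutral patch to gray.

    Simple von Kries / diagonal correction, normalised to green.
    neutral_rgb: average linear RGB of the known-white patch.
    """
    r, g, b = neutral_rgb
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    return tuple(c * k for c, k in zip(pixel, gains))

# Hypothetical: a white shirt under tungsten records warm (R > G > B);
# the gains pull it back to equal channels, i.e. neutral gray.
warm_white = (0.9, 0.7, 0.4)
gains = white_balance_gains(warm_white)
print(apply_gains(warm_white, gains))
```

The hard part for the camera, of course, is knowing which patch is supposed to be neutral; our brain does that part effortlessly.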
There are posts talking about the diffraction limit. We should also look at it from a different angle. Diffraction is caused by the lens, not the sensor. With a lower-resolution sensor we just never see it; with a high-resolution sensor, diffraction effectively reduces it to a lower-resolution sensor. Example: an 18 MP sensor's diffraction limit is about f/6.3, while a 10 MP sensor's is about f/10.6 (??). If you set the lens at f/11 with an 18 MP sensor, you reduce the actual resolution of the 18 MP sensor to roughly 10 MP, assuming a good lens is used.
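You can back numbers like these out of the Airy disk formula. This is only a sketch: I'm assuming a Canon APS-C sensor of 22.3 x 14.9 mm, green light at 550 nm, and calling the sensor "diffraction limited" once the Airy disk diameter spans two pixel pitches; a different threshold shifts the f-numbers, which is probably why published figures (like the f/10.6 above) vary.

```python
import math

SENSOR_W_MM, SENSOR_H_MM = 22.3, 14.9   # assumed Canon APS-C dimensions
GREEN_NM = 550.0                         # assumed mid-spectrum wavelength

def pixel_pitch_um(megapixels):
    # Pitch of square pixels tiling the sensor area.
    pixels_wide = math.sqrt(megapixels * 1e6 * SENSOR_W_MM / SENSOR_H_MM)
    return SENSOR_W_MM * 1000.0 / pixels_wide

def diffraction_limited_f(megapixels):
    # Airy disk diameter is about 2.44 * wavelength * N. Call the sensor
    # "limited" once that diameter spans two pixel pitches (one choice of
    # Nyquist-style criterion; others give different f-numbers).
    pitch = pixel_pitch_um(megapixels)
    return 2.0 * pitch / (2.44 * GREEN_NM * 1e-3)

for mp in (18, 10):
    print(f"{mp} MP: pitch {pixel_pitch_um(mp):.1f} um, "
          f"diffraction-limited near f/{diffraction_limited_f(mp):.1f}")
```

With this criterion the 18 MP case lands near f/6.4, close to the f/6.3 quoted above, and the 10 MP case comes out a couple of stops slower, which is the whole point: the coarser pixels tolerate smaller apertures before diffraction bites.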