unfocused said: Excuse my ignorance, but are dynamic range and noise interchangeable? Can a sensor have better noise control without having more dynamic range? I'm unconcerned about dynamic range, but I would like cleaner files.
Not exactly, but close enough if you keep sensor size fixed. Dynamic range in a camera sensor is capped from the top by photosite well capacity (a hard cap: the sensor simply saturates, and no extra information beyond that can be recovered because there isn't any) and from the bottom by the noise floor (a soft cap: the weaker the signal you're trying to recover, the more you amplify noise along with it, resulting in a lower and lower signal-to-noise ratio).
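To put rough numbers on those two caps, here's a quick sketch. The well capacity and read noise figures are illustrative assumptions, not measurements of any particular sensor:

```python
import math

# Illustrative numbers, not any specific sensor:
full_well = 50_000  # electrons a photosite holds before saturating (hard cap)
read_noise = 3.0    # electron-equivalent noise floor (soft cap)

# Engineering dynamic range: ratio of the largest recordable signal
# to the noise floor, expressed in stops (powers of two).
dr_stops = math.log2(full_well / read_noise)
print(f"{dr_stops:.1f} stops")  # ≈14 stops for these assumed values
```

Note that lowering the noise floor and raising the well capacity both widen this ratio, which is why the two questions are related but not identical.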
At medium to high ISOs nowadays, the noise component of the sensor signal is dominated by photon shot noise: the statistical variation you expect simply because photons are discrete packets of energy arriving at the sensor at random. The weaker the signal to be amplified, the fewer photons there are and the more random the pattern. The only way to improve that is to gather more photons. This means either using a lower ISO and a longer exposure time (or a bigger aperture), using a larger sensor with larger photosites (or more of them, so their signals can be averaged), or making the sensor more effective at gathering light.
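You can see the square-root behavior of shot noise directly with a small simulation. This is just a sketch: the "photon_count" model below (an exposure split into many tiny time slices, each randomly delivering a photon) is a crude stand-in for real photon arrival, but it reproduces the key result that SNR grows as the square root of the photon count, so quadrupling the light doubles the SNR:

```python
import math
import random
import statistics

random.seed(7)

def photon_count(mean, slices=10_000):
    # Crude Poisson stand-in: split the exposure into many tiny time
    # slices, each delivering a photon with probability mean/slices.
    p = mean / slices
    return sum(1 for _ in range(slices) if random.random() < p)

# Measure SNR (mean / standard deviation) at a few light levels;
# shot-noise theory predicts SNR = sqrt(mean photons).
results = {}
for mean in (25, 100, 400):
    frames = [photon_count(mean) for _ in range(400)]
    results[mean] = statistics.mean(frames) / statistics.stdev(frames)
    print(f"{mean:4d} photons: SNR ~{results[mean]:.1f} (theory {math.sqrt(mean):.1f})")
```

This is also why averaging N identical frames (or binning N photosites) improves SNR by sqrt(N): it's just another way of collecting more photons per output pixel.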
If we want to keep exposure and sensor size fixed, the last option is the only one left. Not all of the sensor surface is light-sensitive; around every photosite there has to be room for wiring and support structure. This photodiode-to-scaffolding ratio (the fill factor) has climbed over the years, but it can't exceed 100%, so diminishing returns are inevitable. Light-focusing microlenses and, more recently, back-side illuminated (BSI) sensors are some of the ways the effective light-sensitive area has been improved.
Most digital cameras use Bayer filters over monochrome CMOS sensors to record color information. The color filter array necessarily blocks some of the light, as each Bayer cell absorbs perfectly good photons that just happen to have the wrong wavelength. Improvements here come from dyes that are optimally transparent to photons in the "right" wavelength range. Or you can of course get rid of the color filter altogether (see the Leica Monochrom and many astro cameras).
Even among the photons lucky enough to hit a photodiode, not all can be converted to useful signal. The quantum efficiency of a modern CMOS photodiode is somewhere between 50 and 90 percent, depending on wavelength. CCD sensors can reach high (~90%) quantum efficiencies, which is part of why they've been popular in astro imaging.
Adding all of this up, state-of-the-art sensors are already well over 50% efficient at converting photons exiting the lens into useful signal. So there's less than one stop of improvement possible even in theory, and returns are definitely diminishing.
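As a back-of-the-envelope check, you can stack the loss factors and convert the remaining headroom into stops. The three factors below are rough assumptions in the ranges discussed above, not measurements of a real sensor:

```python
import math

# Rough, assumed loss factors for a modern BSI color sensor:
fill_factor = 0.92         # effective light-sensitive area fraction (microlenses + BSI)
bayer_transmission = 0.70  # fraction of photons passing the color filter array
quantum_efficiency = 0.85  # fraction of arriving photons converted to charge

total = fill_factor * bayer_transmission * quantum_efficiency
print(f"overall efficiency: {total:.0%}")                  # mid-50s percent
print(f"headroom to perfect: {math.log2(1 / total):.2f} stops")
```

With these assumed factors the overall efficiency lands in the mid-50s percent, leaving under a stop (log2 of 1/efficiency) of theoretical improvement, which matches the conclusion above.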