Too much dynamic range?

Status
Not open for further replies.
nightbreath said:
As far as I understand, each camera applies its own tone curve to the image, or am I wrong?

Initially I wanted to be brand-agnostic and instead of discussing specific sensors, I want to identify what really matters for my needs (and maybe many others). I'm not able to tell what it is right now, so everyone's input is appreciated :)

Not really: each camera applies its own tone curve, but that happens *after* the ADC has done its job, and, as far as I know, all cameras use a linear ADC. (OK, some use piecewise linear, but that's really a change to exposure, not to the ADC, which is still linear; and I'm not sure any of the cameras we're talking about actually does that.)

The fact that light is linear but you perceive it as log makes this very inefficient, and it's the reason that, for example, some low- to mid-end Nikon cameras apply a log curve even in their RAW files: with linear encoding, you have far more gradation than you need in the highlights and may still be struggling in the shadows. But even this curve happens after the ADC, so it's not what we are talking about, I think.
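To put numbers on that inefficiency, here's a small sketch. The figures are hypothetical: it assumes an ideal 14-bit linear ADC, and a "pure log" curve that simply splits the code values evenly across the stops. With linear encoding, the brightest stop alone consumes half of all code values, while the darkest stops get almost none.

```python
# Sketch: code values spent per stop by a linear 14-bit ADC versus a
# hypothetical pure-log encoding of the same 14-stop range.

BITS, STOPS = 14, 14
codes = 2 ** BITS  # 16384 code values: 0..16383

def linear_codes_per_stop(stop):
    # Linear ADC: stop 1 (darkest) gets codes 0..1; stop n gets
    # codes 2**(n-1) .. 2**n - 1, i.e. each stop doubles the budget.
    return 2 if stop == 1 else 2 ** (stop - 1)

def log_codes_per_stop(stop):
    # Pure log curve: the codes are spread evenly across the stops.
    return codes // STOPS

for stop in (14, 7, 1):
    print(f"stop {stop:2d}: linear={linear_codes_per_stop(stop):5d}  "
          f"log={log_codes_per_stop(stop):5d}")
```

The top stop gets 8192 linear codes (half of all 16384), the bottom stop only 2, which is exactly why a log curve can record the same range far more efficiently.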



As for the need for DR, as I said, it depends on what you do, how much time you have to do it, and how much margin of error you have. My friend was carrying an extra 5-year-old 12 Mpix camera just in case, because he thought he needed to. If you don't think you need it, well, good for you; what can I say?

Now go ask any cinematographer if they think 11 or 12 stops of DR is enough.
(hint: I hardly ever shoot stills, I'm a vidiot)
 
Upvote 0
I don't quite understand all the technical background, but I can attest to the advantages of high DR in my work. I do a lot of interiors, and most of them are high-contrast situations, often necessitating trips to PS for better noise control (I bought Noiseware Professional, which used to be better than LR, and maybe still is), occasionally blending different exposures (I always shoot brackets for interior work), etc.

With the d7000 and now the d800, my trips to PS noticeably decreased. In fact, I don't recall any situation in the past year where I had to use blending. My only comparison is the 5D MK II and Rebel T2i, and my gut feeling is that you can push the d7k at least a stop more without image degradation (colour shift, noise, etc.). In Lightroom terms, this is about 30-40 points on the shadows slider, or being able to push both blacks and shadows on the Tone Curve significantly more. For me, it means staying in Lightroom for 99% of my workflow. You can just feel how much more malleable NEF files are than CR2 files.

I'm not sure any of this matters if you're shooting JPEG. Except maybe if you have ALO enabled. With my Canons, I never used ALO (and I didn't use ADL with the d7000), but when I bought the d800, I just left ADL (the Nikon equivalent to ALO) on the Auto setting. I'm a raw shooter, but lately I've been experimenting with JPEG+RAW because I started shooting events (not much interior work lately). As far as I can tell, ADL works well, I don't see any image degradation, and the change is quite subtle, actually, but for the better as far as I can tell. Where it truly matters is when you shoot RAW.

I heard a good description of RAW somewhere: RAW is like a box of light. When you look at a RAW shot from either Canon or Nikon, you basically see the same image and DR. The difference becomes apparent when you start messing with it, changing exposure or using curves. With a high-DR camera, you can push shadows more without significant image degradation. It is as simple as that. You have a bigger box of light with a high-DR camera ;)

By the way, your photos are magnificent! I don't even know what most of my favourite photographers shoot. I know some shoot Canons, others shoot Nikons, and the end results are all magnificent. Once you have invested in either system, I don't see much reason to change. I didn't have a huge investment (a Sigma 10-24 and a Rebel) when I bought the Nikon d7000 (and the new Sigma 8-16, which was to replace the 10-24 anyway). The 5D MK II wasn't mine. For the things I shot back then, the high DR of the d7k did make a difference and made my life easier. But for the things you shoot, it might make zero difference. As I said, what you get with high DR is a bigger box of light, but if you don't see the limits of your current box, then you don't need a high-DR camera ;) And I think for JPEG shooters it doesn't matter at all.
 
Upvote 0
NormanBates said:
...Let me add a twist: the ADC works linearly, but what you see is log

So, if you have a 14-bit ADC (you can count from 0 to 16383) and can record 14 stops of DR, here is how those values will be distributed:

14th stop: 8192 to 16383
13th stop: 4096 to 8191
12th stop: 2048 to 4095
11th stop: 1024 to 2047
10th stop: 512 to 1023
9th stop: 256 to 511
8th stop: 128 to 255
7th stop: 64 to 127
6th stop: 32 to 63
5th stop: 16 to 31
4th stop: 8 to 15
3rd stop: 4 to 7
2nd stop: 2 to 3
1st stop: 0 to 1
...

This is incorrect - if you replaced "bit" with "stop" then it is correct (and it's then obvious why ETTR works too.)

It's easier to consider a 3- or 4-bit digitiser. Let's say it offers 4-bit resolution; then the possible counts are 0000 through 1111. This translates to 2^4 = 16 levels. Written the way that Norman stated it, there would only be four distinct levels, which is incorrect (but I understand he meant there would be 16 levels.)
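The 4-bit case can be spelled out in a few lines. This is a sketch of the same idealized linear digitiser: 16 distinct counts, with each successive stop occupying the next power-of-two band.

```python
# Tiny version of the digitiser example: a 4-bit ADC has 2**4 = 16
# distinct counts (0b0000 .. 0b1111), and with a linear transfer each
# successive stop occupies the next power-of-two band of counts.

bits = 4
levels = 2 ** bits
print(f"{bits}-bit digitiser: {levels} levels (counts 0..{levels - 1})")

for stop in range(bits, 0, -1):
    lo = 0 if stop == 1 else 2 ** (stop - 1)
    hi = 2 ** stop - 1
    print(f"stop {stop}: counts {lo}..{hi}")
```

This prints the same pattern as Norman's 14-bit table, just scaled down: stop 4 spans counts 8..15, stop 3 spans 4..7, and so on.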

Some things appear to have been glossed over in the discussion. First, the ADC operates on a per-pixel basis.

I've read the DxO tests on various sensors. It is important to remember that the quoted dynamic range is referred back to an 8 MP standard. This means that the D800's quoted ~14 stops of dynamic range corresponds to significantly less than 14 stops at the per-pixel level. The 36 MP to 8 MP conversion gains the sensor a sqrt(4.5) = 2.1x notional noise improvement, which is slightly more than 1 stop of quoted dynamic range. This means that the true per-pixel dynamic range is about 12.9 stops.
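As a sanity check on that normalisation arithmetic (same assumptions as above: DxO-style downsampling to 8 MP, with noise averaging down as the square root of the pixel-count ratio):

```python
import math

# Sketch of the normalisation argument: downsampling 36 MP to 8 MP
# averages ~4.5 pixels into one, cutting noise by sqrt(4.5) and so
# raising measured DR by log2(sqrt(4.5)) stops.

mp_sensor, mp_reference = 36, 8
noise_gain = math.sqrt(mp_sensor / mp_reference)   # ~2.12x less noise
dr_gain_stops = math.log2(noise_gain)              # ~1.08 stops

quoted_dr = 14.0                                   # quoted figure used above
per_pixel_dr = quoted_dr - dr_gain_stops           # ~12.9 stops

print(f"noise improvement from downsampling: {noise_gain:.2f}x")
print(f"DR gain: {dr_gain_stops:.2f} stops")
print(f"implied per-pixel DR: {per_pixel_dr:.1f} stops")
```

The 2.1x and 12.9-stop figures in the text drop straight out of this.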

In order to read those 12.9 stops, the ADC needs a bit more resolution than the 13 bits the pixel requires. I'm quite surprised, because 14 bits in the ADC suggests that the entire detection chain has only ~1 bit of noise... It sounds improbable.

Photon shot noise has been commented on briefly. If we assume 13 bits dynamic range and (say) 10% quantum efficiency (pidooma), then the number of photons required to fill a pixel is 10 x 2^13 or 80k. Since shot noise varies with the square root of the number of photons, the pixel could have shot noise of up to 280 photons (rms).

Since 1 count translates to 80k/8192 = 10 photons, the shot noise at the upper end of the sensor's dynamic range is about 28 counts, or roughly 5 bits. At the bottom end, the quantum efficiency sets the performance, and there must be ~3 bits of noise.
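Running the same back-of-envelope numbers (13 stops and 10% quantum efficiency, both assumptions taken from the text, not measured values) gives roughly 5 bits of shot noise at saturation:

```python
import math

# Worked version of the shot-noise estimate above, using the same
# back-of-envelope assumptions as the text (13-stop DR, 10% QE).

dr_bits = 13
qe = 0.10
full_well_photons = (2 ** dr_bits) / qe          # ~82k photons to fill a pixel
shot_noise = math.sqrt(full_well_photons)        # ~286 photons rms at saturation

photons_per_count = full_well_photons / 2 ** dr_bits  # 10 photons per ADC count
noise_counts = shot_noise / photons_per_count         # ~29 counts
noise_bits = math.log2(noise_counts)                  # ~4.8 bits

print(f"full well: {full_well_photons:.0f} photons")
print(f"shot noise at saturation: {shot_noise:.0f} photons rms")
print(f"= {noise_counts:.0f} ADC counts, about {noise_bits:.1f} bits")
```

Note the result comes out nearer 5 bits than 6: 286 photons of shot noise at 10 photons per count is ~29 counts, and log2(29) is about 4.8.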

Shot noise alone suggests that the true dynamic range of an image cannot be more than about 8-10 bits. It seems that the only way to improve on this is by greatly enhancing the sensor's quantum efficiency.

To answer the OP's question: photon noise alone suggests that there's not a whole lot of benefit to a high-resolution ADC. It does allow for more sophisticated noise filtering, presumably at the expense of resolution.

<---- physicist / astronomer
 
Upvote 0
* I definitely meant stops. I chose an example with a 14-bit ADC and a sensor with 14 stops of DR to make my life easier, but I meant stops. "One stop brighter" means "twice as many photons", and the ADC is linear (it counts electrons on a linear scale); that's what makes both sides match.

* I'm talking about per-pixel DR. Matching resolution makes a lot of sense when you're comparing cameras, since it's the final image that you care about, not each individual pixel. But it makes the technical discussion a lot more complex, because you have to take into account how downsampling reduces noise. I'd rather leave that out right now.

* The per-pixel DR measurement that dxomark got was 13.44 stops http://www.dxomark.com/index.php/Cameras/Compare-Camera-Sensors/Compare-cameras-side-by-side/(appareil1)/834%7C0/(brand)/Nikon/(appareil2)/795%7C0/(brand2)/Canon)
I'm also puzzled at how the D800 can manage that with a 14-bit ADC. As I said above, I'd be expecting to see a lot of posterized noise in the shadows, but it's not there.

* No idea about how quantum efficiency affects all this. I never thought about it.
 
Upvote 0
nightbreath said:
An interesting thought came to me before I went to bed. It's just an assumption that popped into my head, so please don't take it too seriously.

So... Let's assume there are two cameras with similar colour reproduction, but different abilities to capture lightness levels. For example:
- the sensor of camera A has 12 stops of DR and can distinguish 16 billion tones
- the sensor of camera B has 10 stops of DR and can distinguish 16 billion tones

Given a flat (i.e. low-DR) scene of, say, 8 stops, we capture it with both sensors. Then both images are edited in post to restore the lacking contrast. So we need to add:
- 4 stops for the 12-stop camera
- 2 stops for the 10-stop camera

So my point is: with the lower-DR camera, we'll have a lower tone delta (the difference between the original tone in the scene and the tone reproduced by the sensor) when processing the low-DR shot, because fewer modifications are made to the file to achieve the required result.

What do you guys think about that?

No. You are thinking about it wrong. It doesn't work like that at all.

All it means is that the camera with more DR has less noise in the lower tones than the other camera. There is no way you can ever lose tones because of that. Whatever you are trying to do, you can always exactly match what the other camera can accomplish (plus more). In fact, since you captured with less noise, you have captured MORE distinguishable tones; and because the captures are linear, there is no different expansion you need to do with one camera vs the other. As you compress down to a screen, maybe you don't use the extra tones, but you won't end up with fewer, and you might end up with more.
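A toy model of this point (with made-up noise floors, not measured values): pushing linear raw data in post multiplies signal and noise alike, so the number of distinguishable tones is fixed at capture by the noise floor, not by how hard you push afterwards.

```python
# Toy model: distinguishable tones are set by the capture noise floor,
# and a post-processing "push" (a multiplication of linear data) leaves
# that count unchanged. Noise figures below are illustrative only.

def distinguishable_tones(max_signal, noise_floor):
    # Crude criterion: tones count as distinct if they are at least
    # one noise floor apart.
    return int(max_signal / noise_floor)

cam_a = distinguishable_tones(max_signal=16383, noise_floor=4)   # ~12-stop camera
cam_b = distinguishable_tones(max_signal=16383, noise_floor=16)  # ~10-stop camera

# A 4-stop push in post scales signal and noise by 2**4 equally,
# so the tone count does not change:
assert distinguishable_tones(16383 * 2 ** 4, 4 * 2 ** 4) == cam_a

print(f"camera A (more DR): ~{cam_a} distinguishable tones")
print(f"camera B (less DR): ~{cam_b} distinguishable tones")
```

In this model the higher-DR camera simply starts with roughly four times as many distinguishable tones, and no amount of pushing changes either count, which is the "bigger box of light" in arithmetic form.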
 
Upvote 0
LetTheRightLensIn said:
nightbreath said:
An interesting thought came to me before I went to bed. [...]

No. You are thinking about it wrong. It doesn't work like that at all. [...]
OK. But it doesn't explain the difference in colour integrity between the 1D series and the 5D series when you push image colours, so I'm still left with a feeling of incompleteness :)


I've already got one valuable comment on that:
jukka said:
The 1D series has a more expensive electronic chain (including shielding etc.), and it is probably also better matched in terms of RGB.
The old 1Ds Mk III has a better response in the midtones than the 5D Mk II / Mk III series, which can be seen on an evenly coloured surface.
There is also a different CFA in the old 5D compared to the 5D Mk II / Mk III, and some find the colours better on the old 5D.
Canon changed their colour filters (made them less dense) in order to gain more light / increase sensitivity.
But there were no references to official resources / tests that could tell more.
 
Upvote 0
nightbreath said:
@!ex said:
I hope that someday I could get a shot with this much dynamic range in a single exposure. I bracketed 7 shots at 3 EV spacing, which is an 18 EV spread, but each shot contributes its total EV range (minus clipping), so the DR spans almost from pure black to pure white. My eyes saw these scenes like this, but in a single exposure (even using ND filters) I could never get these shots without increasing the DR of the camera via multiple exposures. Sorry, not trying to go off topic, just thought it was relevant to the subject.

Shot #1
Shot #2
No offense, but these scenes look uninspiring to me. And I believe it's not about how you or I see them; it's this way of thinking about DR that makes HDR overused by lots of photographers around the world.

I believe that HDR imaging has its own niche, but it should be used so that the result doesn't tell you whether it's HDR or not. So better scenes are what really matter to me (rather than increased DR):

That is what I'm talking about. We are all going to have a different eye for which compositions touch us the most, but my argument was about the utility of high DR and of fusion/HDR. There are a lot of scenarios where they are indispensable to getting the shot we (as artists/photographers) want. I was just showing a few extreme examples of where extreme DR can be useful. If you don't like those particular shots, that's fine, as I was mainly using them to illustrate a point. I agree with you that HDR as a technique is overused, often as an effect to over-emphasize details, rather than to achieve dynamic range that would otherwise be unachievable without unwieldy supplemental lighting or impractical (non-linear) and creativity-inhibiting ND filters. Here are a few more examples; maybe one of them will resonate with you more...


TiVo by @!ex, on Flickr


They breath profits; they eat the interest on money... by @!ex, on Flickr


Home on the range... by @!ex, on Flickr


Electric Sunset at City Park by @!ex, on Flickr


Drought by @!ex, on Flickr
 
Upvote 0
I've enjoyed following this discussion even though I haven't understood any of the technical stuff. As far as I am aware, you press the shutter, magic happens, and the picture appears on the back of the camera.

But some interesting points have been raised in relation to the dynamic range of digital cameras. When we take a photo, if our subject is lit only by incident light (the EV level falling on the subject), then the EV range is not as great as you might think. In England, bright summer sun has an EV value of about 14.5, and in that situation the luminosity of even the darkest shadows would struggle to be less than EV 5, so the dynamic range is about 9 and a half stops.

With regard to the classic wedding picture, the bride's whites and the groom's blacks, there's a big difference in reflected light, but "correct" exposure should still capture detail in both black and white, eased substantially by the use of a reflector or fill-in flash. In the film days, the reason any wedding photographer worth their salt used medium format was the higher sync speed. This need was reduced when focal-plane shutters reached 1/200 sec sync.

The situation changes dramatically once you start to include the light source in your frame: bright skies, the sun itself, a bride lit by a bright window with the window in the frame. Then the EV range goes off the scale.

The OP's wedding pic is lit by incident light, and his camera has DR to spare; when Canon's technology is pushed in this kind of situation, I've certainly found the 5D MkI to be fine.

The first pic attached is a snap at a friend's wedding, and I've used it as an example because it was taken in mid-afternoon bright sun; the little boy has a plain bright white shirt, and the guy a dark(ish) suit. I wasn't exposing for the highlights, so part of the boy's shirt has gone to 255, but it was reflecting the sun straight back at me, so it's going to be bright white. This first pic is straight off the camera.

The second is a 200% crop of the shadow on the guy's leg. The third is the pic as I would produce it, lifting the dark shadows. The fourth is a 200% crop of this. There is no noise in the lightened area at all (even in the full-size data).

I've then lifted the shadow so it's almost gone, and we have noise coming in, but who would want a pic like that? Can't post it anyway; I've had my four!

As I have said, once you try to include the light source itself in your frame, the situation changes. At the present time it is impossible for a camera to record in one exposure what the eye can see... because... the eye doesn't see; your brain does. Your eye just gathers and focuses the light, like the camera lens; our brains deal with interpreting it: perspective, field of view, dynamic range. (This is why people can "see" things that aren't really there.)

So our brains do instantly what !ex has spent some time doing with his beautiful pictures: the best of HDR technique, the kind that makes you believe this is how we would have seen it. The dynamic range in this type of situation will be well over 14 stops. A picture produced from a 14-stop DR camera in one frame, with the majority of the picture vastly underexposed due to exposing for the light source and the shadows then pulled back, will never have the colour, luminosity and general "brio" of the technique that !ex has used. (Well, I say never, but not in the near future.)

So if you have a camera with 11 stops of DR and really good colour, tonal graduation from black to white, etc., your pictures won't be held back by the technology. Having a further 3 stops of DR would be no disadvantage, as long as it doesn't compromise any of the other factors that are more critical to picture quality (the OP's original question, I believe).

Incidentally, I still maintain that the image quality I have seen from the 5D MkIII and D1X shows that these cameras can be superb. And an older camera that I always thought could produce very good tonal graduation was the D200, with its 10 stops of DR.

Anyway, I have probably bored the pants off anyone who has read through this, but I've got nothing better to do on a very wet afternoon!

(I forgot to change the pictures to sRGB for the web.)
 

Attachments

  • _MG_1490.jpg (439.5 KB)
  • _MG_1490200%.jpg (14.9 KB)
  • Wedding.jpg (543.5 KB)
  • Wedding200%.jpg (17.6 KB)
Upvote 0
Yes, that's a good example. The sun may be brighter here in southern Spain, you may not want a wedding dress to blow out like that kid's shirt, and the groom's suit may be black instead of gray (look at the shoe: that's black; the suit is not), and then you may be in trouble if your camera is DR-challenged.

Not the most common scenario, by a long shot, but I never said you always need lots of DR. Just that you may sometimes need it.
 
Upvote 0