5D MkII exposure problem?

Hi, I was doing some HDR brackets and got a whole bunch of pictures that look like the one in the attachment, with a large black area in the lower third.

For these pictures I used a Sigma 12-24mm HSM II lens shot at 12mm and f/16, so the resulting pictures have a very wide angle of view. I took them with the camera on a tripod about 2 ft above the water.

Where does the black area in the lower part of the picture come from? Is it a problem with the lens, the metering, the positioning of the camera, or something else?

I would be thankful for any advice, as I have a whole bunch of pictures that look like this ...
 

Attachments

  • IMG_9500.JPG

Fleetie

OK, you weren't using a polarising filter. But actually, the blue sky will, under the right conditions (angles), provide the same effect:

Blue sky light is significantly polarised, especially 90 degrees away (in the sky) from the sun.

Interfaces between different non-metallic media of different refractive indices also polarise light, maximally at the Brewster angle (which is about 53 degrees from the normal, for an air-water interface).
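
As a quick sanity check (a minimal sketch in Python, assuming the usual textbook refractive indices of about 1.00 for air and 1.33 for water, not values measured from your scene), the Brewster angle comes straight out of tan(theta_B) = n_water / n_air:

```python
import math

# Assumed refractive indices (textbook values, not measured):
n_air = 1.00
n_water = 1.33

# Brewster angle, measured from the surface normal: tan(theta_B) = n2 / n1
theta_b = math.degrees(math.atan(n_water / n_air))
print(f"Brewster angle for air-water: {theta_b:.1f} degrees")  # ~53.1
```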

With the right orientation of everything, the polarising effect of the water surface could conspire with the polarised light from the blue sky to significantly reduce the intensity of light reflected off the water in certain directions.

So, in effect, the blue sky could act as a kind of polarising filter without you having such a filter fitted to your camera.


Another effect would likely have come into play here:


Any reflecting surface (e.g. water) becomes more reflective as the angle of incidence of light gets closer to grazing, so in this case the water far away, and well off to the side, would appear more reflective.
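
To put rough numbers on that (a sketch only, using the standard Fresnel equations for an idealised flat air-water surface with n = 1.33; real rippled, wind-blown water will differ), the reflectance of unpolarised light climbs steeply as the incidence angle approaches grazing:

```python
import math

def water_reflectance(theta_i_deg, n1=1.00, n2=1.33):
    """Fraction of unpolarised light reflected at an air-water interface,
    from the Fresnel equations (idealised flat surface)."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(math.sin(ti) * n1 / n2)   # Snell's law
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    rp = (n2 * math.cos(ti) - n1 * math.cos(tt)) / (n2 * math.cos(ti) + n1 * math.cos(tt))
    return 0.5 * (rs ** 2 + rp ** 2)

for angle in (0, 30, 53, 70, 85):
    print(f"{angle:3d} deg from normal: {water_reflectance(angle):.3f}")
# Roughly 2% at normal incidence, still only a few percent near the
# Brewster angle, but climbing steeply towards 100% at grazing incidence.
```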


So, given the combined effects of the above two phenomena, even though you exposed at a level that made the less reflective nearby water appear quite dark, the more reflective distant water was still bright enough to nearly saturate the sensor.

Cameras do not have anything like the dynamic range that human eyes do. If you were viewing the scene with your own eyes, you might not even notice a difference in reflectivity/brightness of the water from nearby to distant. With their more limited dynamic range, cameras can exaggerate differences in light levels relative to what we see. Also, our brains tend to compensate for these effects, so we often don't even notice them.
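
As a purely hypothetical illustration (the reflectance figures below are invented for the example, not measured from your attached image), even a modest difference in reflectivity translates into several stops of exposure difference, which is a lot for a sensor but little for the eye:

```python
import math

# Invented example values: suppose the distant, near-grazing water reflects
# about 40% of the skylight, while the nearby water - close to the Brewster
# angle and lit by partially polarised skylight - reflects only about 1%.
near_reflectance = 0.01
far_reflectance = 0.40

stops = math.log2(far_reflectance / near_reflectance)
print(f"Brightness difference: about {stops:.1f} stops")  # ~5.3 stops
```

Expose so the bright distant water doesn't clip and, in those assumed conditions, the nearby water lands more than five stops further down, i.e. nearly black in a single frame.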

Regarding polarisation, if you're curious, you might want to type "Haidinger's brush" into Wikipedia and have a read. Fascinating, and it's relevant to much of the above. The brush is easily observed in blue sky once you know what you're looking for. (Look directly up and spin around to defeat the brain's tendency to smooth things out when it thinks they "should" be uniform; you'll see the yellow part of the brush easily then. Try not to fall over while spinning, and ignore the fact that you'll look like an idiot!) It's been there your whole life, and you've almost certainly never noticed it!

In fact, here's the link: http://en.wikipedia.org/wiki/Haidinger%27s_brush


Martin
 
+1 for this answer

Fleetie said:
OK, you weren't using a polarising filter. But actually, the blue sky will, under the right conditions (angles), provide the same effect: ...

Great answer, thank you for your help. It sounds like a viable explanation of the problem, and it seems to be supported by the fact that the 12-24 has the most extreme angle of view available on full frame for non-fisheyes ... So this is quite possibly just showing a large portion of dark water, which would not be so obvious with a narrower lens ...
 