Filter for direct sun photography

Hi!
I also use the Baader filters http://www.baader-planetarium.de/sektion/s46/s46.htm that can be cut to any size for use IN FRONT of eyes, lenses and telescopes...

Attached you find some examples of use on
1) EF 100-400 (no sunspots during that eclipse)
2) EF 500/4 (during sunrise with "solarpower" foreground)
3) same eclipse but later with the sun higher up, leading to colour change!
4) 6" telescope showing the now much more active sun a few weeks ago

Always EOS 7D or 5DII...

With such protection, both viewing and taking pictures are safe!!!

Cheers
Franz
 

Attachments

  • 1IMG_6474_fb.jpg
  • 2IMG_1155r_fb.jpg
  • 3IMG_1270r_fb.jpg
  • 4WO158_4153mm_111002_m_fb.jpg
The commonly available H-alpha photographic filters are NOT for photographing the sun; they're primarily intended for deep-sky objects. You need a much narrower bandwidth than the deep-sky H-alpha filters to do that. Look at the dedicated solar scopes for an idea. I have an entry-level H-alpha filter with a 12 nm bandwidth, which I also tried on the sun just to see what happens. It doesn't resolve any detail you don't already get without the filter. The narrowest one of that type I've seen is 3 nm; a dedicated filter in a solar scope is 0.1 nm or less. You need it that narrow to filter out everything else and only look at the sun's H-alpha.
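Rough numbers behind that: the H-alpha signal is essentially a single narrow line, while the ordinary sunlight a filter lets through grows roughly in proportion to its bandwidth, so contrast scales inversely with the passband width. A minimal Python sketch of that comparison (the bandwidths are just the figures quoted above):

```python
# Compare how much off-line (continuum) sunlight each filter admits,
# relative to a 12 nm deep-sky H-alpha filter. Continuum leakage is
# taken as roughly proportional to bandwidth.
filters_nm = {
    "deep-sky, entry level": 12.0,
    "deep-sky, narrowest":    3.0,
    "dedicated solar scope":  0.1,
}

for name, bw in filters_nm.items():
    leakage = bw / 12.0
    print(f"{name:22s} {bw:5.1f} nm  continuum leakage ≈ {leakage:.3f}x")
# The 0.1 nm solar filter passes ~1/120 of the off-line light the 12 nm
# filter does, which is why it reveals chromospheric H-alpha detail and
# the deep-sky filter shows nothing you don't get unfiltered.
```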

So unless you do go for the dedicated setup for the sun, you might as well go cheap with the reflecting filters for safety.
 
Shooting video of the sun with my 400mm f2.8 has not damaged my sensor at all, and most of us have included the sun in a photo from time to time without effect on the sensor.

Eyes on the other hand do deserve more protection. I use a welding shield I have, and can stare at the sun all day long.
 
PeterJ said:
I had a play around with various focus options yesterday, taking a few test shots with manual / AF and viewfinder / liveview. Apart from liveview / contrast-detect AF, which was as dodgy as usual, I didn't find much difference between them, so I ended up just taking an initial phase-detect focus on the centre AF point, making sure it was on the edge of the sun. Would that have technically been an OK thing to do? The lens seemed to be spot on the infinity mark for 200, but would the practical focus change over 30 mins or so?

If you change nothing else, the focus should stay fairly constant (it can change if e.g. temperature or humidity changes appreciably). For accurate focus I use live view with magnification and manual focus. Phase detection can be a bit challenging sometimes in difficult conditions, but if it worked for you, then fine.

I would highly recommend you try to catch the 2012 total solar eclipse; it's much more impressive and a completely different experience that does not compare to a partial eclipse!
 
nubu said:
I also use the Baader filters http://www.baader-planetarium.de/sektion/s46/s46.htm that can be cut to any size for use IN FRONT of eyes, lenses and telescopes...

Very nice Franz, in particular the one in front of the power station! I looked at your link; the films seem to have quoted neutral densities between 3.4 and 5. That seems a bit low, are you using multiple layers? Are the quoted numbers photographic stops or magnitudes?
 
TexPhoto said:
Shooting video of the sun with my 400mm f2.8 has not damaged my sensor at all, and most of us have included the sun in a photo from time to time without effect on the sensor.

You have shot the sun with a 400/2.8 on video without a filter at all? Without destroying your sensor? I have a hard time believing that, and I will not be tempted to prove you wrong by trying it out myself... ;)
 
epsiloneri said:
I looked at your link; the films seem to have quoted neutral densities between 3.4 and 5. That seems a bit low, are you using multiple layers? Are the quoted numbers photographic stops or magnitudes?

ND 5 (17 stops) is for visual work, ND 4 (13 stops) for photography, to give you short exposures. One may combine these filters with one of their solar continuum filters for higher contrast:
http://www.baader-planetarium.de/sektion/s37a/s37a.htm. For visual work I use the ND 4
plus a narrow-band continuum filter. It's very nice and safe since they all have IR/UV blocking.

I used the latter for the detail pic with the 6" telescope shown above...

Franz
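For reference, the ND densities and the stop counts above are two ways of stating the same thing: transmission is 10^(-density), and each stop halves the light, so stops = density × log2(10) ≈ 3.32 × density. A minimal Python sketch (the function names are just for illustration):

```python
import math

def nd_to_stops(density: float) -> float:
    """Optical density -> photographic stops.
    Transmission = 10**(-density); stops = density * log2(10)."""
    return density * math.log2(10)

def shutter_with_nd(base_shutter_s: float, density: float) -> float:
    """Shutter time needed to keep the same exposure after adding the filter."""
    return base_shutter_s * 10 ** density

for d in (5.0, 4.0):
    print(f"ND {d}: {nd_to_stops(d):.1f} stops")
# ND 5.0: 16.6 stops -- the "17" quoted for the visual film
# ND 4.0: 13.3 stops -- the "13" quoted for the photographic film
```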
 
epsiloneri said:
TexPhoto said:
Shooting video of the sun with my 400mm f2.8 has not damaged my sensor at all, and most of us have included the sun in a photo from time to time without effect on the sensor.

You have shot the sun with a 400/2.8 on video without a filter at all? Without destroying your sensor? I have a hard time believing that, and I will not be tempted to prove you wrong by trying it out myself... ;)

Take a look at my video from about 0:28 - http://www.youtube.com/watch?v=hUp7fmM2wY0. I have shot similar scenes 10-20 times with the 5DII and 7D. Think about it, though: if the sun were going to damage your sensor, it would do it with a 50mm or a fisheye too; it would just damage a smaller area. And an f/1.4 50mm would put a much brighter, if smaller, spot on the sensor than a 400mm f/2.8. And again, we all include the sun in a photo from time to time, and I'm sure those who shoot video do so too. And yet there are not thousands of reports of sun-damaged cameras out there.
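To put rough numbers on that 50mm-vs-400mm comparison: for an extended source like the sun, the brightness per unit area on the sensor scales as 1/N² (the f-number), while the size of the sun's image scales with focal length. A simplified Python sketch (idealised optics, ignoring transmission losses; the ~0.53° solar diameter is the only physical input):

```python
SUN_ANGULAR_DIAMETER_RAD = 0.0093  # the sun subtends roughly 0.53 degrees

def sun_image(focal_length_mm: float, f_number: float):
    """Rough size and relative per-area brightness of the sun's image
    for an idealised lens."""
    spot_diameter_mm = focal_length_mm * SUN_ANGULAR_DIAMETER_RAD
    relative_irradiance = 1.0 / f_number ** 2  # extended-source image brightness
    return spot_diameter_mm, relative_irradiance

for f_mm, n in ((50, 1.4), (400, 2.8)):
    d, irr = sun_image(f_mm, n)
    print(f"{f_mm}mm f/{n}: sun image ≈ {d:.2f} mm wide, relative irradiance ≈ {irr:.2f}")
# 50mm f/1.4:  ~0.47 mm spot, ~4x the per-area irradiance of the 400mm f/2.8
# 400mm f/2.8: ~3.7 mm spot, spreading more total energy over a larger area
```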

The only light source I've ever heard of damaging a DSLR sensor is lasers, typically at concerts/ laser light shows.
 
I still blame the sun for the dead pixels on my 7D; it was just after I took photos of the eclipse of 04/01/2011 that I noticed them all. I was using my 70-300 non-L and 15-85, at 300mm and 85mm, so both at f/5.6 for framing but f/18-22 for the pics, some at f/36, with speeds from 1/640s for the wide shots but 1/2500-1/6400s for the 300mm shots.
I started off with the viewfinder but gave up after half a second and moved to live view. I tried to keep my finger on the DOF-preview button but couldn't always. If I had an ND, that would have been the time to use it, but I didn't have one at the time.
It was 2 weeks later, doing long exposures, that I noticed the dead spots and sent my camera in *3* times for warranty repair (it took them that many times to actually get rid of them all). I've just had another look, and I can even see the dead pixels in high-ISO shots I took that night after the eclipse, but I can't see them in night shots taken a week before.

So yeah, I'm not pointing my camera at the sun ever again without a pinhole or a 20+ stop ND. Feel free to do it to your own camera, though...
 
One interesting thing as a side note: after the filter first arrived I took a snap as above, minus the eclipse, and the histogram wasn't like anything I'd seen before:

x6kbqg.jpg


I was expecting the peaks to be evenly spaced, and I wondered if the filter and/or lens was doing anything funky, but it does match up well with the Wikipedia "color vision" article, which has a diagram showing the human eye response:

287px-Cone-fundamentals-with-srgb-spectrum.svg.png


Still, the peaks seem to drop off more rapidly. I guess that's just a sensor limitation where the silicon / Bayer filter response doesn't match the eye exactly? It looks pretty predictable, so I thought that might have been the sort of thing that would be corrected, although that was a RAW import into Lightroom, so maybe it happens during the display and/or export stages?
 
Never use unfiltered live view with the sun in the field of view, especially when on a tripod!!! Normal exposures (no tele...) are no problem. It's the longer exposure time (the shutter does not close in live view) that kills the pixels...
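To see why the duration matters so much: at a fixed aperture, the energy dumped on the sensor is irradiance × time, so a stretch of unfiltered live view delivers orders of magnitude more energy than a single fast frame. A back-of-the-envelope Python sketch (the 30 s of framing is an assumed figure; 1/2500 s is one of the shutter speeds quoted earlier in the thread):

```python
# Relative energy dose on the sensor scales with exposure duration
# (assuming the same aperture and the sun filling the same spot).
stills_frame_s = 1 / 2500   # a fast eclipse frame, as quoted above
live_view_s = 30.0          # e.g. half a minute of unfiltered live-view framing (assumed)

ratio = live_view_s / stills_frame_s
print(f"Live view dose ≈ {ratio:,.0f}x that of one 1/2500 s frame")
# ≈ 75,000x -- which is why a brief shutter click may go unnoticed while
# unfiltered live view on a tripod can cook pixels.
```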
 
PeterJ said:
One interesting thing as a side note: after the filter first arrived I took a snap as above, minus the eclipse, and the histogram wasn't like anything I'd seen before:

x6kbqg.jpg


I was expecting the peaks to be evenly spaced, and I wondered if the filter and/or lens was doing anything funky, but it does match up well with the Wikipedia "color vision" article, which has a diagram showing the human eye response:

287px-Cone-fundamentals-with-srgb-spectrum.svg.png


Still, the peaks seem to drop off more rapidly. I guess that's just a sensor limitation where the silicon / Bayer filter response doesn't match the eye exactly? It looks pretty predictable, so I thought that might have been the sort of thing that would be corrected, although that was a RAW import into Lightroom, so maybe it happens during the display and/or export stages?

I don't think you're comparing like with like. The x-axis on your photo histogram is NOT frequency/wavelength, but it is on the spectrum from Wikipedia, so there is no reason why the graphs should match. As I understand it, the histogram's x-axis is "pixel intensity" (black at the far left, saturated/maximum at the right), and the y-axis is "number of pixels at this intensity".
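For concreteness, a camera histogram is computed purely from pixel values, one count bucket per intensity level, with no wavelength axis anywhere. A minimal Python sketch using synthetic 8-bit data (the random image is just a stand-in for a real frame):

```python
import numpy as np

# Synthetic 8-bit RGB frame standing in for a real photo.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(400, 600, 3), dtype=np.uint8)

# One histogram per colour channel: x-axis = pixel intensity
# (0 = black, 255 = clipped highlights), y-axis = pixel count.
for i, channel in enumerate("RGB"):
    counts, _ = np.histogram(image[..., i], bins=256, range=(0, 256))
    print(f"{channel}: {counts.sum()} pixels binned, "
          f"peak at intensity {counts.argmax()}")
```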

But if I have misunderstood what you were saying, forgive me.
 