
Show Posts



Messages - jrista

Pages: 1 ... 86 87 [88] 89 90 ... 298
1306
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 05:17:11 PM »
Quote
First, you put too much weight on DXO's numbers. As far as their sensor tests go, they do not actually measure "sharpness" or anything like that. It's actually extremely difficult to objectively test a sensor in terms of sharpness, as you have to use a lens to do so, in which case you're not testing a sensor, you're testing a sensor and lens combined, which totally changes the outcome (and the reasons why you get that outcome). The other problem with lens+sensor tests is that they are bound by the least capable component...if the lens is the weak point, then no matter how good the sensor is, your output resolution is limited by what the lens is capable of...you can never resolve more than the lens resolves, period. Similarly, if the sensor has limited resolution and the lens is a powerhouse (like the Zeiss Otus 55mm f/1.4), then your output resolution is limited by the sensor...you can never resolve more than the sensor resolves, period. That makes determining how sharp a sensor is a very muddy issue, one that cannot be definitively pinned down. Hence the reason DXO measures things like SNR and dynamic range and color sensitivity in its sensor tests...that's all they CAN measure.

This is not news. Let's leave DxO out. I am sorry I even mentioned them for this discussion. I mixed some info and used them as a point of reference. It's something that can happen with the amount of info I go through. Please accept my blunder as a simple error, as DXO is often the reference point for sensor quality, and I understand it is not a measure of sensor sharpness, but rather of DR, ISO, and related metrics.

It's not a problem to use sources like DXO as a point of reference. It just helps to have all your facts straight before doing so, so you don't mislead, confuse, or otherwise sidetrack readers with incorrect or inconsequential information. ;P

Quote
Regardless of what DXO has to say about the D800 or D800E sensors, the removal of an AA filter does not increase image quality. Actually, in all too many cases (quite possibly the majority of cases), removal of the AA filter is guaranteed to REDUCE image quality, thanks to increased aliasing in general, and moire specifically. This is clearly evident in all the numerous standardized image tests done with cameras over the years...while sharpness has increased in some newer cameras by a small amount, so too has moire. DPReview has plenty of examples where the removal of AA filters in Nikon cameras, or even just the weakening of the AA filter in many brands (including Canon), has greatly increased the amount of moire that occurs. (A great baseline for comparison on DPR is the 7D...it has an appropriately strong AA filter and doesn't suffer from moire at all. You can compare any newer camera with a sensor that is supposedly "better" than the 7D because of the removal or weakening of the AA filter...those images will be sharper, but they are usually riddled with moire.)

Quote
while sharpness has increased in some newer cameras by a small amount, so too has moire

The moire is subjective. I'm not too interested in the DPReview samples showing loads of moire issues. I have plenty of personal samples I can stand by to tell you otherwise. Many of those samples are deliberately set up to show moire.

Actually, moire is a concrete, immutable artifact of repeating patterns near Nyquist interfering with the sensor grid. It leaves behind funky color and monochrome patterns that are nigh impossible to correct in post...there IS no full moire removal in any RAW editor for a very good reason: it's impossible. You can reduce color moire, however depending on the tool, you might end up with color desaturation, blurring, etc. as a result. Even after removing color moire, the underlying monochrome moire pattern remains, and it cannot be removed (at least, not without significant blurring.)
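If it helps to see the folding mechanism concretely, here's a minimal numeric sketch (Python with NumPy; my own toy example, not tied to any particular camera): a repeating pattern just above the sensor grid's Nyquist frequency is sampled, and the samples come out identical (up to sign) to those of a much coarser pattern. That coarser false pattern is the moire, and once it's baked into the samples, no raw converter can tell it apart from real detail.

Code:
import numpy as np

# Toy sketch: sample a fine repeating pattern with a coarse pixel grid and watch it alias.
pixel_pitch = 1.0                      # arbitrary units; sampling rate = 1/pitch
nyquist = 0.5 / pixel_pitch            # highest frequency the grid can represent

f_detail = 0.65                        # pattern frequency just ABOVE Nyquist
x = np.arange(0, 200) * pixel_pitch    # sensor sample positions
samples = np.sin(2 * np.pi * f_detail * x)

# The sampled signal is indistinguishable from a much coarser pattern:
f_alias = abs(f_detail - 1.0 / pixel_pitch)   # folds back to |f - fs| = 0.35
false_pattern = np.sin(2 * np.pi * f_alias * x)

# Up to sign, the two sets of samples match exactly. The false coarse pattern
# is the moire; the true 0.65-cycle pattern is unrecoverable from the samples.
print(np.allclose(samples, -false_pattern, atol=1e-9))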

We aren't talking about a subjective factor of IQ here, we are talking about a detrimental, and permanent, factor of IQ that gets introduced when the AA filter is removed. The DPReview sample images are not intentionally trying to show moire...they are simple sample shots of their standardized test scene. Moire occurs in their samples as a CONSEQUENCE of weak or missing AA filters. You can't simply brush moire and aliasing to the side and call them a non-issue...they are a critical issue to a great many photographers.

Quote
removal of the AA filter is guaranteed to REDUCE image quality


Be more specific. With this statement, you are saying that full-frame or larger sensors that do not use an AA filter have lower image quality. How do you figure?

I explained it in the text you failed to quote. :P

Quote
If the things you photograph have no regular/repeating patterns, and do not contain any elements with clearly defined edges, then increased aliasing due to having no AA filter is not an issue. There are not very many forms of photography where that actually turns out to be the case...landscape photography is probably one of the very few. Even insect macro photography, for example, will suffer from the removal of the AA filter...things like antennae, feelers, legs, wing veins, anything thin, straight, with high contrast against its surroundings will end up with clearly aliased edges, and not even a highly optimized AHD demosaicing algorithm will be able to hide that fact.

The underlined falls under EXACTLY what I shoot on a regular basis, and, with all the respect I have for your knowledge as I have read much of your posts, I think you are simply flat wrong about this. I have worked with about 20 digital camera systems in the past 24 years. I certainly don't have the understanding of sensors and electrical engineering you do, or even in the realm of it. I know I have shot just about everything there is to shoot, and I specialize in macro work dealing with "thin, straight, with high contrast against its surroundings". I used the Kodak 14-megapixel SLR/c camera, and if it didn't have issues with handling light, I would continue using it. The images from that didn't suffer the things you claim. Nor do the MF backs, tossing the optional AA filter aside (never used one to this day). Has moire EVER happened? Yes. Can I remember it being a problem, or can I even count the instances on my 10 fingers, across over 400K frames (with half shot on filter-free cameras)? NO.

It's fine to have personal preferences. But basing the entire discussion of "Canon EOS sensors and technology" solely on your personal preferences makes it difficult to have a coherent discussion. Your personal preferences should really be left out of an objective discussion of the fundamental technology behind sensors, otherwise we're just in muddy, subjective territory, and anyone can make any argument to justify their own personal opinions. I personally try to remain objective when discussing technology, and leave my own personal preferences out of the discussion.

Regarding whether moire is a problem on MF cameras, Leicas, etc.: if you do a few quick web searches, you'll find that it is a huge problem. There are countless threads on the subject, dating back many years, with MF and Leica users (and increasingly Nikon users) complaining about how bad the moire and aliasing can be on their incredibly expensive cameras. The solution, for many, is to use the lens to act as the AA filter: either stopping down beyond the diffraction-limited aperture of the sensor, or slightly defocusing, etc. One way or another, people have to deal with moire and aliasing if it occurs. If you have to constantly perform a very slight defocus, that makes using an autofocus system very tedious. Likewise, having to stop down more than you really want to in order to force diffraction blurring to soften the image is also less than ideal.

You say you have used a lot of cameras over a lot of years. I'd be willing to bet many of them were film cameras, in which case moire was never a problem thanks to the random distribution of grains. When it comes to digital cameras, until recently, lenses, while good, were never as good and sharp as they are today (at least, in the DSLR world...for MF, most lenses have always been rather exceptional.) The softer lenses of the past helped to deal with the problem of missing or weak AA filters. Today, we have a convergence of several things that can only lead to significant problems with moire and aliasing: Radical improvements in lens quality, pushing their maximum resolving power to new limits; sensor resolution increasing at a slower pace than lens resolution; removal of AA filters. This is kind of a perfect storm...some manufacturers are apparently doing everything in their power to make moire a very serious problem for a lot of DSLR photographers, which will ultimately put them in the same boat as Leica and MFD owners: Having to defocus or stop down to force blurring and use the lens as an artificial AA filter.

Did you discuss the bold area I highlighted above (about the relationship between lens and sensor resolution) in more detail someplace? This is likely the feature I'm looking to optimize, and likely what the D800E and A7R have factored in. It is my next criterion for my future camera/sensor purchase.

If you mean the fact that output resolution is based on the convolution of lens+sensor, I've discussed it so many times all over this forum that it shouldn't be hard to find a topic with all the details. The detail in an image (raw file) is the result of convolving real-world detail with the blur of the lens and the sampling of the sensor. In mathematical terms, assuming Gaussian-like blurring behavior (which is reasonable), the output blur is roughly the quadrature sum (square root of the sum of squares) of the input blurs. To be more specific, the size of the blur kernel that represents the output image is approximated by adding the blur circles of the lens and sensor in quadrature.

So, if your lens blurs by 3µm and your sensor has 5µm pixels (the lens resolves more detail than the sensor), then the output blur is SQRT(3µm^2 + 5µm^2), or 5.83µm. Notice that the output blur is larger than that of BOTH your sensor and lens. If you improve your lens resolution as far as possible, let's say to a 0.7µm blur circle (the wavelength of red light), your output blur is 5.05µm. Your maximum resolution is limited by the sensor...no matter how good your lens is, you can never resolve more detail than the sensor is capable of. This goes the other way as well. Let's say your lens blur is 3µm and your sensor has 2µm pixels. Your output blur is 3.6µm. If you reduce your sensor pixels to 800nm (0.8µm), your output blur is 3.1µm. You can never get better resolution than your worst performing component.
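Here is that arithmetic as a tiny sketch (Python; the quadrature-sum-of-blurs model is just the Gaussian approximation described above, and the numbers are the ones from the examples):

Code:
import math

def combined_blur(lens_blur_um: float, sensor_blur_um: float) -> float:
    """Approximate output blur as the quadrature sum of the lens blur circle
    and the sensor pixel pitch, assuming Gaussian-like blurs."""
    return math.hypot(lens_blur_um, sensor_blur_um)

# The examples from the text:
print(combined_blur(3.0, 5.0))   # ~5.83 µm -- sharp lens, coarse sensor
print(combined_blur(0.7, 5.0))   # ~5.05 µm -- near-perfect lens, same sensor
print(combined_blur(3.0, 2.0))   # ~3.61 µm -- average lens, fine sensor
print(combined_blur(3.0, 0.8))   # ~3.10 µm -- same lens, even finer sensor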

That's why I always say the whole notion of sensors or lenses "outresolving" the other is more myth than fact. In one sense, I understand why people think about it that way. In reality, the two work together to resolve your image...without both, you have no image, so there really isn't one outresolving the other. The real fact of the matter is that your output resolution is never as good as the potential of either your lens or your sensor, and it can never be higher than the least capable of the two. Further, lenses have non-linear performance...as you stop the aperture down past their sweet spot, diffraction reduces their resolving power. It's tough to say a lens outresolves a sensor in general...at what aperture does it "outresolve"? And by how much? Enough to matter? Or is the lens just outresolving by a tiny bit? When you stop down to f/8, is the sensor outresolving? These questions really don't matter...the thing that really matters is how the output image looks, and regardless of which component you improve, more resolution is pretty much always a good thing, sometimes a neutral thing, but never a bad thing.

The D800/E sensor is definitely higher resolution than the 5D III, for example...however Canon lenses outperform most Nikon lenses, so in most cases, the better Canon lenses paired with the lower resolution 5D III outperform, by a small margin, the D800/E. Even DXO's own lens data shows that. The D800 sensor will certainly make the absolute most out of Nikon lenses, but until Nikon improves their lens designs, the D800 does not actually perform better, in the real world, than the 5D III. Ironically, it is thanks to that very fact that moire with the D800E is not a bigger problem than it is...the lenses soften detail enough that moire tends to occur minimally. The day Nikon lenses perform as well as Canon lenses, however, keep your eyes and ears peeled: The wrath of the moire-hating D800E user will be heard around the world. ;P

1307
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 11:33:22 AM »
@Don: What do you mean by "address the sub pixels"? For DPAF, or QPAF, to work, the "subpixels" have to be underneath the CFA. If you are thinking you could get a higher resolution image by "addressing sub pixels", I don't think that would actually work.

I think this is the same mistake people make when they think DPAF can improve DR...it really can't. Magic Lantern improved DR by reading FULL pixels at two different ISO settings and blending the result. But if you read half pixels at one ISO, and half pixels at another ISO, you are getting quite a bit less light for both your high and low ISO "channels". Theoretically, you could improve the noise of the low ISO channel by applying the high ISO channel the way ML does, but since you are increasing noise in the first place by using half pixels, your net gain in the end is effectively nothing...you end up roughly back where you started (i.e. where you would be if you simply binned the two halves (or four quads)).
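To put rough numbers on the "less light per channel" point, here's a back-of-the-envelope sketch (Python; a simple shot-noise-plus-read-noise model with made-up electron counts, not measurements from any real sensor). It just shows the roughly half-stop SNR penalty each half-pixel "channel" starts out with, which is the headwind that eats most of the theoretical dual-ISO gain:

Code:
import math

def snr(signal_e: float, read_noise_e: float) -> float:
    """SNR of a photodiode with shot noise sqrt(signal) plus read noise, in electrons."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

full_signal = 4000.0      # electrons collected by a whole pixel (made-up number)
read_noise = 3.0          # electrons (made-up number)

snr_full = snr(full_signal, read_noise)
snr_half = snr(full_signal / 2, read_noise)   # each half-pixel sees half the light

print(f"full pixel SNR : {snr_full:.1f}")
print(f"half pixel SNR : {snr_half:.1f}")
print(f"penalty        : {20 * math.log10(snr_full / snr_half):.2f} dB (~half a stop)")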

1308
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 03:05:42 AM »
Quote
Yes, it is a fact.  The magnitude of that difference is not so obvious.  Detail that is present but blurred in a predictable manner can be brought out in post.  (Side note - optical microscopes can now resolve beyond the Abbé diffraction limit, and one way of achieving that uses post-processing analysis of moiré resulting from patterned illumination, i.e. since the pattern is predictable, detail not present in the image can be extrapolated mathematically.)

Also, depending on the lens much of that extra detail may not be there to begin with, which is why the D800 with a Nikon 24-70/2.8 barely outresolves a 5DIII with a Canon 24-70/2.8 II, despite the 60% higher MP count of the D800's sensor.

I thought DXO showed the D800E as having the highest IQ, or almost as high as the Phase One IQ280? I could be wrong as I didn't study it; it was something I read in a discussion of MF digital backs. If that is not the case, please do list the top 3-5 markers of IQ in the DXO testing.

Do all the calculations you want. I have been shooting for over 20 years now.
I have done my side-by-side, apples-to-apples test using a Leica R Macro 60 lens mounted on each camera on a studio stand with controlled lighting. If you shoot jewelry, you will know the difference without a second thought.

Yes Don, I use a P25/P45 and have used a Kodak Pro Back, 'Blad CF39, and Sinar evol75 on a Sinar 4x5. I have used Nikon before, but focus was horrible about 10+ years ago. I switched to Canon and have loved it for the product and service. I think they can do better in specialized features.

I use it in the studio on still subjects. For people, I often use it for slow-moving portrait work. Otherwise I use the 5D Mark II.

There is a significant difference between even the lowest-end back, the P25 (22MP), and a 5D Mark II, regardless of all the numbers these guys want to crunch. A good portion of it, I don't hesitate to say, is due to the AA filter.

First, you put too much weight on DXO's numbers. As far as their sensor tests go, they do not actually measure "sharpness" or anything like that. It's actually extremely difficult to objectively test a sensor in terms of sharpness, as you have to use a lens to do so, in which case you're not testing a sensor, you're testing a sensor and lens combined, which totally changes the outcome (and the reasons why you get that outcome). The other problem with lens+sensor tests is that they are bound by the least capable component...if the lens is the weak point, then no matter how good the sensor is, your output resolution is limited by what the lens is capable of...you can never resolve more than the lens resolves, period. Similarly, if the sensor has limited resolution and the lens is a powerhouse (like the Zeiss Otus 55mm f/1.4), then your output resolution is limited by the sensor...you can never resolve more than the sensor resolves, period. That makes determining how sharp a sensor is a very muddy issue, one that cannot be definitively pinned down. Hence the reason DXO measures things like SNR and dynamic range and color sensitivity in its sensor tests...that's all they CAN measure.

Regardless of what DXO has to say about the D800 or D800E sensors, the removal of an AA filter does not increase image quality. Actually, in all too many cases (quite possibly the majority of cases), removal of the AA filter is guaranteed to REDUCE image quality, thanks to increased aliasing in general, and moire specifically. This is clearly evident in all the numerous standardized image tests done with cameras over the years...while sharpness has increased in some newer cameras by a small amount, so too has moire. DPReview has plenty of examples where the removal of AA filters in Nikon cameras, or even just the weakening of the AA filter in many brands (including Canon), has greatly increased the amount of moire that occurs. (A great baseline for comparison on DPR is the 7D...it has an appropriately strong AA filter and doesn't suffer from moire at all. You can compare any newer camera with a sensor that is supposedly "better" than the 7D because of the removal or weakening of the AA filter...those images will be sharper, but they are usually riddled with moire.)

If the things you photograph have no regular/repeating patterns, and do not contain any elements with clearly defined edges, then increased aliasing due to having no AA filter is not an issue. There are not very many forms of photography where that actually turns out to be the case...landscape photography is probably one of the very few. Even insect macro photography, for example, will suffer from the removal of the AA filter...things like antennae, feelers, legs, wing veins, anything thin, straight, with high contrast against its surroundings will end up with clearly aliased edges, and not even a highly optimized AHD demosaicing algorithm will be able to hide that fact.

The only thing removal of the AA filter MIGHT do is increase the acutance between pixels, which ultimately has the potential to increase sharpness. This increase in sharpness is only possible if the lens is already resolving enough detail that the real image formed at the sensor plane is not being oversampled by the sensor. Someone using the Nikon 14-24mm zoom lens on a D800E to photograph landscapes would probably be in heaven without an AA filter. There is a whole host of Sigma lenses that would probably fit quite well on the D800E as well. I know I'd love to have such a kit for my landscape photography. For just about anything else, however, I'll take a camera with a properly designed OLPF. Sharpness is not the sole defining trait of image quality; it is only one of many (the others being things like SNR, dynamic range, color fidelity, and spatial resolution).

Furthermore, the kind of blurring caused by an optical low pass filter (AA filter) is regular, predictable, and well understood. That means it is very easily reversed (deconvolved) with mathematical algorithms in software, and since it is a small effect over a specific and narrow range of spatial frequencies, it can be nearly entirely reversed. All it really takes is a light application of your basic unsharp mask to do a darn good job, and smarter algorithms that come with photo editing tools like the Nik or Topaz suites can do an even better job. This is what Neuro was talking about.
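For anyone curious what that "light application of your basic unsharp mask" looks like in practice, here's a minimal sketch (Python with NumPy/SciPy; the radius and amount values are arbitrary guesses, and real tools like the Nik/Topaz suites use smarter deconvolution than this):

Code:
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, radius: float = 1.0, amount: float = 0.7) -> np.ndarray:
    """Basic unsharp mask: add back the difference between the image and a
    Gaussian-blurred copy. Counteracts small, regular blurs like an OLPF's."""
    blurred = gaussian_filter(img, sigma=radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Tiny demo on synthetic data: a sharp edge, softened by a weak stand-in for the
# AA filter, then partially restored. The sigma/radius/amount here are just guesses.
edge = np.zeros((32, 32)); edge[:, 16:] = 1.0
softened = gaussian_filter(edge, sigma=0.7)
restored = unsharp_mask(softened, radius=0.7, amount=1.0)

# Edge contrast across the two pixels straddling the boundary, before and after:
print(softened[16, 15:17], restored[16, 15:17])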


1309
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 12:55:55 AM »

OK, I shouldn't say large segment. But "ground-breaking" changes or improvements are what big tech companies need to keep the spirit of innovation alive, don't you think? People who echo these sometimes small, sometimes game-changing innovations are what can snowball the market direction.

I think we are on the edge of a shift in digital cameras.

We need to step back and ask "why mirrors". In the days of film, we needed the mirror and optical viewfinder to know what we were looking at, and we needed focusing screens to know if we were in focus.... Then came digital sensors and we treated them like film... because that is what we were used to.

A digital sensor is NOT film. It has different strengths and different weaknesses.... and the mirror is no longer the only way to see through the lens. A decent mirrorless camera (and there are several of them out there) will be designed to the strengths of digital technology. They already do many things better than DSLRs, but a great mirrorless camera will have to do everything better. Right now, the two big stumbling blocks are focusing and viewfinders.

Dual pixel technology may well be the end of the focusing dilemma... and as it matures we should be able to have far more capable focusing systems on mirrorless cameras than with DSLRs... the point I keep bringing up is that we should be able to recognize a bird and track it as it flies through the air, even though the operator is not steady. We already have P/S cameras that recognize individual faces and can even tag them for use on social media, and I have a waterproof P/S that has "cat mode" and "dog mode": put it in "cat mode" and it tracks the face of the cat and not the dog. So please don't tell me this is a far-fetched idea... It's not coming, it's already here!

The second stumbling block is viewfinders. Right now, optical viewfinders are better than EVFs... A few years ago EVFs were garbage... there are some really nice ones now.... who knows what the future will bring? At some point, people will stop trying to design an EVF to be like an optical viewfinder and design them to the strengths of digital... perhaps they will get a bit bigger... perhaps you will have a little window open up on it to check focus at 10X... or exposure preview.... or whatever... but until they stop pretending it is optical, they will be inferior. I am sure that what is currently in the labs is good enough for the real world.... we are that close.

I can see Canon coming out with a new camera that shakes things up. I would love to see a quad-pixel technology 7D mirrorless camera where you could address the sub-pixels individually for a 24-megapixel image with similar ISO and noise to the 70D, or bin them together for a 6-megapixel low-light camera that had better low-light performance than a 1DX....

Canon has a HUGE R+D department.... they are not all sitting on their rear ends playing solitaire... something is coming, and the delays to the successor to the 7D may just mean that the change is big.

Very well put. Especially that last bit...the change would really have to be big like that for it to be justifiable. Otherwise, if the 7D II comes out as a mediocre improvement over the 7D, and not much in terms of competition against counterpart offerings from other brands, it's just a demonstration of a major Canon blunder.

Regarding the two points about viewfinders and focusing: I'm not sure we're "nearing the end" of the issues. I think DPAF marks the beginning of finally moving down the right path, however I think there is a lot of innovation along that path that needs to take place before you start seeing action photographers seriously think about dumping their dedicated AF sensors and familiar AF points for a mirrorless, image-sensor-based AF system. DPAF should at least become QPAF, so we can detect phase in at least two directions. I think we may ultimately need to see one further innovation, dual-direction QPAF, where you have horizontal and vertical phase detection with one half of the sensor's pixels, as well as phase detected diagonally in two perpendicular directions with the other half of the sensor's pixels. Only then would you be technologically similar to how dedicated PDAF sensors are designed, and only then could you really start building advanced firmware to produce high-rate, high-accuracy AF without a dedicated AF sensor.

There is still another problem, however, that mirrorless AF systems will need to overcome before they can really achieve parity with their dedicated counterparts: low-light sensitivity. Modern dedicated AF systems are sensitive to light down to the -2 to -3 EV range. Not only that, each dedicated PDAF point receives only a tiny fraction of the total light entering the lens (thanks to passing through a half-silvered mirror and an AF unit splitting lens), and each line sensor that comprises an AF point receives at most half of that tiny fraction. All that, down to at least f/5.6, and in "pro" grade cameras, down to f/8. Dedicated PDAF sensors are ludicrously sensitive to the smallest amount of light, largely thanks to the fact that they can be fabricated independently of the image sensor, so they can be explicitly designed with huge photodiodes in each line sensor that have massive SNR. I'm not sure how camera manufacturers will overcome this issue, as even at very high ISO settings, image sensors are nowhere near as sensitive as the photodiodes in PDAF sensors. I'm sure one of the big manufacturers will figure out something brilliant to solve this problem...but I think it is definitely something that needs to be dealt with.

(BTW, I am aware that Canon's current DPAF supports live view focusing up to f/11, however the speed of that focusing is nowhere even remotely close to that of a dedicated PDAF unit. The slower speed is part of what gives DPAF its advantage in that area...similar to the way Canon forces a slower AF rate when one of their teleconverters is attached to a lens.)

As for EVFs, I can only hope they get significantly better. I'm very curious to see what Canon does with their Hybrid VF...I wonder how that will ultimately work, and whether it will be as flexible and user-configurable as it needs to be in order to be a success. I suspect it will be rather inflexible, and only activate the EVF under very specific circumstances (such as recording video).

1310
Animal Kingdom / Re: BIRD IN FLIGHT ONLY -- share your BIF photos here
« on: April 08, 2014, 09:47:03 PM »
We saw Ikarus, too:


WOW...that is FANTASTIC!

1311
Landscape / Re: Deep Sky Astrophotography
« on: April 08, 2014, 08:56:17 PM »
@TheJock: Check out the other thread. I've provided a lot of information on the kind of equipment you'll need to get started. We can continue the discussion there.

1312
Third Party Manufacturers / Re: How to Annoy a Photography Snob
« on: April 08, 2014, 12:24:35 AM »
I lost all respect for Ken Rockwell after reading this:

http://kenrockwell.com/ri/WhereDoBabiesComeFrom.htm

I know he, in his weird way, was trying to be "funny"...but so many things just go over the line on that page. When reading his photography pages, and when you see him in the few YouTube videos he is in, you get the feeling he is a crass, arrogant buffoon...but when you read his "Where do Babies Come From"...you realize he's everything you fear he is...then you throw up.

I don't even bother to click on links to kenrockwell.com anymore...all I ever see now is...where do babies come from... T_T T_T T_T T_T T_T

1313
Landscape / Deep Sky Astrophotography
« on: April 08, 2014, 12:17:35 AM »
The other thread ended up with a bit too much discussion on the topic of astrophotography and the related gear. Figured a new, clean one, dedicated just to the imagery, would be good.

Please, feel free to share your own images as well! (If you already shared some in the old thread, maybe re-share them here. Hopefully we can keep this topic free of astrophotography gear and technique discussion, and just keep it on the images.)

Here are some of my images, produced with some dedicated astrophotography equipment (German equatorial tracking mount, or GEM, guiding telescope and camera, etc.). All of these were created from mid-February 2014 through the end of March 2014.

Star Clusters
The Pleiades (Seven Sisters), in Taurus:

Original Attempt


Second Attempt (deeper exposures, softer detail due to tracking issues)

M35 and NGC2158, in Gemini


Nebulae
Horsehead and Flame Nebulae, in Orion:


Orion Nebula (M42 & M43) and Running Man, in Orion:


Rosette Nebula, in Monoceros (Unicorn):

Original Processing


Reprocessed in PixInsight

Galaxies
M101 (Pinwheel Galaxy), in Ursa Major:


M81, M82 and NGC3077, in Ursa Major:


M51, in Canes Venatici:


Leo Triplet (NGC3628, M65, M66) & NGC3593, in Leo:

1314
Landscape / Re: Deep Sky Astrophotography
« on: April 04, 2014, 07:17:23 PM »
@Don & @Reinz: I'm really glad to hear you guys are interested in astrophotography. :) I think this is a GREAT time to get into the field...the technology we have today makes the cost of entry relatively low (if all you want to do is very wide field work, all you really need is an $800-$1000 mount and your DSLR + lenses). If you find you really like it, high quality equipment can be purchased for only a few thousand dollars more, such as an astrograph OTA (like the AT8RC) and maybe an entry-level cooled astro CCD, the cheapest of which cost around $1500, about the same as a midrange DSLR.

The technology is pretty darn good, too. With an entry-level Atik CCD camera, people are producing high quality images that rival what NASA was getting a decade ago. Even highly advanced software packages for processing have become cheaper. It used to be that dedicated astro processing tools cost about $1000. Today, you can buy PixInsight, an extremely powerful processing system, for around $250.

Anyway, it's a great time to be getting into astrophotography. I wish you guys the best; it's very fun (especially if you're more technically minded and enjoy a challenge.)

I started with a really crappy 3" refractor telescope and got hooked!

I picked up a Celestron Advanced GT tracking mount... it's nice and solid and seems to track quite well. After lots of fiddling with aligning mounts, I ended up putting in some patio stones in the yard, made sure they were as level as possible, and marked where the tripod legs go... instant alignment! I have an 8" reflector telescope that I can use, or I have a mounting rail with a quick release camera mount. I have shot video of planets through the telescope, 2X barlow, and a 60D and run the images through Registax and I have just started to get interested in image stacking for nebulas...

The more I learn, the better the images get, and that just makes me want to try harder.

Ah! So you're already into it. Great to hear! Is that a 60D, or the 60Da (just curious)?

I haven't tried planetary yet. I'm using my 600mm lens as a scope, and it isn't even remotely long enough to do planetary. Right now is pretty much the time for planets, though. At night, we have Jupiter, Mars, and Saturn all near their closest approaches to Earth (I think Jupiter reached opposition in January, and Mars reaches its closest approach this month!)

I just picked up a new contract, and it should pay decently. I may pick up the AstroTech AT8RC or AT10RC, to get myself an actual OTA that I can use a barlow with, and get a planetary imager. I don't want to miss the opportunity we have right now with all three planets high in the sky during the night.

Backyard EOS seems like a perfect tool for capturing images of nebulas.... what software do you recommend for processing the images?

For processing, I recommend you start with Photoshop. You should pick up Carboni's Astronomy Tools actions, and maybe Annie's Astro Actions. These are practically essential, as they take otherwise complex, multi-step operations that help you stretch, denoise, deblotch, and enhance your images, and make them effectively "one click". Some actions might pop up standard Photoshop tools for input, but for the most part, these two action sets make up the core of the astrophotographer's toolbox. At least, for DSOs they do.

Now, you do planetary, and planetary generally needs some different processing. I haven't looked too deeply for planetary processing actions, but I'm sure there are some out there. I'd look around, see what you can find. Ready-made actions really make the processing go faster, and are well worth the $20, $30, $50 you have to spend on them.

For planetary images, it's been recommended that I get a Celestron Neximage 5 CMOS Solar System Imager Camera to use instead of my DSLR... and I have been thinking of getting a 6D to replace my 60D for night skies... the 60D is real noisy!

I do recommend getting a proper solar system imager. However, in my research, a lot of Celestron's camera equipment has turned out to be bottom rung. They make excellent OTAs, and their CGE Pro mount is quite good, however their guide cameras and NexImage imagers should probably be avoided.

If you want a good planetary imager, I would look at QHY (http://qhyccd.com/en/left/page3/qhy5-ii-series/). They make a MUCH better imager, using Aptina sensors (high Q.E., high dynamic range). You could also look into the Starlight Xpress Lodestar (http://www.sxccd.com/lodestar-x2-autoguider). The Lodestar X2 was just announced, however it uses the new Sony ICX829 sensor, which is one of the most sensitive sensors on the market. The Lodestar has always been one of the most recommended guiding cameras, although it also works for planetary (IIRC)...the catch is that it is VERY expensive. Another option is the SBIG ST-i, which is also a guider and planetary camera. I like SBIG, Santa Barbara Instrument Group, good old "Made in the USA". Plus, I used to live very near Santa Barbara when I lived in California...kind of my old stomping grounds. The ST-i is more often used as an off-axis guider with the SBIG astro CCD cameras, but it is also a very good planetary imaging camera. It's cheaper than the Lodestar, but I think a little more expensive than the QHY.

1315
Landscape / Re: Deep Sky Astrophotography
« on: April 04, 2014, 06:36:20 PM »
@Don & @Reinz: I'm really glad to hear you guys are interested in astrophotography. :) I think this is a GREAT time to get into the field...the technology we have today makes the cost of entry relatively low (if all you want to do is very wide field work, all you really need is an $800-$1000 mount and your DSLR + lenses). If you find you really like it, high quality equipment can be purchased for only a few thousand dollars more, such as an astrograph OTA (like the AT8RC) and maybe an entry-level cooled astro CCD, the cheapest of which cost around $1500, about the same as a midrange DSLR.

The technology is pretty darn good, too. With an entry-level Atik CCD camera, people are producing high quality images that rival what NASA was getting a decade ago. Even highly advanced software packages for processing have become cheaper. It used to be that dedicated astro processing tools cost about $1000. Today, you can buy PixInsight, an extremely powerful processing system, for around $250.

Anyway, it's a great time to be getting into astrophotography. I wish you guys the best; it's very fun (especially if you're more technically minded and enjoy a challenge.)

1316
EOS Bodies / Re: Canon's Medium Format
« on: April 04, 2014, 05:20:22 PM »
Sorry, everyone! Did not mean to kill this thread.

Regularly scheduled medium format discussion should continue from here.

1317
Landscape / Re: Deep Sky Astrophotography
« on: April 04, 2014, 05:16:11 PM »
WOW, that's a lot of information ... I've copy-pasted it onto my smartphone and will have to go over it at least a few times to get my head around it ... I think it looks like I'll have to raise my budget to around $2000 ... thanks for the awesome info.

Yeah, I would say $2000 will get you much farther. I spent just a little over $2000 on the Orion Atlas, a guiding setup, and a few accessories from ADM to help me mount it all.

I HIGHLY recommend you look into BackyardEOS. You can do astrophotography with nothing but a cable release and bulb mode, but BYEOS makes things much, much easier. Especially focusing: it uses a tethered live view mode to show you, on your laptop screen (or a Windows 8 tablet, if you have that), what the camera sees through the lens. You can center on bright stars and actually control the lens's focus (even if it's switched to manual) with BYEOS itself, and it offers some VERY fine control. You'll never be able to focus properly with just the viewfinder or just live view on the back of the camera, and getting focus right is pretty critical.

BYEOS also lets you set up sequences, choose aperture, ISO, and exposure duration, mirror lockup, etc. You can set up multiple sequences in a single "program" to take multiple exposures of different durations as well, so you can create HDR images for scenes that might require it (say Orion Nebula or Andromeda Galaxy). It also lets you program sequences to take dark, bias, and flat frames (which you should research, as they are pretty essential). Taking darks, biases, and flats can be a real pain when you do it with a cable release, because you have to keep reconfiguring the exposure for different things. With BYEOS, you can just program HUGE sequences of darks, for example, spanning exposure times from 30 seconds to 600 seconds, and just let it run (which literally takes hours.)
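In case the dark/bias/flat terminology is new, this is roughly the arithmetic those calibration frames feed into once you stack them. A simplified Python/NumPy sketch of what stacking software does, assuming the darks match the lights' exposure and temperature; the numbers below are fake and purely for illustration:

Code:
import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    """Simplified frame calibration, assuming the master dark was shot at the
    same exposure/temperature as the light (so it already contains the bias)."""
    flat = master_flat - master_bias          # remove bias from the flat
    flat = flat / flat.mean()                 # normalize flat to ~1.0
    return (light - master_dark) / flat       # subtract dark current, fix vignetting

# Masters are just averages of many individual frames, e.g.:
# master_dark = np.mean(np.stack(dark_frames), axis=0)
rng = np.random.default_rng(0)
light = rng.normal(1000, 30, (100, 100))      # fake data for illustration only
dark = np.full((100, 100), 50.0)
flat = np.full((100, 100), 20000.0)
bias = np.full((100, 100), 10.0)
print(calibrate(light, dark, flat, bias).mean())   # ~950 after calibration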

Anyway, with $2000 you'll definitely have enough to get started.

1318
It's becoming harder to convince my friends that buying only Canon gear is the right thing to do.

Sigma have put out some amazing lenses in the last two years and Canon's only memorable release is the Canon EF 24-70mm F/2.8 L II.

I guess "memorable" is subjective, but over the past couple of years Canon has released, in addition to the lens you mentioned, the 24-70 f4, 24IS, 28IS, 35IS, 40mm pancake, and 200-400L, all of which are first rate regardless of whether one may actually want any of them.

Agreed, Canon has made several memorable lens releases. The 24-70 and 200-400 are probably the most memorable, but most people who have one seem to rave about the forty shorty, too!

1319
Landscape / Re: Deep Sky Astrophotography
« on: April 04, 2014, 02:28:47 AM »
Hi Reinz. When it comes to astrophotography, the mount is pretty much the most important thing. Most astrophotographers who have even moderately diverse goals (e.g. just galaxies and nebulae) are going to need to use multiple telescopes with different focal lengths, or at least one telescope with Barlows and focal reducers, to get a field of view wide enough or narrow enough to frame their subjects properly. A good mount can last you for many, many years, whereas telescopes (or, for that matter, camera lenses) usually come and go until you hit the real high end (i.e. 20" RCOS or PlaneWave telescopes).

For $1000, you can get yourself an entry-level mount. Something like the Orion Sirius, which is the little sibling of the Orion Atlas. The Sirius has a capacity of 30lb, which for visual use is generally fine, but that pretty much equates to 15lb for astrophotography (the Sirius doesn't have the most sturdy tripod, so you REALLY have to stick to the 50% capacity limit for imaging work). That is practically nothing in terms of capacity, but if you just stick to your DSLR and lenses, it'll at least get you started.

The Orion Atlas is a much more capable mount; its capacity is 40lb, and imagers have been loading it to 60-70% of capacity and getting excellent results. Visual observers have put over 50lb on this mount when using sturdier tripods or full-blown piers. The Orion Atlas is $1499, however it's fairly frequently on sale for $1399, and at times has been as low as $1200. Given how important the mount is, especially if you think you might want to move up from your lenses to a real telescope at some point in the future (and entry cost for telescopes can actually be pretty low...for example, the Astro-Tech AT6RC, a 6" Ritchey-Chretien telescope designed specifically as an astrograph, is only $399), I highly recommend getting the Orion Atlas if you can muster it, even though it's more than your $1000 budget. It will give you LOTS of room to grow in the future if you find that you like astrophotography (it could even be "the" mount you use for the next ten or twenty years...many people used the predecessors to the Atlas/EQ6 class mounts for about that long.)

From your existing equipment, the 5D III, hands down. Don't use a Nikon for astrophotography...their nickname in our community is "Star Eaters", since they clip to the black point rather than using a bias offset (one of the many ways Nikon "cheats" their way towards cleaner shadows :P). Canon's use of a bias offset is part of the reason there is a lot of banding in their shadows, which isn't good for regular photography. However, since in astrophotography we use bias frames to remove the offset from the signal, Canon DSLRs are actually a lot better for this...they preserve more faint stars and deep nebula detail. So definitely use the 5D III.
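If the bias offset point is unclear, here's a toy illustration (Python with NumPy; the signal and noise numbers are made up, and this ignores the other factors that feed the "Star Eater" reputation). It shows why keeping an offset and subtracting bias frames later preserves faint signal, while clipping to the black point compresses it:

Code:
import numpy as np

rng = np.random.default_rng(1)

# Toy model with made-up numbers: a faint star adds 2 ADU above the sky, read
# noise is 5 ADU, and we average many exposures of a star pixel and a sky pixel.
n, read_noise = 20000, 5.0
star_pixels = 2.0 + rng.normal(0.0, read_noise, n)
sky_pixels = 0.0 + rng.normal(0.0, read_noise, n)

# With a bias offset, negative noise excursions survive in the raw file and are
# removed later by subtracting bias frames, so the averaged difference is honest.
offset = 128.0
star_vs_sky_offset = (star_pixels + offset).mean() - (sky_pixels + offset).mean()

# With clipping to the black point, negative excursions are destroyed, the sky
# level is pushed up, and the faint star's brightness above the sky is compressed.
star_vs_sky_clipped = np.clip(star_pixels, 0, None).mean() - np.clip(sky_pixels, 0, None).mean()

print("true star brightness above sky : 2.00")
print(f"bias offset + bias frames      : {star_vs_sky_offset:.2f}")   # ~2.0
print(f"black-point clipping           : {star_vs_sky_clipped:.2f}")  # ~1.2, star partly 'eaten'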

You have a good range of lenses for "wide field" work as well. The 40/2.8 @ f/4 and 50/1.4 @ f/3.5 are both excellent for "whole constellation" images (for example, you could image the entirety of the core Orion constellation, as well as most of his club and kill: http://bit.ly/1lF7hSp). The 100mm Macro @ f/4 is a great lens for imaging entire small constellations, or for imaging parts of larger constellations (for example, it would neatly encompass the core of Orion, but not his club or kill: http://bit.ly/1jIciah). The 70-200 at 200mm @ f/4 is great for narrower regions and small constellations (for example, 200mm would encompass Orion's Belt and Sword, and the small reflection nebula M78: http://bit.ly/1mOwpGH). The 100-400 at 400mm @ f/8, while a bit slower and probably requiring more equipment (such as a guider, which itself would probably require a number of additional accessories to mount properly next to your camera), is good for imaging nebulae themselves (for example, it would encompass just Orion's Sword, which includes the Orion Nebula (M42/M43) and Running Man Nebula: http://bit.ly/1ltmAeo; or just Orion's Belt, which includes the Horsehead and Flame Nebulae, IC434, and a number of small reflection nebulae: http://bit.ly/1dSzPFJ).
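If you want to work out for yourself which lens frames which target, the field of view is just geometry. A quick sketch (Python; 36x24mm is the full-frame sensor size of the 5D III, and the focal lengths are the ones mentioned above):

Code:
import math

def field_of_view_deg(focal_mm: float, sensor_mm: float) -> float:
    """Angle of view along one sensor dimension for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# Full-frame sensor (36 x 24 mm); focal lengths from the lenses mentioned above.
for f in (40, 50, 100, 200, 400, 600):
    wide = field_of_view_deg(f, 36.0)
    tall = field_of_view_deg(f, 24.0)
    print(f"{f:4d}mm: {wide:5.1f} x {tall:4.1f} degrees")

# For scale (rough figures): the Orion constellation is about 20 degrees tall,
# while the Orion Nebula (M42) spans roughly one degree.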

If you go with just the mount, you will be able to attach your DSLR and a lens. The 100-400mm is probably not quite going to work, as you would need pretty steady tracking to image at f/8...that's pretty slow. We're talking 1" (" means arcsecond, ' means arcminute, 60 arcminutes per degree) tracking, which is not easy to achieve. So you're probably going to be stuck at 200mm and less until you decide to upgrade. Thing is, that is really the best place to start anyway, as at those focal lengths tracking error is very forgiving, so you should be able to track for several minutes, maybe as much as five minutes, without appreciable star elongation or trailing, allowing deep exposures of wide regions of the sky (which, during the two times of year when the Milky Way is up, are PACKED with IMMENSE swaths of nebulae).
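And here's a rough way to see why the shorter focal lengths are so forgiving of tracking error (Python; 6.25µm is roughly the 5D III's pixel pitch, and the 15" of drift over a several-minute exposure is just an illustrative number, not a spec for any particular mount):

Code:
# How many pixels a given amount of tracking error smears across, at different
# focal lengths. Image scale in arcsec/pixel = 206265 * pixel_pitch / focal_length.
PIXEL_PITCH_UM = 6.25          # roughly the 5D III's pixel pitch
ARCSEC_PER_RADIAN = 206265.0

def image_scale_arcsec_per_px(focal_mm: float, pitch_um: float = PIXEL_PITCH_UM) -> float:
    return ARCSEC_PER_RADIAN * pitch_um / (focal_mm * 1000.0)

drift_arcsec = 15.0            # illustrative drift over a several-minute exposure
for f in (50, 100, 200, 400, 600):
    scale = image_scale_arcsec_per_px(f)
    print(f"{f:4d}mm: {scale:5.1f}\"/px -> {drift_arcsec / scale:4.1f} px of trailing")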

Unguided imaging is basically the domain of the wide and ultra-wide field. If you want to see the kinds of images you can get at those scales, you should check out AstroBin. Plenty of good examples there (better than anything I've done as of yet.)

If you get an Orion Sirius mount, which is $1000, then that will suffice for a DSLR at 200mm and less. You'll need to get a better mount than that if you want to do more. There are a lot of small APO refractors on the market, ranging in price from around $500 to as high as $10,000 or more, however most of the smaller, lighter ones that would work on a Sirius fall into the same general focal range that you already have with your Canon lenses (200mm to ~800mm). The logical upgrade for you would be to eventually move to a Cassegrain-type OTA (Optical Telescope Assembly). Cassegrains include your standard SCT (Schmidt-Cassegrain Telescope), the Celestron EdgeHD (an aplanatic SCT, designed specifically to support a wide and flat field, right into the corners, for imaging), and the Ritchey-Chretien Cassegrains (primarily those from Astro-Tech). Meade also makes some aplanatic SCTs like Celestron's, however they tend to be more expensive, despite not really offering anything more, and there is one special benefit to the Celestron EdgeHD OTAs: they support HyperStar, a special conversion mod that allows you to do ultra wide field imaging (~200-400mm) at f/2 (REALLY FAST...you could get really deeply exposed images in a couple of minutes at that aperture.)

Generally speaking, the best upgrade from DSLR+camera lens imaging is to move to something like the Celestron EdgeHD 8" SCT, or the Astro-Tech AT8RC 8" Ritchey-Chretien. Both are reasonably priced, although Astro-Tech's prices are really hard to beat for the quality, optical design, and overall capabilities for imaging. For either of these, you would really want at least the Orion Atlas (or the equivalent from Celestron, the CGEM or CGEM DX, however the Atlas is really the better option due to the rich community, EQMOD, and the option of installing belt mods to improve tracking and guiding accuracy down the road.)

My recommendation is to pick up the Orion Atlas EQ-G, and use your 5D III and 50mm, 100mm, and 70-200mm lenses. You should be able to just bolt your camera to the included Vixen dovetail that comes with the mount, and not bother with purchasing any additional accessories initially. You will need to learn how to polar align the mount (the Atlas comes with a built-in polar finder scope, which, once properly centered (the most annoying thing you will ever do, but thankfully you only have to do it once! :P), is highly accurate and easy to use), and you will need to either learn how to use the hand controller to "Align GOTOs", or purchase a $40 EQDIR cable, use EQMOD, and completely computerize your process (HIGHLY recommended; you can buy BackyardEOS ($50) to greatly simplify your imaging sequences and gain a lot of powerful features, such as highly precise live view focusing on your laptop or a Windows 8 tablet, to get the best results.)

1320
Landscape / Re: Deep Sky Astrophotography
« on: April 04, 2014, 12:26:44 AM »
Interesting, and I do like the color detail of the nebula, but overall it just looks softer than I'd like. 

It's a nebula...they generally tend to be "soft", what with being a bunch of wispy gas and all. ;P As for the stars, I purposely "decrispified" them and made them rounder/softer because otherwise they completely dominated the image, making it difficult to actually see the nebula. Part of the reason my stars end up too bright and crisp is that the centroids get just a touch clipped during my exposures (necessary to expose the nebula properly), and during processing the centroids get enlarged. So the star reduction routine is really just restoring the proper look to the stars anyway.

simply fantastic!  it feels like i'm looking at it through an ultra powerful telescope.

Thanks! Rosette is actually a fairly large nebula. It's larger than the Orion Nebula (which you can sort of see with your naked eye), and too large even to fully fit in my 600mm FoV. The entire region is probably a bit bigger than your thumb if you held it out about a foot and a half from your face over the sky...just to give you an idea of how large this region of space actually is. ;) Sadly, Rosette is so dim that unless you had a really gargantuan telescope with a multi-foot aperture, you probably could never observe it visually.

Actually the stars in this image appear soft...as does really the entire image.  Just calling it like I see it.

Indeed, soft exactly as they are supposed to be. ;) No one wants HARD stars in their astrophotos...they are overbearing and dominating, and distract from the more interesting aspects of the image.
