October 24, 2014, 11:55:30 PM



Messages - jrista

1321
Lenses / Re: Focal lengths
« on: April 13, 2014, 01:46:33 PM »
Excluding telescopes, I've used the EF 600/4 L II with both 2x and 1.4x TCs (a Kenko in this case, as they stack directly without the need for an extension tube...IQ suffers a little bit). That gets you to 1680mm.

With telescopes, I've poked around with focal lengths up to around 8000mm to 10,000mm using SCT and RC type OTAs with barlow lenses. The only real reason you would use such focal lengths is for planetary (to get any real kind of sharp detail on planets, you need at least 8000mm), solar (sunspot closeups) and lunar (individual craters and finer surface detail).

I haven't purchased my own OTA yet; once I do, I really can't wait to do planetary imaging at over 8000mm. :P

1322
Animal Kingdom / Re: Show your Bird Portraits
« on: April 13, 2014, 12:11:20 AM »
Heard a bit of a "cooing commotion" outside earlier today. Slid my curtain aside just a little, and saw a couple of Mourning Doves mating. They had this cute little ritual: preen, cuddle, kiss, mate. They "did it" about a dozen times. :P

I have a blog post forthcoming, but here are a couple of shots that I've quickly processed. I had no time to really set up, so I just grabbed my lens (the only thing readily available was my 600mm, and they were RIGHT there on the railing of my deck, so apologies for the tight crop...these are mostly 100% full frames), stuck it to the sill of the sliding glass door to my deck, and started shooting...no real artistic flair here, just a small little documentary of a cute dove couple in cute little dove love. :D

Enjoy!  :-*

1323
Landscape / Re: Astrophotography - which camera?
« on: April 12, 2014, 10:59:29 PM »
Wonderfully explained!

Conclusion: a 200mm lens is wwwwwwwway too short for the ISS. It's hopeless, meaning that even if you capture it perfectly, you will have a hard time spotting it in the final image. And for streak images, 70mm is wwwwwway too long.

So, the question is not "which camera" but rather "which lens". The answer is: No lens at all, you will need a telescope for that. So if you plan to get into astrophotography, there just is no other option than getting a telescope. Fortunately, compared to EF glass, telescopes are rather cheap. They are basically mirrors, after all.

It isn't quite that simple. Most telescopes are designed for visual observing. The ones in the range of a few hundred dollars to about a thousand are not really ideal for use as "astrographs". As it actually stands, a lot of people, including myself, prize Canon's telephoto EF lenses like the EF 600mm f/4 L II as telescopes, due to their superb optical quality as refracting telescopes. A good refracting telescope, such as an APO triplet, STARTS at around $1500, and the price can reach as high as $15,000 for a very good APO quadruplet or quintuplet.

Regarding reflector-type scopes: there are some Newtonian astrographs, and they tend to be the cheapest, but you are still going to spend around $1000 for one. SCTs, or Schmidt-Cassegrain telescopes, usually start around $1500 and can reach many thousands of dollars. You usually get a lot more aperture (physical aperture) with an SCT than with a refractor, and they are usually optically superior to Newtonians. RCs, or Ritchey-Chrétiens, are also "cassegrain"-type reflecting telescopes, and are generally the preferred OTA design for astrographs. It is actually pretty easy to get a good-quality RC astrograph for a good price.

Specifically, the best entry-level one is the Astro-Tech 6" AT6RC, which has a 6"-diameter aperture (the same diameter as the EF 600mm f/4 L II lens) at f/9 (1372mm focal length, to be exact). If you want to get an actual telescope for doing astrophotography, and don't want to spend a lot of money, the AT6RC is only four hundred bucks:

https://www.astronomics.com/astro-tech-6-inch-ritchey-chretien-astrograph_p19910.aspx

The AT6RC will make the ISS about 100 pixels in size on the 7D, without any additional accessories. If you slap on a 2x barlow lens, then you have 2744mm of focal length, and the ISS will be about 200 pixels in size. Keep in mind, if you push your focal length that much, you are going to need a VERY, VERY, VERY STABLE mount, because the smallest amount of shake will completely obliterate any detail you might otherwise resolve. You will, most likely, also want an equatorial tracking mount with the ability to use custom tracking rates other than sidereal or King, and the only mounts I know of that can do that easily (for ~cheap) are the Orion Sirius (~$1000) and Orion Atlas (~$1400), because these mounts support EQMOD (which allows you to download and utilize custom tracking rates, and there are premade profiles for known satellites and the ISS that you can simply download). You would need a tracking mount for the AT6RC straight up, let alone with a barlow.
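Those pixel figures can be sanity-checked with standard plate-scale arithmetic (arcsec/px = 206.265 × pixel pitch in µm ÷ focal length in mm). A minimal Python sketch, assuming a 4.3µm pixel pitch for the 7D and a ~63-arcsecond apparent ISS length:

```python
# Plate scale and ISS size in pixels for the AT6RC (rough sketch).
# Assumptions: 7D pixel pitch = 4.3 µm, ISS apparent length = 63 arcsec.

def arcsec_per_pixel(focal_length_mm, pixel_pitch_um):
    """Plate scale; 206.265 converts the µm/mm ratio into arcseconds."""
    return (206.265 / focal_length_mm) * pixel_pitch_um

ISS_ARCSEC = 63.0
PITCH_7D = 4.3  # µm

for label, fl in [("AT6RC (1372mm)", 1372), ("AT6RC + 2x barlow (2744mm)", 2744)]:
    asp = arcsec_per_pixel(fl, PITCH_7D)
    print(f"{label}: {asp:.2f} arcsec/px, ISS ~ {ISS_ARCSEC / asp:.0f} px")
```

This works out to roughly 97px native and 195px with the barlow, in line with the ~100px/~200px estimates above.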

If you want to go larger/longer, you could look at the AT8RC, which sells for about $900. You would definitely want the Orion Atlas, so the total cost of the mount + OTA at that point is around $2400; however, the larger aperture will allow you to resolve MUCH finer details on the ISS than a smaller OTA (resolving power, and the smallest resolvable magnitude of astronomical objects, including satellites and the ISS, is directly related to the physical aperture diameter).
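The aperture/resolving-power relationship mentioned here is commonly quantified with the Rayleigh criterion, θ ≈ 1.22·λ/D. A rough sketch, assuming 550nm (green) light and treating the 6" and 8" OTAs as 152mm and 203mm apertures:

```python
import math

def rayleigh_limit_arcsec(aperture_mm, wavelength_nm=550):
    """Diffraction-limited angular resolution, theta = 1.22 * lambda / D,
    converted from radians to arcseconds."""
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3)
    return math.degrees(theta_rad) * 3600

print(f'6" (152mm): {rayleigh_limit_arcsec(152):.2f} arcsec')  # ~0.91"
print(f'8" (203mm): {rayleigh_limit_arcsec(203):.2f} arcsec')  # ~0.68"
```

So the 8" can, in principle, resolve roughly 25% finer detail; atmospheric seeing often dominates, but for short-exposure ISS and planetary work the extra aperture still pays off.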

1324
Landscape / Re: Astrophotography - which camera?
« on: April 11, 2014, 10:15:23 PM »
It's often easier to do these things in arcseconds. 

...

Yes, definitely arcseconds. Couldn't agree more.

Wizardry alert. :)

I obviously haven't made the jump into Astrophotography yet  ???

Angles are easier because the sky, for all intents and purposes, is the inner curved surface of a perfect sphere, measured in degrees in the celestial coordinate system (Right Ascension and Declination); therefore, arcminutes and arcseconds naturally apply. ;)

1325
Landscape / Re: Astrophotography - which camera?
« on: April 11, 2014, 08:14:50 PM »
It's often easier to do these things in arcseconds. The ISS is ~63" (arcseconds) long. We can compute the number of arcseconds/pixel ("/px or asp) using this formula:

Code:
arcsec_per_pixel = (206.265 / focal_length_mm) * pixel_pitch_µm
For the 7D and a 200mm lens, our asp is (206.265/200)*4.3 = 4.43asp. For the 5D II and a 200mm lens it's (206.265/200)*6.4 = 6.6asp. If we take the 63" length of the ISS and divide it by our sensor's arcseconds/pixel ratio, we get ~14px for the 7D and ~9.5px for the 5D II.

Now you can slap on teleconverters to get a longer focal length. The 7D will suffer from the effects of diffraction and less light sooner, meaning the 5D II will then be more capable of using a longer focal length with a teleconverter, or for that matter stacking teleconverters (you can stack in a number of ways...2x III + 1.4x Kenko, 2x III + 25mm ext tube + 2x TC III, etc.). Let's say you use a 2x TC on the 7D and two 2x TCs on the 5D II. The 7D and 5D II are going to produce roughly the same noise, and diffraction softening will be roughly equal (slightly more on the 5D II):

7D: (206.265/400)*4.3 = ~2.2asp
5D II: (206.265/800)*6.4 = ~1.65asp

We have a roughly equivalent IQ case here (similar amount of noise), but a much longer focal length on the 5D II. The ISS is ~28px large on the 7D, but ~38px large on the 5D II. Since we are talking about highly collimated light, all you really need to do is manually focus on the stars or the moon...so you could, theoretically, stack as many teleconverters as you think your pixels will handle. The larger pixels of the 5D II will handle more than the 7D before you start achieving similar results on both (diffraction blurring will eventually reach a point where the ISS is blurred the same amount on both if you just keep stacking TCs, and the 7D will simply be oversampling that blurry image more than the 5D II, albeit with more noise.)
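The arithmetic in this post can be scripted directly. A minimal sketch, assuming the 4.3µm (7D) and 6.4µm (5D II) pixel pitches and a 63-arcsecond ISS (note the 800mm case works out to ~1.65 arcsec/px and ~38px):

```python
# Arcseconds-per-pixel and ISS pixel size for the stacked-TC comparison.
# Assumptions: 7D pitch = 4.3 µm, 5D II pitch = 6.4 µm, ISS = 63 arcsec.

def arcsec_per_pixel(focal_length_mm, pixel_pitch_um):
    return (206.265 / focal_length_mm) * pixel_pitch_um

ISS_ARCSEC = 63.0
cases = [("7D @ 400mm (2x TC)", 400, 4.3),
         ("5D II @ 800mm (two 2x TCs)", 800, 6.4)]

for label, fl, pitch in cases:
    asp = arcsec_per_pixel(fl, pitch)
    print(f"{label}: {asp:.2f} arcsec/px, ISS ~ {ISS_ARCSEC / asp:.0f} px")
```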

1326
EOS Bodies / Re: EOS 7D Replacement Mentioned Again [CR1]
« on: April 11, 2014, 07:55:59 PM »
In less than 5 months, it'll be 5 years since the 7D was announced and nothing so far indicates that a 7D II will ever be released ... I've lost my interest in it and if it does get released, I most likely will not buy one.

And when it turns out to be the best camera in Canon's lineup from a <insertyourpreferredfeaturehere> (1D X excluded from comparison :P), you'll most definitely be interested again! :D

While it is pretty odd that the 7D replacement has taken so long to materialize, I don't think it will arrive and be a dud, either. The 7D was (is?) one of Canon's most popular cameras, and they have to know that there are very high expectations for its replacement. While it's certainly a personal choice whether to buy one or not, not buying one just because it hit the streets later than you wanted is hardly a good reason. ;P And I think it's interesting for the very fact that it's taking so long...it just....HAS to be GOOD! Right?!

1327
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 11, 2014, 12:04:43 PM »
Speaking of battery life in DSLRs: when I attach my battery grip to my 7D and use fully charged batteries, I can get well over 2500 shots (which is usually about the limit for a shoot these days, not because the batteries run out, but because by then the sun has set, or the animals/birds have moved on, or something). I rarely run my batteries below the 50% mark when using the grip, so it's possible I could get quite a bit more than 2500 out of one pair of fully charged batteries.

1328
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 08:05:37 PM »
Jrista,
Your points about removing AA filters are spot on; moire and aliasing will always be present, and higher-performance lenses will definitely show the problem faster. I would not buy any camera without one. However, AA filters themselves are very difficult to produce, especially in large volume, and to keep consistent. The AA filter is part of the optical system and is taken into consideration when designing lenses, and well-designed, well-manufactured AA filters now have a minimal impact on resolution. Lateral chromatic aberrations are still visible in most Canon lenses, including L lenses, and this will have an effect on apparent sharpness, particularly further out toward the edges of the frame. Canon will need to address this going forward, especially as Zeiss rolls out more of their Otus range. I see this as more of an issue than the effects of the AA filter.

Canon's shorter focal lengths, under ~200mm (including the 70-200), do need improvement in the corners. The worst of Canon's lineup are their wider-angle lenses. The 24-70 II improved things; however, its corner performance (as even indicated by Canon's MTF charts) is still quite poor. I don't know why Canon has such a hard time with wide-angle corners, but it's their Achilles heel, for sure. I think that is one of the main reasons Sigma has been making such major strides...they found the weak spot in the biggest photography manufacturer in the world, and have been exploiting it as much as they possibly can. :P

These days, I'm less concerned about a manufacturer's ability to produce AA filters of consistent quality, and more concerned about their ability to produce them strong enough. Sadly, I think the (uneducated) demands of the consumer for no AA filters are winning out in this arena, despite how non-beneficial that is for IQ. People want "sharp out of camera", and don't seem to understand the consequences of the tradeoff that is REQUIRED to make that happen. An appropriately strong AA filter that minimizes moire to the point where only the strongest interference patterns show up is what we really need. I'd rather have slightly soft out of camera without moire, as I can easily sharpen in post, than razor sharp out of camera with a bunch of aliasing and moire.

The AHD, or Adaptive Homogeneity-Directed, demosaicing algorithm used in Adobe ACR/LR is highly optimized. It is capable of interpolating in such a way as to utilize the raw luminance information in EVERY pixel, and only really suffers resolution loss when interpolating the color channels. That means we're getting the vast majority of the resolution our sensors are capable of with modern RAW editors like Lightroom, and only really suffering some loss in resolution and color fidelity in the color channels. That doesn't much matter, though, as we aren't as sensitive to softness in color as we are to softness in luminance detail.

Not to mention the fact that most cameras offer far more resolution than a growing majority of photographers need, what with publication on the web at relatively small sizes (compared to those required for print) being the primary means of sharing photography...from your average instagram and facebook uploads to your avid amateurs to your professionals.

1329
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 05:17:11 PM »
Quote
First, you put too much weight on DXO's numbers. As far as their sensor tests go, they do not actually measure "sharpness" or anything like that. It's actually extremely difficult to objectively test a sensor in terms of sharpness, as you have to use a lens to do so, in which case you're not testing a sensor, you're testing a sensor and lens combined, which totally changes the outcome (and the reasons why you get that outcome). The other problem with lens+sensor tests is they are bound by the least capable component...if the lens is the weak point, then no matter how good the sensor is, your output resolution is limited by what the lens is capable of...you can never resolve more than the lens resolves, period. Similarly, if the sensor has limited resolution and the lens is a powerhouse (like the Zeiss Otus 55mm f/1.4), then your output resolution is limited by the sensor...you can never resolve more than the sensor resolves, period. That makes determining how sharp a sensor is a very muddy issue, one that cannot be definitively pinned down. Hence the reason DXO measures things like SNR, dynamic range, and color sensitivity in its sensor tests...that's all they CAN measure.

This is not news. Let's leave DxO out; I am sorry I even mentioned them for this discussion. I mixed some info and used them as a point of reference. It's something that can happen with the amount of info I go through. Please accept my blunder as a simple error, as DXO is often the reference point for sensor quality, and I understand it is not about the sharpness of the sensor, but more about DR, ISO, and the like.

It's not a problem to use sources like DXO as a point of reference. It just helps to have all your facts straight before doing so, so you don't mislead, confuse, or otherwise sidetrack readers with incorrect or inconsequential information. ;P

Quote
Regardless of what DXO has to say about the D800 or D800E sensors, the removal of an AA filter does not increase image quality. Actually, in all too many cases (quite possibly the majority of cases), removal of the AA filter is guaranteed to REDUCE image quality, thanks to increased aliasing in general, moire specifically. This is clearly evident by all the numerous standardized image tests done with cameras over the years...while sharpness has increased in some newer cameras by a small amount, so too has moire. DPReview has plenty of examples where the removal of AA filters in Nikon cameras, or even just the weakening of the AA filter in many brands (including Canon) has greatly increased the amount of moire that occurs. (A great baseline for comparison on DPR is the 7D...it has an appropriately strong AA filter and doesn't suffer from moire at all. You can compare any newer camera with a sensor that is supposedly "better" than the 7D because of the removal or weakening of the AA filter...those images will be sharper, but they are usually riddled with moire.)

Quote
while sharpness has increased in some newer cameras by a small amount, so too has moire

Moire is subjective. I'm not too interested in the DPReview samples showing loads of moire issues; I have plenty of personal samples I can stand by to tell you otherwise. Many samples in those cases are set up specifically to show moire.

Actually, moire is a concrete, immutable artifact of repeating patterns near Nyquist interfering with the sensor grid. It not only leaves behind funky color and monochrome patterns...they are nigh impossible to correct in post...there IS no full moire removal in any RAW editor, for a very good reason: it's impossible. You can reduce color moire; however, depending on the tool, you might end up with color desaturation, blurring, etc. as a result. Even after removing color moire, the underlying monochrome moire pattern remains, and it cannot be removed (at least, not without significant blurring.)

We aren't talking about a subjective factor of IQ here; we are talking about a detrimental, and permanent, factor of IQ that gets introduced when the AA filter is removed. The DPReview sample images are not intentionally trying to show moire...they are simply sample shots of their standardized test scene. Moire occurs in their samples as a CONSEQUENCE of weak or missing AA filters. You can't simply brush moire and aliasing to the side and call them a non-issue...they are a critical issue for a great many photographers.

Quote
removal of the AA filter is guaranteed to REDUCE image quality


Be more specific. As with this statement, in this discussion you are saying that fullframe or larger sensors that are not using AA have lower image quality. How do you figure?

I explained it in the text you failed to quote. :P

Quote
If the things you photograph have no regular/repeating patterns, and do not contain any elements with clearly defined edges, then increased aliasing due to having no AA filter is not an issue. There are not very many forms of photography where that actually turns out to be the case...landscape photography is probably one of the very few. Even insect macro photography, for example, will suffer from the removal of the AA filter...things like antennae, feelers, legs, wing veins, anything thin, straight, and with high contrast to its surroundings will end up with clearly aliased edges, and not even a highly optimized AHD demosaicing algorithm will be able to hide that fact.

The underlined falls under EXACTLY what I shoot on a regular basis, and, with all the respect I have for your knowledge (I have read many of your posts), I think you are simply flat wrong about this. I have worked with about 20 digital camera systems in the past 24 years. I certainly don't have your understanding of sensors and electrical engineering, or anything in the realm of it. But I have shot just about everything there is to shoot, and I specialize in macro work dealing with exactly the "thin, straight, with high contrast to its surroundings" subjects you describe. I used the Kodak 14-megapixel SLR/c camera, and if it didn't have issues with handling light, I would still be using it. The images from it didn't suffer the things you claim. Nor do the MF backs, tossing the optional AA filter aside (I've never used one to this day). Has moire EVER happened? Yes. Can I remember it being a problem, or can I even count the occurrences on my 10 fingers, across over 400K frames (half of them with filter-free cameras)? NO.

It's fine to have personal preferences. But basing the entire discussion of "Canon EOS sensors and technology" solely on your personal preferences kind of makes it difficult to have a coherent discussion. Your personal preferences should really be left out of an objective discussion of the fundamental technology behind sensors; otherwise we're just in the muddy territory of subjective muck, and anyone can make any argument to justify their own personal opinions. I personally try to remain objective when discussing technology, and leave my own personal preferences out of the discussion.

Regarding whether moire is a problem on MF cameras, Leicas, etc.: if you do a few quick web searches, you'll find that it is a huge problem. There are countless threads on the subject, dating back many years, with MF and Leica users (and increasingly Nikon users) complaining about how bad the moire and aliasing can be on their incredibly expensive cameras. The solution, for many, is to use the lens to act as the AA filter: either stopping down beyond the diffraction-limited aperture of the sensor, or slightly defocusing, etc. One way or another, people have to deal with moire and aliasing if it occurs. If you have to constantly perform a very slight defocus, that makes using an autofocus system very tedious. Similarly, having to stop down more than you really want to in order to force diffraction blurring to soften the image is also less than ideal.

You say you have used a lot of cameras over a lot of years. I'd be willing to bet many of them were film cameras, in which case moire was never a problem, thanks to the random distribution of grains. When it comes to digital cameras, until recently lenses, while good, were never as sharp as they are today (at least in the DSLR world...for MF, most lenses have always been rather exceptional). The softer lenses of the past helped to mask the problem of missing or weak AA filters. Today, we have a convergence of several things that can only lead to significant problems with moire and aliasing: radical improvements in lens quality, pushing maximum resolving power to new limits; sensor resolution increasing at a slower pace than lens resolution; and the removal of AA filters. This is kind of a perfect storm...some manufacturers are apparently doing everything in their power to make moire a very serious problem for a lot of DSLR photographers, which will ultimately put them in the same boat as Leica and MFD owners: having to defocus or stop down to force blurring and use the lens as an artificial AA filter.

Did you discuss the point I bolded above (about the relationship between lens and sensor) in more detail someplace? This is likely the feature I'm looking to optimize, and likely what the D800E and A7R have factored in. It is my next criterion for my future camera/sensor purchase.

If you mean the fact that output resolution is based on the convolution of lens+sensor, I've discussed it so many times all over this forum, it shouldn't be hard to find a topic with all the details. The detail in an image (raw file) is the result of a complex convolution of real-world details. In mathematical terms, assuming gaussian-like blurring behavior (which is reasonable), output resolution is roughly equal to the root mean square (RMS) of the input resolutions. Well, to be more specific, the size of the blur kernel that represents the output image is approximated by the RMS of the blur circles of the lens and sensor.

So, if your lens blurs by 3µm and your sensor has 5µm pixels (the lens resolves more detail than the sensor), then the output blur is SQRT(3µm^2+5µm^2), or 5.83µm. Notice that the output resolution is lower than BOTH your sensor and lens. If you improve your lens resolution as far as possible, say to a 0.7µm blur circle (the wavelength of red light), your output blur is 5.05µm. Your maximum resolution is limited by the sensor...no matter how good your lens is, you can never resolve more detail than the sensor is capable of. This goes the other way as well. Let's say your lens blur is 3µm and your sensor has 2µm pixels. Your output blur is 3.61µm. If you reduce your sensor pixels to 800nm (0.8µm), your output blur is 3.10µm. You can never get better resolution than your worst-performing component.
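This quadrature combination is trivial to script; a minimal sketch reproducing the figures above (assuming, as stated, roughly Gaussian blur so that blur circles add in quadrature):

```python
import math

def combined_blur_um(*blur_circles_um):
    """Output blur as the quadrature (RMS-style) sum of component blur circles."""
    return math.sqrt(sum(b * b for b in blur_circles_um))

print(f"{combined_blur_um(3.0, 5.0):.2f} um")  # lens 3um + 5um pixels  -> 5.83
print(f"{combined_blur_um(0.7, 5.0):.2f} um")  # near-perfect lens      -> 5.05
print(f"{combined_blur_um(3.0, 2.0):.2f} um")  # lens 3um + 2um pixels  -> 3.61
print(f"{combined_blur_um(3.0, 0.8):.2f} um")  # 0.8um pixels           -> 3.10
```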

That's why I always say the whole notion of sensors or lenses "outresolving" the other is more myth than fact. In one sense, I understand why people think about it that way. In reality, the two work together to resolve your image...without both, you have no image, so there really isn't one outresolving the other. The real fact of the matter is your output resolution is never as good as the potentials of your lens or sensor, and your output resolution can never be higher than the least capable of the two. Further, lenses have non-linear performance...as you stop the aperture down, their performance drops. It's tough to say a lens outresolves a sensor in general...at what aperture does it "outresolve"? And by how much? Enough to matter? Or is the lens just outresolving by a tiny bit? When you stop down to f/8, is the sensor outresolving? These questions really don't matter...the thing that really matters is how the output image looks, and regardless of which thing you change, more resolution is pretty much always a good thing, sometimes a neutral thing, but never a bad thing.

The D800/E sensor is definitely higher resolution than the 5D III, for example...however Canon lenses outperform most Nikon lenses, so in most cases, the better Canon lenses paired with the lower resolution 5D III outperform, by a small margin, the D800/E. Even DXO's own lens data shows that. The D800 sensor will certainly make the absolute most out of Nikon lenses, but until Nikon improves their lens designs, the D800 does not actually perform better, in the real world, than the 5D III. Ironically, it is thanks to that very fact that moire with the D800E is not a bigger problem than it is...the lenses soften detail enough that moire tends to occur minimally. The day Nikon lenses perform as well as Canon lenses, however, keep your eyes and ears peeled: The wrath of the moire-hating D800E user will be heard around the world. ;P

1330
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 11:33:22 AM »
@Don: What do you mean by "address the sub pixels"? For DPAF, or QPAF, to work, the "subpixels" have to be underneath the CFA. If you are thinking you could get a higher resolution image by "addressing sub pixels", I don't think that would actually work.

I think this is the same mistake people make when they think DPAF can improve DR...it really can't. Magic Lantern improved DR by reading FULL pixels at two different ISO settings and blending the result. But if you read half-pixels at one ISO, and half-pixels at another ISO, you are actually getting quite a bit less light for both your high- and low-ISO "channels". Theoretically, you could improve the noise of the low-ISO channel by applying the high-ISO channel the way ML does, but since you are effectively doubling noise in the first place by using half-pixels, your net gain in the end is effectively nothing...you end up roughly back where you started (i.e., if you started by binning the two halves (or four quads)).
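A back-of-the-envelope sketch of why half-pixel reads start at a noise disadvantage, considering photon shot noise only (a simplification; read noise and ML's actual blending are ignored):

```python
import math

# Shot-noise-only model: a pixel collecting N photons has SNR = N/sqrt(N)
# = sqrt(N). A half-pixel collects N/2 photons, so each half starts with
# a sqrt(2) SNR penalty relative to the full pixel.

N = 10_000  # hypothetical full-pixel photon count

snr_full = math.sqrt(N)
snr_half = math.sqrt(N / 2)
print(f"full pixel SNR: {snr_full:.1f}, half pixel SNR: {snr_half:.1f}")
print(f"penalty: {snr_full / snr_half:.3f}x")  # sqrt(2) ~ 1.414

# Binning the halves recovers the full-pixel SNR: signal adds linearly,
# Poisson variance adds, so SNR = N / sqrt(N) again.
snr_binned = (N / 2 + N / 2) / math.sqrt(N / 2 + N / 2)
print(f"binned SNR: {snr_binned:.1f}")  # back to the full-pixel value
```

Which is the "you end up roughly back where you started" point: the split costs you noise per half, and binning merely undoes the split.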

1331
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 03:05:42 AM »
Quote
Yes, it is a fact.  The magnitude of that difference is not so obvious.  Detail that is present but blurred in a predictable manner can be brought out in post.  (Side note - optical microscopes can now resolve beyond the Abbé diffraction limit, and one way of achieving that uses post-processing analysis of moiré resulting from patterned illumination, i.e. since the pattern is predictable, detail not present in the image can be extrapolated mathematically.)

Also, depending on the lens much of that extra detail may not be there to begin with, which is why the D800 with a Nikon 24-70/2.8 barely outresolves a 5DIII with a Canon 24-70/2.8 II, despite the 60% higher MP count of the D800's sensor.

I thought DXO showed the D800E as having the highest IQ, if not almost as high as the Phase One IQ280? I could be wrong, as I didn't study it; it was something I read in a discussion of MF DBs. If that is not the case, please do list the top 3-5 markers of IQ in the DXO testing.

Do all the calculations you want. I have been shooting for over 20 years now.
I have done my side-by-side, apples-to-apples test using a Leica R Macro 60 lens mounted on each, on a studio stand with controlled lighting. If you shoot jewelry, you will know the difference without a second thought.

Yes Don, I use a P25/P45 and have used a Kodak Pro Back, 'Blad CF39, and Sinar evol75 on a Sinar 4x5. I used Nikon before, but focus was horrible about 10+ years ago. I switched to Canon and have loved it for the product and service. I think they can do better in specialized features.

I use it in the studio on still subjects. For people, I often use it for portraits that are slow-moving. Otherwise I use the 5D Mark II.

There is a significant difference between even the lowest end, the P25 (22MP) DB, and a 5D Mark II, regardless of all the numbers these guys want to crunch. A good portion of it, and I don't hesitate to say this, is due to the AA filter.

First, you put too much weight on DXO's numbers. As far as their sensor tests go, they do not actually measure "sharpness" or anything like that. It's actually extremely difficult to objectively test a sensor in terms of sharpness, as you have to use a lens to do so, in which case you're not testing a sensor, you're testing a sensor and lens combined, which totally changes the outcome (and the reasons why you get that outcome). The other problem with lens+sensor tests is they are bound by the least capable component...if the lens is the weak point, then no matter how good the sensor is, your output resolution is limited by what the lens is capable of...you can never resolve more than the lens resolves, period. Similarly, if the sensor has limited resolution and the lens is a powerhouse (like the Zeiss Otus 55mm f/1.4), then your output resolution is limited by the sensor...you can never resolve more than the sensor resolves, period. That makes determining how sharp a sensor is a very muddy issue, one that cannot be definitively pinned down. Hence the reason DXO measures things like SNR, dynamic range, and color sensitivity in its sensor tests...that's all they CAN measure.

Regardless of what DXO has to say about the D800 or D800E sensors, the removal of an AA filter does not increase image quality. Actually, in all too many cases (quite possibly the majority of cases), removal of the AA filter is guaranteed to REDUCE image quality, thanks to increased aliasing in general, moire specifically. This is clearly evident by all the numerous standardized image tests done with cameras over the years...while sharpness has increased in some newer cameras by a small amount, so too has moire. DPReview has plenty of examples where the removal of AA filters in Nikon cameras, or even just the weakening of the AA filter in many brands (including Canon) has greatly increased the amount of moire that occurs. (A great baseline for comparison on DPR is the 7D...it has an appropriately strong AA filter and doesn't suffer from moire at all. You can compare any newer camera with a sensor that is supposedly "better" than the 7D because of the removal or weakening of the AA filter...those images will be sharper, but they are usually riddled with moire.)

If the things you photograph have no regular/repeating patterns, and do not contain any elements with clearly defined edges, then increased aliasing due to having no AA filter is not an issue. There are not very many forms of photography where that actually turns out to be the case...landscape photography is probably one of the very few. Even insect macro photography, for example, will suffer from the removal of the AA filter...things like antennae, feelers, legs, wing veins, anything thin, straight, and with high contrast to its surroundings will end up with clearly aliased edges, and not even a highly optimized AHD demosaicing algorithm will be able to hide that fact.

The only thing removal of the AA filter MIGHT do is increase the acutance between pixels, which ultimately has the potential to increase sharpness. This increase in sharpness is only possible if the lens is already resolving enough detail that the real image resolved at the sensor plane is not being oversampled by the sensor. Someone using the Nikon 14-24mm zoom lens on a D800E to photograph landscapes would probably be in heaven without an AA filter. There is a whole host of Sigma lenses that would probably fit quite well on the D800E also. I know I'd love to have such a kit for my landscape photography. For just about anything else, however, I'll take a camera with a properly designed OLPF. Sharpness is not the sole defining trait of image quality, it is only one of many (the others being things like SNR, dynamic range, color fidelity, spatial resolution).

Furthermore, the kind of blurring caused by an optical low pass filter (AA filter) is regular, predictable, and well understood. That means it is very easily reversed (deconvolved) with mathematical algorithms in software, and since it is a small effect over a specific and narrow range of spatial frequencies, it can be almost entirely reversed. All it really takes is a light application of your basic unsharp mask to do a darn good job, and the smarter algorithms that come with photo editing tools like the Nik or Topaz suites can do an even better job. This is what Neuro was talking about.
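To make that concrete, here's a minimal 1-D sketch of the unsharp-mask idea in Python/numpy. The 3-tap kernel is just my toy stand-in for the OLPF's real (birefringent, four-spot) point spread function, not anything Canon publishes — the point is only that a known, fixed blur can be largely undone by adding back the detail it removed:

```python
import numpy as np

def unsharp_mask(signal, kernel, amount=0.8):
    """Sharpen a 1-D signal by adding back the detail a blur removed.

    The OLPF blur is approximated here by a small convolution kernel;
    any known, fixed kernel can be (approximately) undone this way.
    """
    blurred = np.convolve(signal, kernel, mode="same")
    return signal + amount * (signal - blurred)

# A step edge, as it might land on the sensor after passing the OLPF.
kernel = np.array([0.25, 0.5, 0.25])        # simple 3-tap blur
edge = np.repeat([0.2, 0.8], 6)             # the "true" scene edge
soft = np.convolve(edge, kernel, mode="same")   # what the sensor records
sharp = unsharp_mask(soft, kernel)

# The sharpened edge transitions faster (higher acutance) than the soft one.
```

Run on real images you'd do this per-channel in 2-D, and the smarter tools effectively just pick a better kernel estimate and amount than this fixed guess.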


1332
EOS Bodies / Re: Canon EOS sensors, and technology
« on: April 09, 2014, 12:55:55 AM »

OK, I shouldn't say large segment. But "ground breaking" changes or improvements are what big tech companies need to keep the spirit of innovation alive, don't you think? People who echo these sometimes small, and sometimes game changing innovations are what can snowball the market direction.

I think we are on the edge of a shift in digital cameras.

We need to step back and ask "why mirrors?". In the days of film, you needed the mirror and optical viewfinder to know what you were looking at, and you needed focusing screens to know if you were in focus.... Then came digital sensors, and we treated them like film... because that is what we were used to.

A digital sensor is NOT film. It has different strengths and different weaknesses.... and the mirror is no longer the only way to see through the lens. A decent mirrorless camera (and there are several of them out there) will be designed to the strengths of digital technology. They already do many things better than DSLRs, but a great mirrorless camera will have to do everything better. Right now, the two big stumbling blocks are focusing and viewfinders.

Dual pixel technology may well be the end of the focusing dilemma... and as it matures we should be able to have far more capable focusing systems on mirrorless cameras than with DSLRs... the point I keep bringing up is that we should be able to recognize a bird and track it as it flies through the air, even though the operator is not steady. We already have P/S cameras that recognize individual faces and can even tag them for use on social media and I have a waterproof P/S that has "cat mode" and "dog mode" and when you put it in "cat mode" it tracks the face of the cat and not the dog so please don't tell me this is a far-fetched idea... It's not coming, it's already here!

The second stumbling block is viewfinders. Right now, optical viewfinders are better than EVFs. A few years ago EVFs were garbage... there are some really nice ones now.... who knows what the future will bring? At some point, people will stop trying to design an EVF to be like an optical viewfinder and design them to the strengths of digital... perhaps they will get a bit bigger... perhaps you will have a little window open up on it to check focus at 10X... or exposure preview.... or whatever... but until they stop pretending it is optical, they will be inferior. I am sure that what is currently in the labs is good enough for the real world.... we are that close.

I can see Canon coming out with a new camera that shakes things up. I would love to see a quad-pixel technology 7D mirrorless camera where you could address the sub-pixels individually for a 24 megapixel image with similar ISO and noise to the 70D, or bin them together for a 6 megapixel low-light camera that had better low-light performance than a 1DX....

Canon has a HUGE R+D department.... they are not all sitting on their rear ends playing solitaire... something is coming, and the delays to the successor to the 7D may just mean that the change is big.

Very well put. Especially that last bit...the change would really have to be big like that, for it to be justifiable. Otherwise, it's just a demonstration of a major Canon blunder, if the 7D II comes out and is a mediocre improvement over the 7D, and not much in terms of competition against counterpart offerings from other brands.

Regarding the two points about viewfinders and focusing: I'm not sure we're "nearing the end" of the issues. I think DPAF marks the beginning of finally moving down the right path; however, I think there is a lot of innovation along that path that needs to take place before you start seeing action photographers seriously think about dumping their dedicated AF sensors and familiar AF points for a mirrorless image-sensor-based AF system. DPAF should at least become QPAF, so we can detect phase in at least two directions. I think we may ultimately need to see one further innovation, dual-direction QPAF, where you detect phase horizontally and vertically with one half of the sensor's pixels, as well as diagonally in two perpendicular directions with the other half of the sensor's pixels. Only then would you be technologically similar to how dedicated PDAF sensors are designed, and only then could you really start building advanced firmware to produce high rate, high accuracy AF without a dedicated AF sensor.
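For anyone wondering what "detecting phase" actually means here: the left-looking and right-looking sub-pixel images of a defocused subject are shifted copies of each other, and the size of that shift tells you how far (and which way) focus is off along that one axis. Here's a toy 1-D sketch of the idea — the Gaussian test pattern and the brute-force matcher are mine, not how Canon's firmware actually does it:

```python
import numpy as np

def phase_shift(left, right, max_shift=8):
    """Estimate the displacement between the two sub-pixel signals.

    Defocus shifts the 'left-looking' and 'right-looking' views in
    opposite directions; the best-matching offset gives the phase
    (and hence focus) error along this one axis.
    """
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = left[max_shift + s : len(left) - max_shift + s]
        b = right[max_shift : len(right) - max_shift]
        err = float(np.sum((a - b) ** 2))    # sum of squared differences
        if err < best_err:
            best, best_err = s, err
    return best

# A bump in the scene, seen through the two pupil halves and displaced
# apart by defocus.
x = np.arange(64)
scene = np.exp(-((x - 32) ** 2) / 18.0)
left = np.roll(scene, -3)
right = np.roll(scene, +3)
# phase_shift(left, right) recovers the 6-pixel relative displacement.
```

A quad-pixel layout would just run this same comparison along a second axis, which is why QPAF matters for subjects whose detail runs the "wrong" way for a single pair.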

There is still another problem, however, that mirrorless AF systems will need to overcome before they can really achieve parity with their dedicated AF system counterparts: Low Light Sensitivity. Modern dedicated AF systems are sensitive to light down to the -2 to -3 EV range. Not only that, each dedicated PDAF point receives a tiny fraction of the total light entering the lens (thanks to passing through a half-silvered mirror and an AF unit splitting lens), and each line sensor that comprises an AF point receives at most half of that tiny fraction of total light. All that, down to at least f/5.6, and in "pro" grade cameras, down to f/8. Dedicated PDAF sensors are ludicrously sensitive to the smallest amount of light...largely thanks to the fact that they can be fabricated independently of the image sensor, so they can be explicitly designed with huge photodiodes in each line sensor that have massive SNR. I'm not sure how camera manufacturers will overcome this issue, as even at very high ISO settings, image sensors are nowhere near as sensitive as the photodiodes in PDAF sensors. I'm sure one of the big manufacturers will figure out something brilliant to solve this problem...but I think it is definitely something that needs to be dealt with.
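Just to put a number on how little light -3 EV is: using the common incident-meter convention (lux ≈ 2.5 × 2^EV at ISO 100, calibration constant C = 250 — a metering convention, not a Canon spec), you get on the order of a third of a lux, i.e. roughly full-moon levels:

```python
def ev_to_lux(ev):
    """Approximate scene illuminance for an exposure value at ISO 100.

    Uses the common incident-meter convention lux = 2.5 * 2**EV
    (calibration constant C = 250).
    """
    return 2.5 * 2 ** ev

# -3 EV, the sensitivity floor of a good dedicated PDAF module, works
# out to about 0.31 lux -- on the order of moonlight.
for ev in (0, -2, -3):
    print(ev, ev_to_lux(ev))
```

And remember, the AF line sensors are pulling a usable phase signal out of only a sliver of that already-dim light.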

(BTW, I am aware that Canon's current DPAF supports live view focusing up to f/11; however, the speed of that focusing is nowhere even remotely close to that of a dedicated PDAF unit. That slower speed is what gives DPAF a bit of an advantage in that area...similar to the advantage Canon creates when they force a slower AF rate when attaching one of their teleconverters to a lens.)

As for EVFs, I can only hope they get significantly better. I'm very curious to see what Canon does with their Hybrid VF...I wonder how that will ultimately work, and whether it will be as flexible and user configurable/selectable as it really needs to be to be a success. I suspect it will be rather inflexible, and only activate the EVF under very specific circumstances (such as recording video).

1333
Animal Kingdom / Re: BIRD IN FLIGHT ONLY -- share your BIF photos here
« on: April 08, 2014, 09:47:03 PM »
We saw Ikarus, too:


WOW...that is FANTASTIC!

1334
Landscape / Re: Deep Sky Astrophotography
« on: April 08, 2014, 08:56:17 PM »
@TheJock: Check out the other thread. I've provided a lot of information on the kind of equipment you'll need to get started. We can continue the discussion there.

1335
Third Party Manufacturers / Re: How to Annoy a Photography Snob
« on: April 08, 2014, 12:24:35 AM »
I lost all respect for Ken Rockwell after reading this:

http://kenrockwell.com/ri/WhereDoBabiesComeFrom.htm

I know he, in his weird way, was trying to be "funny"...but so many things just go over the line in that page. When reading his photography pages, and when you see him in the few YouTube videos he is in, you get the feeling he is a crass, arrogant buffoon...but when you read his "Where do Babies Come From"...you realize he's everything you fear he is...then you throw up.

I don't even bother to click on links to kenrockwell.com anymore...all I ever see now is...where do babies come from... T_T T_T T_T T_T T_T
