The above is an assertion I don’t trust. No AA filter = images you can't trust.
For a start the 5DsR does have an AA filter, plus an attempted-cancellation filter. So it's not a bare sensor.
But a lot of the 'detail' allegedly seen in non-AA photos is actually not real detail; it is the hard contrast boundary between photosites on the sensor. With an AA filter the transitions are softened.
The above is an assertion I don’t trust.
Then look up the proper application of the Nyquist-Shannon sampling theorem.
https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem
(emphasis added)
"...if the Nyquist criterion is not satisfied, adjacent copies overlap, and it is not possible in general to discern an unambiguous X(f). Any frequency component above fs/2 is indistinguishable from a lower-frequency component, called an alias, associated with one of the copies. In such cases, the customary interpolation techniques produce the alias, rather than the original component. When the sample-rate is pre-determined by other considerations (such as an industry standard), x(t) is usually filtered to reduce its high frequencies to acceptable levels before it is sampled. The type of filter required is a lowpass filter, and in this application it is called an anti-aliasing filter."
So, without an anti-aliasing filter, it is not possible to discern a low frequency component from an aliased high frequency component.
I saw a brilliant example of this once and wish I had saved it. It was a picture of a picket fence and it looked mostly okay. But a tighter view showed that the number of pickets per section in the wider view didn't match reality. In reality, the fence had many more, smaller pickets, but those were aliased down to many fewer, wider pickets in the image by reflection about the Nyquist frequency. In other words, the original image was lying about the size and quantity of the pickets.
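The picket-fence effect is easy to reproduce numerically. A minimal sketch (assuming NumPy; the 9-pickets-per-metre fence and the 10-samples-per-metre "sensor" are made-up numbers for illustration): a pattern above Nyquist folds down to |fs − f| cycles, exactly the reflection described above.

```python
import numpy as np

fs = 10       # samples per metre -- our pretend sensor's sampling rate
f_real = 9    # real pickets per metre, above Nyquist (fs/2 = 5)

x = np.arange(0, 8, 1 / fs)                  # 80 samples across 8 metres
samples = np.sin(2 * np.pi * f_real * x)     # the sampled "fence"

# The spectrum of the sampled signal peaks at |fs - f_real| = 1 cycle/metre,
# not at 9: nine narrow pickets come out as one wide one.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1 / fs)
alias = freqs[np.argmax(spectrum)]
print(alias)  # -> 1.0, the aliased picket frequency
```

The customary interpolation of those samples reconstructs a 1-cycle-per-metre fence, and nothing in the data can reveal that the original was 9.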
This isn't open to debate; it's a proven mathematical theorem that's been known for about a century.
Camera engineers know this and that's why they put an expensive anti-aliasing filter into their cameras. They don't add cost to annoy customers, they do it so their cameras produce trustworthy results.
Too true, Neuro. Here is a discarded shot of a waxwing using the 5DIV and the 400mm DO II + 2X TC, a combination that is not the sharpest. The waxwing is slightly soft, and even then the back wings have moiré. You have to download to see the full extent of the moiré. I occasionally do see moiré because the repeating fine structure of bird wings is prone to that in close-ups, and I discard those. Otherwise, the extreme cropping I have to do gives images that never have moiré, and the absence of an AA filter helps because I have to sharpen less and so do not increase noise.
Trust or lack thereof is your own value judgement. The camera will behave as designed, with or without an AA filter (or OLPF, if you prefer). As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.
Besides, your statements indicate that you believe an OLPF precludes all aliasing. It doesn’t, it merely alters the range of subject frequencies at which aliasing is evident. In other words, you can still get moiré with an AA filter. So by your logic, you can’t trust any camera. Guess you’ll just have to give up photography. How sad.
As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.
No, it's not a value judgement, it's fact. Every single scene will have frequencies beyond Nyquist.
While I think the entire AA discussion is "mountain out of an ant hill", I have to agree with the statement that scientific accuracy in a digital sample suffers without a lowpass filter. You won't find anyone suggesting a scientific instrument doesn't need a lowpass filter, or will work better without one, unless they are absolutely certain that instrument's sampling frequency is higher than anything they will sample.

The discussion was not about 'scientific accuracy', it was about 'trust'. Moreover, a camera (in the context of a photography forum) is not a scientific instrument (unless Lee Jay was being paid to accurately document the number of pickets in a particular fence and his count was wrong due to the lack of an AA filter...in which case, even though the instrument was performing as designed, the person who paid him for the job should not have trusted someone who would pick the wrong tool for the task at hand).
If I were in the lab solving a novel protein complex by cryo-electron microscopy, then I certainly would use low-pass filtering. However, if I am taking a photo of my grandson blowing out the candles on his birthday cake, then I am confident that my 5DSR will not add a year or two to his age or make him younger by aliasing the candles, and I do trust that my camera would not add an extra pair of wings to an eagle to turn it into the avian equivalent of a biplane.
My superresolution and other confocal microscopy systems do not have an OLPF, although I do routinely sample at ~2.3x the optical resolution to satisfy Nyquist.
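The ~2.3x oversampling figure pins down a pixel size directly. A back-of-envelope sketch (the wavelength and NA are assumed illustrative values, not from the post):

```python
# Rayleigh lateral resolution: d = 0.61 * wavelength / NA
wavelength_nm = 510.0    # assumed GFP-like emission
na = 1.4                 # assumed oil-immersion objective

resolution_nm = 0.61 * wavelength_nm / na   # optical resolution, ~222 nm
pixel_nm = resolution_nm / 2.3              # sample at ~2.3x to satisfy Nyquist
print(round(resolution_nm), round(pixel_nm))  # -> 222 97
```

In other words, an optical system that resolves ~222 nm needs pixels of roughly 97 nm or smaller before the sampling itself stops being the limiting factor.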
I felt the same way about the original 7D. An amazing camera in every respect, but it was let down by a very poor sensor. In some ways the 7D paved the way for Canon's next-gen AF system and pretty much led to the host of pro-body upgrades that led to the 5DmkIII. But I was never particularly happy with the image quality. Sure, I got some great images from it, but I found the images noisier and softer than any other Canon DSLR I've owned, and the RAW files couldn't take a lot of processing. Everything else about the camera was amazing, though. Even at ISO 400 I found noise everywhere. So I passed on the 7DII, which seemed to have similar issues. I loved the extra reach that the 1.6x crop offered, and on paper the camera looked like a very capable 1DxII lite...but the image quality wasn't in the same league.
Instead I went for a pair of 5DIII's and haven't looked back. Still using them today....sure I'd like to upgrade to a pair of 5D4's but the mkIII's are still working well for me.
trust /trəst/ verb
1. believe in the reliability, truth, ability, or strength of.
Sorry, trust requires belief in a fact; it's not the fact itself.
Semantic BS.

Belief is acceptance without evidence that justifies such a conclusion. This is a proven mathematical theorem, thus no belief is required. It's simply a fact.
I would say "JUST WAIT AND SEE". Canon is not from yesterday, and Sony is just a problematic player in any technology, always. Canon, Nikon, Fuji, and Leica have been real photography producers since forever. And to be honest, how many of us are taking photos above ISO 1600 in the real world? Martin Osner. Any problem with the 5D IV?
It is always a battle between the optical resolution gained from increasing the pixel count and diffraction. IMO more pixels always win (30D vs 7D vs 7D2) - unless you can show me the same image shot with high and low MP presented at the same viewing size that shows me any different.
There is IMO no chance something like a 7D III is going to happen in an EF-M mount. First, ergonomics/size: you want your 7D III to be larger than an M5, right? Second, with EF-M there is no upgrade path to RF-mount lenses. I think that if something like the 7D III is released one day, it will be the RF version. And once that happens, it will be the last nail in the coffin of the EF-M mount - not that it will die off, but it will definitely be anchored in the hobby segment.
You shoot what you have to. If you are doing landscape, f/5.6 will probably give too little DOF - sometimes these diffraction comments ignore the necessities of the art and end up being purely philosophical.
Can you show me an image where a high-res sensor's 'diffraction limited' photo has less resolution than a lower-res sensor's image that (in theory) is less 'diffraction limited'?
- MILCs eliminate essentially all opto-mechanical components except optional sensor stabilisation, assuming the electro-mechanical shutter is replaced with an electronic one. This essentially leaves only digital electronics. There is a well-documented trend since circa 1975 that digital electronics components typically drop in price at least 20% per annum for the same functionality or capacity.
- Cheaper to make does not automatically translate to lower prices for the buyer, of course. Manufacturers would like to increase their margin, but competition is likely to force them to lower prices in the end. However, looking at it from the point of view of a manufacturer, it is very desirable to have low production cost. You can then command higher margins and still be competitive.
- DSLRs are outselling MILCs because the bottom range DSLRs are actually quite cheap. At least until recently their prices were generally lower than those of MILCs. Rebel T6 with kit zoom lists on Canon US web site for $399.99 and the street price is much lower.
- I would respectfully disagree with your view that adding gizmos to DSLRs can continue. In my view, this is not the case, and certainly not on the scale possible in MILCs. In the AF area the main limitation is the number of AF points in a DSLR. The other limitation is that you by definition cannot display all these gizmo visual effects in the OVF. Of course, you can display them on the LCD in Live View, but then why have a mirror and OVF at all; you could just as well use your phone + Instagram. It is actually obvious to me that DSLRs have reached maturity. Of course, sensors still change slowly, but the main concept of the DSLR remains more or less constant. The last revolution apart from the transition from film to digital was autofocus, but that was 35 years ago, still in the film days. Maturity is a good thing in a way, and this is why I still use a DSLR.
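As a quick sanity check on that 20%-per-annum trend (a sketch; the rate is the one cited above, compounded naively with no other market effects):

```python
# Relative cost of the same digital functionality after n years,
# assuming a steady 20% annual price drop.
def relative_cost(years, annual_drop=0.20):
    return (1 - annual_drop) ** years

print(round(relative_cost(5), 3))   # -> 0.328 (about a third)
print(round(relative_cost(10), 3))  # -> 0.107 (about a tenth)
```

So if the electronics really do dominate a MILC's bill of materials, the production-cost argument compounds quickly over a camera generation or two.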
IF the readout speed is vastly increased, I could see DIGIC 8 being fast enough to handle an APS-C sensor with increased pixel density. It's handling full frame just fine right now with Eye AF, especially with the new firmware. The big hangup has been how fast the readout from the chip is. If it were mirrorless, more than likely you'd be able to get 7D Mk II-like fps with that setup as well, since it's only having to process AF points from a much smaller chip, and the 90D will not be displaying that in real time. Even with increased pixel density, it's still fewer points than the current R when in Live View. Using the DIGIC 8 to interpret the phase-detect AF data will not even begin to tax the chip.
The 2020-Olympics-ready mirrorless 1DX-type sports camera would need both the increased readout speed AND a newer DIGIC 9 chip. If they release a full-frame version of this newly designed chip in the 70-100 megapixel range, the increased readout speed of the sensor would make the current DIGIC 8 chip work in a hypothetical 5DS replacement, but with slower fps than the R, just like the 5DS versus the 5D III.
The more I think about it, the more getting rid of the 7D line makes sense, given what I always considered the primary reason the line existed: fast fps and 1DX-level tracking.
Both the RP and the R exhibit tracking on par with the 1DX, even if they can't fire off shots as fast as it can. This level of tracking will be in ALL future mirrorless bodies and all future bodies with Live View. The only differentiation is in fps. The 80D was already only 2 fps slower than the 7D Mk II, and had a better sensor to boot. So why wouldn't they add in the 1DX tracking and use the DIGIC 8 to handle it? Even with that advanced setup, in some respects it will have worse tracking than the R and RP outside of speed! If you add that in, at that point what you are losing is better weatherproofing and a joystick. And I'll bet we'll see at least one of those on the upgraded (and upgraded-price) body.
I've been hoping for pet-eye-AF for a long time. My bearded dragon moves fast when the cat is chasing him around the house. Pet-eye-AF will be a godsend. Though, I have to wonder whether the camera would confuse the lizard's eye with the holes in my Froot Loops if he happens to be near the bowl. Any word on insect-eye-AF development?
Without a surprising new sensor, it’s highly unlikely. To achieve the “zero blackout,” the A9’s sensor reads out in something like 1/150 s. Most sensors read out in closer to 1/30 s.
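Those readout times translate directly into rolling-shutter skew. A rough sketch (the one-second frame-crossing speed is an assumed value for illustration):

```python
# Fraction of the frame width a moving subject drifts while the sensor
# reads from the top row to the bottom row.
def skew_fraction(read_time_s, crossing_time_s=1.0):
    # assumes the subject crosses the full frame in crossing_time_s
    return read_time_s / crossing_time_s

print(round(skew_fraction(1 / 30), 3))    # -> 0.033 (typical ~1/30 s readout)
print(round(skew_fraction(1 / 150), 3))   # -> 0.007 (A9-class ~1/150 s readout)
```

A five-times-faster readout means five times less skew on the same subject, which is why the fast sensor matters as much as the processor.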
So much of the discussion about diffraction is confusing or misleading to those of us not so conversant on what various terms mean. "Diffraction limited" seems to be used as if there is some zone where there is no diffraction, and then suddenly you hit a wall where your image gets ruined. Some online calculators can reinforce that impression.
Of course there is diffraction at f/1.4, and even more at f/32. The effect becomes noticeable gradually. I would guess that the "limited" moment comes when diffraction starts to be the limiting factor more so than anything else. But I don't know for sure.
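One way to make "diffraction limited" concrete is to compare the Airy disk diameter (≈ 2.44·λ·N) with the pixel pitch; diffraction gradually comes to dominate once the disk spans several pixels, with no sudden wall. A sketch with assumed numbers (550 nm green light, 4.14 µm pitch, roughly a 5DS-class sensor):

```python
wavelength_um = 0.55   # assumed mid-spectrum green light
pitch_um = 4.14        # assumed pixel pitch

def airy_diameter_um(f_number):
    # diameter of the Airy disk's central spot: 2.44 * lambda * N
    return 2.44 * wavelength_um * f_number

for n in (1.4, 5.6, 11, 32):
    d = airy_diameter_um(n)
    print(f"f/{n}: {d:.1f} um across, {d / pitch_um:.1f} pixels")
```

At f/1.4 the disk is well under a pixel, around f/5.6-f/11 it grows to a few pixels, and by f/32 it blankets ten, which matches the "gradual, then dominant" intuition above.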
Yes, one needs to decide what is important in making the compromises. Obviously the choices will be different in shooting a misty woodland scene from doing product photography.
For the 2017 total solar eclipse, I was shooting with my T3i. My only telephoto lens at the time was the rather bad 75-300mm that has a lot of chromatic aberration, among other faults. I was concerned about diffraction, but also needed focusing leeway from depth of field, since I didn't dare look through the OVF, and focusing on the screen, even shaded, was rather difficult. I also presumed that stopping down would help minimize the CA. Graphs I saw online suggested that f/11 was the best choice at 300mm on that lens. Days before, I put the filter on the lens and practiced shooting the sun. Even keeping the sun in the picture was a challenge, even though I had swung the floppy screen into the shadow of the camera itself. It turned out that f/11 gave sharp-looking pictures with that less-than-optimal lens. Sunspots showed up very clearly. My eclipse pictures turned out about as well as those by people using superior equipment.
Here's how I dealt with seeing the non-flippy screen on my cameras during the eclipse: foam board with holes cut just the right size to slide it onto the lens (from the rear, before attaching the lens to the camera).