August 30, 2014, 02:28:25 PM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - jrista

1
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: Today at 06:39:51 AM »
How much is Canon paying you to suffer and defend the brand?  If the threads bug you, don't read them.  Better yet, go out and take some shots.  As far as I can tell, the D810 is a Nikon-version of the 5D3, but with a substantially better sensor.  It's natural that people want more oomph out of their cameras and its natural for them to look at competing brands for validation of their choices.  Seriously, go out and take shots.

How about you don't wade in, missing the point by a country mile with worn-out, rote, flamebait clichés?

The folk here who "defend" (which, incidentally, is an immature, emotive characterisation of what's really going on here) Canon, do so:

Because they don't appreciate lies, half-truths and irrelevances presented as "facts" - much less as show-stopping, catastrophic failures by Canon.

Because they know, from their own use of Canon equipment, that it can achieve anything they need a camera to do - which is to produce images (not pixels) of the most sublime quality anyone might possibly wish for.

Because the whining about Canon's "sub-standard" sensors says more about the whiners (and their own failings) than it does about the sensors.

Simply put, they "defend" because that's the proper reaction to the bullsh*t. Other people who (God help them) might choose to visit Canonrumors to get some useful information about the capabilities of Canon cameras deserve a balanced view that pushes back against the interminable DR crap.

And it'll continue to happen for as long as the DR whiners continue to push their DR agenda, and as long as that agenda continues to mean sweet FA for the vast majority of photographers out there in the Real World.

The notion that it's all pointless is a fallacy, though. The 5D III was indeed a much better general purpose camera than the D800. To deny that the D810 closes the gap significantly, however, is just as much a lie or half-truth as anything you're claiming the other camp is doing.

The D810 HAS significantly closed the gap on the 5D III. It DOES have a good AF unit, and it DOES have a higher frame rate (it's only 1fps slower than the 5D III now). It's the D4 AF system, which has a much tighter point spread clustered in the center of the frame. The 5D III inherited the 1D X AF system, which has the largest AF point spread of any DSLR AF system to date, which gives it a strong edge for tracking subjects across the majority of the frame. The 5D III has full expansion mode f/8 AF, allowing up to five AF points to be used around the center. The D810 has that same capability now, though. The D810 has the ability to shoot 7fps in crop mode...and with pixels under the 5µm limit, only 0.6µm larger than the 7D pixels, it offers the option of enhanced reach and action shooting all in a single camera body...that's something the 5D III does NOT have any counterpart for.

It's one thing to be sick and tired of DRivel. However, the D810 brings a hell of a lot more to the table than just more DR than the 5D III. To deny that is to stick your head in the sand and sing a little song about how nothing has changed. The 5D III is an excellent camera, it's phenomenal for high ISO work, where it's still superior to the D810, it handles like a dream, and it's compatible with the best telephoto lenses available for DSLRs. The D810, however, even if you completely ignore its sensor IQ advantage, has closed the gap between the two cameras CONSIDERABLY.

And that happened in LESS than two and a half years since the original release of the 5D III...not more than three and a half years, as you mistakenly state in a later post.

There is a very strong defense of Canon here on CR. Ironically, that defense is so consistent and so ignorant of the real competition that Canon is facing, not just on the sensor IQ front but on every aspect of their DSLRs, that you guys are just giving Canon more reason NOT to improve their products by constantly saying things like you have in your post above. You want the people coming here to CR to get a balanced view of the state of Canon's DSLR market? Stop sticking your heads in the sand, stop ignoring the fact that Canon clearly seems hell-bent on pushing video features in their DSLRs at the expense of many other capabilities, and acknowledge that Canon technology, not just the sensor but other technologies as well, has fallen or is falling behind the competition. There ARE better options out there for some types of photography, and the number of options is increasing...for more reasons than simply getting more DR.

2
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: Today at 06:19:01 AM »
If one can see moire in the image, there is not enough resolution on the sensor, and we are not lens limited yet. Waiting for a 64Mpx APS-C cam. muhahaha...

Actually, if one can see moire in the image, the camera has an improperly designed AA filter. :P We don't NEED to significantly oversample the lens to avoid moire. We've been avoiding moire for over a decade...the problem today is that manufacturers are removing the AA filters while we are still often UNDERsampling the lens. Moire shouldn't be a problem...the fact that it is, is because photographers and manufacturers are artificially making it a problem by systematically weakening and entirely removing AA filters from cameras that were doing just fine with them before.
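The undersampling point is just the classic aliasing story. Here's a one-dimensional sketch in plain Python (an illustration, not any camera's actual pipeline) showing how a pattern finer than the sampling grid can resolve masquerades as a coarser one, which is exactly what moire is in two dimensions:

```python
import math

def sampled(freq, rate, n):
    """n samples of a unit sine of frequency `freq`, sampled at `rate`."""
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

# A 9 cycles/mm pattern recorded by a 10 samples/mm sensor (Nyquist = 5)
# yields exactly the samples of a 1 cycle/mm pattern, phase-inverted:
fine = sampled(9, 10, 20)
alias = sampled(1, 10, 20)
for a, b in zip(fine, alias):
    assert abs(a + b) < 1e-9   # the fine detail "folds back" as coarse moire
```

An AA filter blurs away the 9 cycles/mm detail before sampling, so there is nothing left to fold back; remove the filter while still undersampling and the false coarse pattern shows up in the image.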

We have a different angle of view on that. I tried to point to the fact that if you see moire, the lens certainly resolves more than the sensor itself. That way, if we want to up the resolution, increasing sensor resolution still IS the way, as the lens can handle it. Of course with some losses, but that's the deal. Still worth it. If you don't see moire in an image taken with an AA-less sensor where it should be, then you're using your lens's resolution potential at its full capability, and that's where we (at least me) want to go one day. One day nobody will need to be bothered with sensor resolution. It will be absolute compared to the lenses we put in front of it; all that will matter will be DR, efficiency, noise suppression and stuff. This megapixel fight will move to a different aspect.

In that respect, I agree. We DO eventually want to get sensor resolution to the point that it oversamples, eliminating the NEED for AA filters. We're a pretty long way off from that day, though. If lenses like the Otus are any indication, we can push 400lp/mm from an ultra high quality lens at wide apertures. That means we would need pixels around 1.25µm in size simply to MATCH that resolution, let alone oversample it. The theoretical limit on useful minimum pixel size is 0.9µm (900nm, well into the wavelengths of near-IR light!) A full-frame sensor at 0.9µm would be a GIGAPIXEL sensor. Assuming we're at least at 16-bit ADCs by the time such a sensor arrives, we would need in-camera data throughput of over 2GB/s just to process one frame per second, and approximately 13GB/s to process six frames per second.
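A back-of-the-envelope check on that arithmetic, assuming a standard 36×24mm frame and 2 bytes per 16-bit sample (real readout pipelines add overhead on top of this):

```python
def nyquist_pitch_um(lp_per_mm):
    """Pixel pitch (µm) needed to Nyquist-sample a lens resolution in lp/mm."""
    return 1000.0 / (2 * lp_per_mm)

def sensor_megapixels(width_mm, height_mm, pitch_um):
    return (width_mm * 1000 / pitch_um) * (height_mm * 1000 / pitch_um) / 1e6

print(nyquist_pitch_um(400))         # 1.25 (µm) to match 400 lp/mm

mp = sensor_megapixels(36, 24, 0.9)  # ~1067 MP: a gigapixel sensor
bytes_per_frame = mp * 1e6 * 2       # 16-bit samples
print(bytes_per_frame / 1e9)         # ~2.1 GB for each frame
print(6 * bytes_per_frame / 1e9)     # ~12.8 GB/s at 6 fps
```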

That kind of technology is beyond extreme. Relatively few things process data at such incredible speeds...high end, high power GPUs are one of the few that come to mind, along with the level three and lower data caches on a CPU. Those devices require considerable amounts of power to operate.

So, yes, the notion of a sensor that outresolves every lens you can put in front of it is the ideal...it's a very lofty one. I think we may see sensors that outresolve lenses that peak in resolution somewhere between f/4 and f/2.8 at some point, as many current lenses already achieve their optimal near-diffraction-limited resolution somewhere around f/4. We're still talking about sensors with hundreds of megapixels, though, and the data throughput requirements are still rather insane by today's standards.
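A rough sanity check on the f/4 case, assuming the standard incoherent MTF cutoff of an ideal (diffraction-limited) lens, 1/(λN), evaluated at λ = 550nm green light:

```python
def diffraction_cutoff_lp_mm(f_number, wavelength_um=0.55):
    """Incoherent MTF cutoff, 1/(lambda*N), of an ideal lens in lp/mm."""
    return 1000.0 / (wavelength_um * f_number)

def nyquist_pitch_um(lp_per_mm):
    """Pixel pitch needed to Nyquist-sample that spatial frequency."""
    return 1000.0 / (2 * lp_per_mm)

cutoff = diffraction_cutoff_lp_mm(4)    # ~455 lp/mm at f/4
pitch = nyquist_pitch_um(cutoff)        # ~1.1µm pixels
mp = (36_000 / pitch) * (24_000 / pitch) / 1e6
print(cutoff, pitch, round(mp))         # full frame: hundreds of megapixels
```

So a sensor that Nyquist-samples an ideal f/4 lens on full frame lands around 700MP, consistent with the "hundreds of megapixels" figure above.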

That's to say nothing of the hardware requirements for the PCs that would be used to process such images, or all the pixel-peepers who would look at their images and freak out because of how "soft" they look (when, ironically, that's the entire point...to OVERsample. :D)

3
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: Today at 03:33:04 AM »
So little innovation coming out of either company.

I'm not sure I would say that. Canon has been among the top five most innovative companies for decades now. They file some 3,000-4,000 patents a year these days:

http://www.usa.canon.com/cusa/about_canon?pageKeyCode=pressreldetail&docId=0901e02480ae93e9

Keep in mind, Canon is a company with a massive presence in "imaging" in general, from general photography and printing to video and cinematography to medical imaging to CMOS fabrication. Canon has been winning awards for being one of the world's most innovative companies for years. The difference between Canon and Sony's sensor division is that Canon has such a broadly sweeping focus in imaging...whereas Sony went so deeply into debt to build their 20-billion-dollar-plus CIS monstrosity that their bond status is now junk. Sony is apparently betting a significant portion of the company on their sensor play...so it's not surprising that they are innovating in CIS specifically more than Canon is.

I personally wish Canon would funnel more of their R&D budget into improving still photography IQ...but they seem to have a different focus right now.

4
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: Today at 03:17:17 AM »
If one can see moire in the image, there is not enough resolution on the sensor, and we are not lens limited yet. Waiting for a 64Mpx APS-C cam. muhahaha...

Actually, if one can see moire in the image, the camera has an improperly designed AA filter. :P We don't NEED to significantly oversample the lens to avoid moire. We've been avoiding moire for over a decade...the problem today is that manufacturers are removing the AA filters while we are still often UNDERsampling the lens. Moire shouldn't be a problem...the fact that it is, is because photographers and manufacturers are artificially making it a problem by systematically weakening and entirely removing AA filters from cameras that were doing just fine with them before.

5
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: Today at 12:07:43 AM »
the 5D III is still a superior CAMERA. It has an inferior sensor...but it's still a superior camera overall.

Why?

I have owned both and I very strongly disagree.

Make a case, not bland sweeping statements.

LOL, I've written so many deeply detailed statements that make my case that it's beyond ridiculous. I believe I've earned the right not to have to explain it all again. How about providing more yourself, beyond simply:

Quote
"I have owned both and I very strongly disagree."

You strongly disagree....WHY? BESIDES sensor IQ....what makes the D800 or D810 a superior camera overall to the 5D III? What makes the Nikon ECOSYSTEM better than the Canon ecosystem?

6
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: Today at 12:05:06 AM »
Yeah, the shutter/mirror slap on the 5D III is pretty amazing. Even close up, it isn't really loud. It's actually got a somewhat complex sound, a mix of a slap, a thud, and a "clink"...maybe the sounds together help cancel each other out and that's what minimizes noise. Either way, it's MUCH more pleasant than the 7D slap.

..  and far better than the barn-door-in-a-gale whack of the 5d2.

LOL, indeed! :D

I do kind of like the machine-gun stutter of the 1D X, though. It's loud, scary, and makes people turn heads. ;P

7
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: August 29, 2014, 11:05:13 PM »
Sure, there is no question there are limits to how small you can shrink pixels with an FSI design.

Yup. That's the clarification that I was after  :P.

Quote
As far as I am concerned, BRING ON THE 96mp MEGAPIXEL MONSTROSITIES!! MUHAHAHA!!

LOL!

You are laughing but I bet that they are going to do it in 10 (?) years.

Oh, I'm laughing because I KNOW they are going to do it. Some people laugh at the megapixel race, but me, I'm all for it. I want as much resolution and dynamic range as I can get my hands on, particularly for landscapes.

8
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: August 29, 2014, 10:56:05 PM »
As I said. Technology has been marching on.

Right.

But even with light-guides (to guide the light onto the photodiode), there are still limits as to how much you can shrink pixels.
These are physical entities and you cannot shrink them indefinitely with a given technology.

That's the only point I'm making.
You make it sound like smaller pixels are always better - and that's not unconditionally true.

There's a physical limit that cannot be crossed.
That's why manufacturers are using finer and finer CMOS processes (Panasonic is at 65nm now).
And also looking for alternative solutions - like BSI, Sony's stacked technology, etc..

So, smaller pixels are generally better - but only when newer, more advanced technologies are used.

There's also the issue of the full-well capacity of a photodiode.
Smaller full-well capacity automatically lowers SNR. You should know that.

So, it's a balancing act, really, for pixel engineers.
A blanket statement like 'smaller pixels are always better' is just that - a blanket statement.
Some necessary small print needs to be added to discussion 8).

Sure, there is no question there are limits to how small you can shrink pixels with an FSI design. I already mentioned that ALL small form factor sensors that use 1.2µm pixels and smaller use BSI designs now.

But we are primarily talking about larger sensors. In larger sensors, we don't have the same problems maintaining the ratio of incident light that reaches the photodiodes. We don't even need lightpipes...a single microlens layer is sufficient to direct over 90% of the light, and a second layer would refocus any dispersal from the color filter back into the "well", minimizing the remaining losses.

There is also a limit to how far finer processes will improve things for larger sensors. For smaller sensors they are essential, even with BSI, as you're packing so much into increasingly small spaces. I mean, when the 0.9µm generation hits, the pixels will be smaller than most near-infrared wavelengths! But large sensors still have huge pixels. It will be many generations before we drop below 3 micron pixels, assuming we ever do. It's a lot harder to make large optics perform well outside of the central FoV, and I think lens design will ultimately become the bottleneck for keeping the megapixel race alive with larger sensors.

Assuming we do reach 3 micron pixels at some point, on either a 180nm or 90nm process...that would be a 96 MEGAPIXEL sensor in full frame, and a 37 megapixel sensor in APS-C. That's WAY up there. The highest resolution sensors will probably sit around 4.5µm to 3.7µm pixel sizes for a while still, a couple of DSLR generations, which puts us out approximately another eight years.
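Those figures check out, assuming a 36×24mm full frame and Canon's 22.3×14.9mm APS-C dimensions:

```python
def sensor_mp(width_mm, height_mm, pitch_um):
    """Pixel count (in megapixels) of a sensor at a given pixel pitch."""
    return (width_mm * 1000 / pitch_um) * (height_mm * 1000 / pitch_um) / 1e6

print(round(sensor_mp(36.0, 24.0, 3.0)))   # 96  (full frame at 3µm)
print(round(sensor_mp(22.3, 14.9, 3.0)))   # 37  (Canon APS-C at 3µm)
```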

Assuming everything is manufactured on a 180nm or smaller process soon, I don't think that fill factor will be the primary or even a significant issue for APS-C and FF sensors for so long that it simply doesn't matter. In that light, I still assert that you can always do more with smaller pixels. As far as I am concerned, BRING ON THE 96mp MEGAPIXEL MONSTROSITIES!! MUHAHAHA!!

9
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: August 29, 2014, 10:32:06 PM »
Assuming equivalent or better sensor technology, more pixels is never bad.

You mean assuming better technology only.
For equivalent technology, this works only up to a point - at least for front-illuminated sensors.

In a front-illuminated sensor, the photodiode of a pixel is located at the bottom of a well, basically (see the left diagram):



The well is formed by the layers of metal wiring above the photodiode.

As pixels shrink, this well becomes narrower and narrower.
At some point, the well becomes so narrow that the micro-lenses on top can no longer focus the light on the photodiode.
This leads to light losses - and the resulting image quality degradation.

Thus, to further shrink the pixels, you need to switch to a finer CMOS process (or maybe BSI).

The likely reason that the 5DIII has 'only' 22mp is not because Canon no longer believes in megapixels (they do).
Rather, Canon appears to have hit the shrinking limit of their 500nm CMOS process, on which the 5DIII sensor is made.

The 70D is likely made on a finer CMOS process (180nm?), though, as I can't imagine that they've
been able to stretch their 500nm process to make the 20mp/dual-pixel sensor of the 70D.

So, smaller pixels are indeed generally better.
It's not a free ride, though; there are limits as to how much you can shrink with a given technology.
Beyond that, you need to change your technology - or image quality degrades with smaller pixels.

I know exactly what a sensor pixel looks like. ;) I also know that manufacturers have been using a double layer of microlenses to solve the problem with the FSI design you showed. They have been for years. I know some BSI designs even use double microlens layers. Further, I know Canon has a small form factor sensor design that uses a double layer of microlenses AND a lightpipe in an FSI design to minimize the problem even further. I'll raise your crude diagram with an actual cross-section electron micrograph of Canon's 180nm lightpipe sensor with Cu interconnects (note: this is a rather old image, from about two years ago...Canon moves slowly, but I'd certainly hope they have BSI sensors for pixels this small now. Also note that, given this is a 180nm process, the pixels pictured here are really quite small, I'd say less than 2 microns, so don't take this as an example of FF or APS-C fill factor):



As I said. Technology has been marching on. The simplistic grid of photodiodes and bare wiring/transistors is a thing of the past. Even gapless microlenses are a thing of the past. For really small pixels, BSI is ubiquitous these days, and we know Canon has some patents for BSI technology as well...so even the double-microlens layer/lightpipe design from the image above is a thing of the past.


I do agree that Canon is probably at the limits of the 500nm process, at least on a competitive front. They already make pixels much smaller than the 5D III's in their APS-C sensors, so they aren't at the limit of the process for their FF parts. They are obviously at the limits of their ability to remain competitive at 500nm for those larger sensors, though.

10
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: August 29, 2014, 09:35:18 PM »
That still isn't going to tip the scales. Assuming we can actually discern the difference of a decibel or two at the frequency with which a mirror slaps in a DSLR (human auditory discernment ranges from 0.7dB to 3dB...but it depends on the frequency of the sound as to which end of that range we can actually discern a difference), the 5D III is still a superior CAMERA. It has an inferior sensor...but it's still a superior camera overall.

That could change, for sure. The D820 may finally tip the scales...but I don't think it's happened yet.

I agree, and doubt it really matters given how good the silent shutter on the 5D3 is already.  My brother-in-law got married recently and the photographer was using a 5D3 and 6D.  I was surprised by how quiet the shutter is when your face isn't pressed to the viewfinder as I'd never been on the business end of a 5D3 before.

Yeah, the shutter/mirror slap on the 5D III is pretty amazing. Even close up, it isn't really loud. It's actually got a somewhat complex sound, a mix of a slap, a thud, and a "clink"...maybe the sounds together help cancel each other out and that's what minimizes noise. Either way, it's MUCH more pleasant than the 7D slap.

11
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: August 29, 2014, 08:34:09 PM »
Lee Jay, jrista, and also sarangiman , sorry for the delay, but my ex-wife lives 40 km away from me so it took me a while. Plus, I'm very slow in writing.

You're only thinking about the individual pizza slices here. You're missing the bigger picture: The eater who eats 1/6th of a pizza eats 1/6th of a pizza SIX TIMES!! Therefore, the eater is not eating one pizza slice...the eater is eating A WHOLE PIZZA!  ;D This is the critical point that everyone seems to miss. If an eater eats two 15" pizzas, one cut into 6ths and one cut into 8ths...has the eater eaten less total pizza when eating the one cut into 8ths? NOPE!! He's still eaten a whole 15" pizza, same as he did when he ate the one cut into 6ths.
but I still think the eater here is not the whole sensor, it's each individual photosite. If the eater was the whole sensor, you'd obtain zero resolution, no detail. But you need detail to produce a meaningful image, so you must compare the eater to the single photosite. So the more eaters there are (the higher the resolution), the less pizza each eater eats.

You're still thinking about it wrong. The eater is not the sensor, nor the photosite. The eater is light. The pizza is the sensor. The slices are the pixels. When you create an exposure, light illuminates the ENTIRE sensor...not only one or two or ten thousand pixels...but the whole thing.

Hence the analogy. The eater is light. The eater is always eating a WHOLE PIZZA. It doesn't matter if that pizza is sliced into 6ths, 8ths, 12ths, or 40 millionths. The eater is STILL going to eat the whole darn thing....every single time (i.e. every single exposure). :P

Make sense now?
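The pizza argument in code form, an idealized sketch with a made-up photon flux:

```python
# The "eater" is the light: the total photons collected depend on sensor
# area and exposure, not on how many pixels the area is sliced into.
def total_photons(flux_per_um2, area_um2):
    return flux_per_um2 * area_um2

def per_pixel_photons(flux_per_um2, area_um2, n_pixels):
    return flux_per_um2 * area_um2 / n_pixels

area = 36_000 * 24_000   # full-frame area in µm²
flux = 10.0              # photons per µm² for some exposure (made-up number)

whole = total_photons(flux, area)
for n in (6, 8, 22_000_000, 36_000_000):
    # however finely you slice it, the slices still sum to the whole pizza
    assert abs(per_pixel_photons(flux, area, n) * n - whole) < 1e-3
```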


I know BSI sensors have the wiring on the opposite side, and that large sensors only marginally benefit from this configuration; that's why this more expensive, lower-yield technology is not used in larger sensors. But this is true for (relatively) low-density photosites per unit area. The higher the density, thus the smaller the photosites, the greater the benefit. I don't know the numbers, but are you sure the wiring of conventional sensors matters so little in blocking light from the photosites?

Yes, I am sure. It used to matter, but that was near a decade ago now. Canon was using microlenses almost that long ago, and that alone changed the game considerably. Pretty much everyone is using gapless microlenses, and some are using two layers of microlenses. The non-sensitive die area is no longer blocking, reflecting, or absorbing much light. It's a percent or two, which has a small impact on overall Q.E., but it isn't significant enough to worry about.

A vastly more impactful issue is the use of a CFA itself. The color filter over each pixel is reflecting anywhere from 60% to over 70% of the incident light!!! The CFA is the real killer of sensitivity, by a very, very LONG shot. The small differences in photodiode area are pretty minor these days when talking about pixel sizes 4µm and larger. Canon is actually suffering a lot more because of that than their competitors...their process is 500nm. I have long suspected that Canon has not made a 24mp APS-C sensor yet because that would put them into a pixel pitch range where the 500nm wiring and transistor size WOULD hurt the performance of smaller pixels (i.e. fill factor, DESPITE the use of microlenses, becomes an issue again, simply because the ratio of photodiode area to total die area is too small).

Most other manufacturers are making sensors with 180nm or smaller processes. At 180nm, the difference between a 4µm pixel and a 6µm or 10µm pixel is pretty moot. There is a small percentage difference, but it doesn't generally amount to a significant enough difference in total light gathering capacity to matter in the grand scheme of things. I mean, look at the D800 vs. 1D X. The former has only a 1% loss vs. the latter at low ISO, and a massive 37% LEAD over the latter at high ISO. Granted, the D800 has better technology...it's got a Q.E. of 56%, the sensor uses a 180nm fabrication process, it's got better microlensing, etc. The 1D X only managed to scrape back the lead because Canon's analog CDS is (currently) superior at higher ISO.


Anyway...the use of microlenses, light pipes, BSI designs, etc. pretty much negates the fill factor issue. The remainder of the issue is being resolved with tall photodiodes, materials science that allows more electrons of charge to be held in a smaller photodiode area, etc. At some point, I figure all sensors will be BSI with some kind of reinforcement technology to keep the sensor substrate rigid, minimizing yield losses. When every sensor is BSI, the whole fill factor issue will be gone for good.

Does a 24 MP APS-C sensor, even at 180 nm, still marginally suffer from the interposed wiring? jrista, it seems you know where to find such information, and I'd like to deepen my knowledge, so why don't you post the most interesting links you find, every now and then, on CR? I mean not now, don't get me wrong, but you're one of the most active posters here, and I'm sure at least some CR members would appreciate some technical reading sometimes, I for sure.

I periodically have these sensor patent and technology binges. :) I dig around looking for the latest and greatest news on what the image sensor world is doing, find and read patents (if I can, that's often a MAJOR PITA, as a lot of patents are only written in Japanese, and translating them can be a confusing endeavor.) I regularly read http://image-sensors-world.blogspot.com/ and browse through http://www.chipworks.com/ for the latest teardowns. Those two places are usually where I learn about new technology, and from there, I'll go digging for more information.

One thing to note...the VAST majority of the high end sensor technology I talk about is actually only really used in tiny sensors. Stuff 1/2" in size or smaller, the kind of sensors used in hand held devices, cars, specialized video cameras, etc. Even though Sony's FF and APS-C sensors are currently the cream of the crop, and Canon's are up to a couple of generations behind just about everyone at this point...in the grand scheme of things, FF and APS-C sensor technology across the board is quite primitive. There are some AMAZING things being done at the ultra tiny end of the scale. We're talking about pixel sizes of 1.4µm, 1.2µm, and 1.1µm for the current smallest generation. Most new small form factor sensors are 1.2 and 1.1 microns in size, however soon they will be reaching the theoretical limit for visible light image sensors, 0.9µm or 900nm. At 900nm, we're starting to close in on the wavelengths of visible light, which range from around 750nm down to 380nm. (To my knowledge, light cannot pass through an aperture smaller than its wavelength. At some point, someone may figure out a way around that hurdle, but so far, I think that the 900nm pixel pitch is considered the minimum size for a pixel such that the luminous flux of an incoming wavefront can actually pass through the pixel aperture, or be refracted by a microlens, and still reach the photodiode.)

It took a lot of innovations just to make 1.4 micron pixels possible, and the current cutting edge 1.1 micron pixels required even more, not just to be possible, but to still have enough sensitivity to be useful. Keep in mind, many of these sensors are only 1/8th of an inch in size at their largest, so the total light gathering capacity is minuscule. The proliferation of video into everything, including phones and tablets, and now the booming market for car rear view video, etc., has demanded a level of high speed sensitivity that still cameras never dreamed of. That's where we got innovations like black silicon (similar in purpose to Canon's SWC lens nanocoating), which significantly increases the absorption rate of incident photons; multibucket pixels for high dynamic range; color splitting filters as an alternative to color filter arrays that don't waste any light; and a whole host of other improvements that radically improve the overall quantum efficiency of devices in very low light at high frame rates. Other innovations have succeeded in reducing dark current to practically meaningless levels. Current generation FF and APS-C DSLR sensors have massive amounts of dark current...at their operating temperatures, probably several electrons per second of exposure, so CDS and bias offsetting or black point clipping was essential to minimize the dark current signal and reduce thermal noise. Dark current levels in modern cutting edge sensors are as little as a small fraction of an electron per second (in other words, it takes many seconds for even one electron to be freed in a pixel due to dark current).

I have an ultra high sensitivity 74% Q.E. Aptina sensor in my guide and planetary camera that I use for astrophotography. It has dark current of about 0.005e-/px/s. Sony's new ExView II CCD sensors (used in thermally regulated CCDs for astrophotography), which have 77% Q.E., have dark current of 0.003e-/px/s or less! Both of these sensors come in 1/2" and 1/3" varieties...so they are pretty small, half the size or less of an APS-C.
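To put those dark current figures in perspective, here's the simple arithmetic on the numbers quoted above:

```python
def seconds_per_electron(dark_current_e_per_s):
    """Average time for one thermally generated electron to appear."""
    return 1.0 / dark_current_e_per_s

def dark_signal_e(dark_current_e_per_s, exposure_s):
    """Mean dark-current electrons accumulated by one pixel."""
    return dark_current_e_per_s * exposure_s

# The Aptina guider sensor at 0.005 e-/px/s: one electron every 200 s,
# and even a 10-minute sub-exposure picks up a mean of only 3 e-.
print(seconds_per_electron(0.005))   # 200.0
print(dark_signal_e(0.005, 600))     # 3.0
```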

At some point, I figure the technology from the cutting edge, and ultra small, will eventually work its way up to the larger form factors. However, on the SENSOR innovation front...the only things Canon has shown in the last couple of years are DPAF and a couple of multi-layered sensor patents. Canon is an innovative company, however it seems most of their innovation is focused in areas other than image sensor development. In the same time frame, Aptina, Sony, Omnivision, Toshiba, and a number of other major players in the sensor market have filed as many as dozens of patents for sensor technology...each. If and when these cutting edge 1.1 micron sensor technology innovations find their way into FF and APS-C sensors...I suspect it won't be Canon who does it first.

12
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: August 29, 2014, 08:01:52 PM »

However, in every other respect, the 5D III is still the superior camera, with a better AF system, higher frame rate, quieter shutter and mirror slap, etc. It is also accompanied by a better ecosystem, including all the various kinds of lenses, some very unique lenses (i.e. MP-E 65mm 1-5x Zoom Macro, 17mm TS-E, 200-400 w/ integrated TC, etc.), and a whole host of top of the line accessories, including the phenomenal RT flash system.


I can swear I read somewhere that the D810 silent shutter mode is actually a couple dB better than the 5D3 version.

That still isn't going to tip the scales. Assuming we can actually discern the difference of a decibel or two at the frequency with which a mirror slaps in a DSLR (human auditory discernment ranges from 0.7dB to 3dB...but it depends on the frequency of the sound as to which end of that range we can actually discern a difference), the 5D III is still a superior CAMERA. It has an inferior sensor...but it's still a superior camera overall.

That could change, for sure. The D820 may finally tip the scales...but I don't think it's happened yet.
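For a sense of scale on those decibel figures, the standard dB conversion formulas (the 0.7-3dB discernment range itself is as quoted above):

```python
def pressure_ratio(db):
    """Sound-pressure (amplitude) ratio for a level difference in dB."""
    return 10 ** (db / 20)

def power_ratio(db):
    """Acoustic power ratio for a level difference in dB."""
    return 10 ** (db / 10)

# "A decibel or two" is a small change in absolute terms:
print(pressure_ratio(2))   # ~1.26, i.e. about 26% more sound pressure
print(power_ratio(2))      # ~1.58x the acoustic power
```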

13
EOS Bodies / Re: Is Canon now two generations behind Nikon?
« on: August 29, 2014, 07:15:51 PM »
The Nikon D810 is better than the Canon 5DIII in every respect.  The sensor is 2 generations better.

You have that backwards. The D810 is better than the Canon in ONE KEY respect. The sensor IS 2 generations better. The D810 might have a better meter as well, however for what that better sensor is most often used for, I wouldn't say the meter matters much.

However, in every other respect, the 5D III is still the superior camera, with a better AF system, higher frame rate, quieter shutter and mirror slap, etc. It is also accompanied by a better ecosystem, including all the various kinds of lenses, some very unique lenses (i.e. MP-E 65mm 1-5x Zoom Macro, 17mm TS-E, 200-400 w/ integrated TC, etc.), and a whole host of top of the line accessories, including the phenomenal RT flash system.

The D810 closed the gap that existed between the 5D III and the D800, and it does have the superior sensor...but that is a different thing from the D810 somehow being superior in every respect. The D810 still suffers from Nikon's manufacturing issues...it even has a problem with sensor spots (only this time they are white spots, instead of black oil and dust spots).

I'm all for more DR, but the sensor alone still doesn't make the camera, or the ecosystem, better.

14
EOS Bodies / Re: EOS 7D Mark II & Photokina
« on: August 29, 2014, 06:05:43 PM »
@Lee Jay: Thanks for the detailed replies. Reducing noise at the expense of resolution -- that sums it up well, I think.

In the past (and this is often how it's presented by "experts"), I've thought that, given identical sensor technology, going from 20.2MP to 24MP would just translate into more noise, and I guess it does, but if I scale it back down to 20.2MP, I haven't lost anything -- or maybe even slightly gained. Then in optimal conditions, I have more resolution, and in noise-producing conditions, I can scale the image down and be no worse off than the 20.2MP version of the same sensor tech.

Does that sound right? If so, then bring on the pixels! I wouldn't mind the flexibility to compress for noisy images and have extra resolution for low-noise images.

---

Interesting stuff. I'm open to explanations if anyone else wants to add, but this seems to make sense...

Yup. You pretty much have it. Assuming equivalent or better sensor technology, more pixels is never bad. It may not necessarily be better (beyond the core theory, a lot of factors play a role), but more pixels can pretty much never be bad, and certainly not worse.

The reason that full frame sensors usually perform better than APS-C sensors is the greater total area. If you frame identically with both cameras, then the larger the sensor, the more total light you're gathering. That means less noise (on a normalized basis) and usually, even though the pixels tend to be bigger, more detail (since you're getting more pixels onto the subject than you can with a smaller sensor).
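That normalization argument can be sanity-checked with a small shot-noise simulation (a sketch, not part of the original post; the photon counts are invented): the same total light split across four small pixels and then binned back together yields the same SNR as one big pixel covering that area.

```python
import math
import random

random.seed(0)
S = 10000        # photons landing on one patch of sensor area (invented)
TRIALS = 20000

def snr(samples):
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)
    return mean / math.sqrt(var)

# One big pixel collects all S photons; shot noise is approximated
# as Gaussian with sigma = sqrt(mean).
big = [random.gauss(S, math.sqrt(S)) for _ in range(TRIALS)]

# Four small pixels share the same area (S/4 photons each), then get
# binned (summed) back into one value, i.e. downsampled.
small = [sum(random.gauss(S / 4, math.sqrt(S / 4)) for _ in range(4))
         for _ in range(TRIALS)]

print(f"big pixel SNR:    {snr(big):.1f}")   # both land near sqrt(S) = 100
print(f"binned small SNR: {snr(small):.1f}")
```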

The only real caveat to the above is reach-limited situations. In those cases, smaller pixels will always resolve more detail than larger pixels. Doesn't necessarily matter if they are in a smaller sensor (although they usually are)...all that matters is that when you are at a fixed distance from your subject with a specific lens, your subject is going to fill the same absolute sensor area regardless of how big the sensor is. Then, smaller pixels mean more detail.
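The reach-limited arithmetic is straightforward: the subject projects a fixed physical size onto the sensor, so pixels on subject is just that size divided by the pixel pitch. A sketch (the 4mm projected subject size is an invented figure; the pitches are those of the 5D III and 70D):

```python
# Reach-limited case: fixed lens and fixed subject distance, so the
# subject projects the same physical size onto any sensor. Resolved
# detail then depends only on pixel pitch. The 4 mm projected subject
# size is purely illustrative.
subject_mm = 4.0
pitch_um = {"5D III (6.25um)": 6.25, "70D (4.16um)": 4.16}

for name, pitch in pitch_um.items():
    px = subject_mm * 1000 / pitch
    print(f"{name}: {px:.0f} pixels across the subject")
```

The smaller-pitch 70D puts roughly 50% more pixels across the same subject, which is the "extra reach" being discussed.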

15
Quote
We are talking about comparable resolution capabilities, the bit you sum up as "If your lens has the resolving power to see the added resolution", that is the bit we are talking about.

I fail to see why crop vs. FF is part of this topic when you clearly and accurately suggested it is about sensor density vs. lens resolving power.  The original author might as well have posted "Estimating extra reach (resolving power) of Canon vs. Sony." - Makes as much sense.

Which brings us full circle - there is no extra 'reach' on crop vs. full frame; the diffraction limits are the same either way.

The subject should read: "Estimating the extra reach (resolving power) increased sensor densities provide for a given lens."

It doesn't really matter what the title of the thread is. The simple FACT of the matter is that smaller sensors nearly universally have smaller pixels than larger sensors. ONE large sensor has a lot of pixels, the 36.3MP Exmor, however even its pixels are still larger than the pixels on the vast majority of modern day crop cameras. It doesn't matter if we call it "Estimating the extra reach of crop vs. FF" or "Estimating the extra reach of small vs. large pixels". That's quibbling over semantics. Since smaller sensors are nearly guaranteed to have smaller pixels, the current title works fine.

Next, however, the whole notion that either a lens or sensor "sees" the resolving power of the other is a fallacy. I've explained this countless times on these forums, but here it is again.

OUTPUT RESOLUTION (the measurable resolution of the image produced by a CAMERA, for the purposes of this post defined as Lens+Sensor) is the CONVOLUTION of the resolution of the real scene as its light passes through the lens and is recorded by the sensor. That's the key word here: convolution. Cameras convolve information. While it is possible for a lens to resolve 86lp/mm at f/8, and a sensor to have the ability to separate 116lp/mm, the notion that the sensor "outresolves" the lens at f/8 is a misconception. What is really happening is that the lens and sensor work together to produce a BLUR SPOT. The size of that blur spot is what determines the resolution of the OUTPUT IMAGE.

We can very closely approximate the resolution of lenses and sensors by using the following formula:

Code: [Select]
blurSpot = SQRT(lensSpot^2 + sensorSpot^2)
The spot size of a lens can be computed by multiplying the resolving power in line pairs per millimeter by two and taking the reciprocal:

Code: [Select]
lensSpot = 1/(lensRes*2)
We can further convert the blur spot into spatial resolution by using the following formula:

Code: [Select]
spatRes = (1/blurSpot) / 2
We can combine these formulas into a single expression:

Code: [Select]
spatRes = (1/SQRT(lensSpot^2 + sensorSpot^2)) / 2
If we have a 1D X, 5D III, D800, 70D, and D5300 (let's just assume they are monochrome sensors, for the sake of simplicity), then each of those has a sensor spot (pixel pitch) of:

Code: [Select]
1DX: 6.92µm
5DIII: 6.25µm
D800: 4.9µm
70D: 4.16µm
D5300: 3.9µm

If we use the same theoretical lens, one which performs ideally at all apertures, on all five of these cameras at apertures of f/2, f/4, and f/8, then the lens's DIFFRACTION LIMITED resolving powers are:

Code: [Select]
f/2: 346lp/mm
f/4: 173lp/mm
f/8: 86lp/mm

Converting these to spot sizes:

Code: [Select]
f/2: 1/(346*2) = 0.0014mm (1.4µm)
f/4: 1/(173*2) = 0.0029mm (2.9µm)
f/8: 1/(86*2) = 0.0058mm (5.8µm)

Running the numbers, we get the following:

Code: [Select]
1DX f/2: (1/SQRT(0.0014^2 + 0.00692^2)) / 2 = 70.8lp/mm
1DX f/4: (1/SQRT(0.0029^2 + 0.00692^2)) / 2 = 66.6lp/mm
1DX f/8: (1/SQRT(0.0058^2 + 0.00692^2)) / 2 = 55.4lp/mm

5DIII f/2: (1/SQRT(0.0014^2 + 0.00625^2)) / 2 = 78.1lp/mm
5DIII f/4: (1/SQRT(0.0029^2 + 0.00625^2)) / 2 = 72.6lp/mm
5DIII f/8: (1/SQRT(0.0058^2 + 0.00625^2)) / 2 = 58.6lp/mm

D800 f/2: (1/SQRT(0.0014^2 + 0.0049^2)) / 2 = 98.1lp/mm
D800 f/4: (1/SQRT(0.0029^2 + 0.0049^2)) / 2 = 87.8lp/mm
D800 f/8: (1/SQRT(0.0058^2 + 0.0049^2)) / 2 = 65.9lp/mm

70D f/2: (1/SQRT(0.0014^2 + 0.00416^2)) / 2 = 113.9lp/mm
70D f/4: (1/SQRT(0.0029^2 + 0.00416^2)) / 2 = 98.6lp/mm
70D f/8: (1/SQRT(0.0058^2 + 0.00416^2)) / 2 = 70.0lp/mm

D5300 f/2: (1/SQRT(0.0014^2 + 0.0039^2)) / 2 = 120.7lp/mm
D5300 f/4: (1/SQRT(0.0029^2 + 0.0039^2)) / 2 = 102.9lp/mm
D5300 f/8: (1/SQRT(0.0058^2 + 0.0039^2)) / 2 = 71.5lp/mm
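These per-camera numbers can be reproduced with a short script implementing the blurSpot and spatRes formulas given earlier (a sketch; the pixel pitches and diffraction-limited resolving powers are the ones listed above):

```python
import math

def spatial_res(lens_spot_mm, sensor_spot_mm):
    """Output resolution in lp/mm from the RMS combination of the
    lens blur spot and the sensor spot (pixel pitch), both in mm."""
    blur = math.sqrt(lens_spot_mm ** 2 + sensor_spot_mm ** 2)
    return (1 / blur) / 2

# Lens spots from the diffraction-limited resolving powers above:
# lensSpot = 1 / (lensRes * 2)
lens = {"f/2": 1 / (346 * 2), "f/4": 1 / (173 * 2), "f/8": 1 / (86 * 2)}

# Sensor spots (pixel pitch) in mm, as listed above
sensors = {"1DX": 0.00692, "5DIII": 0.00625, "D800": 0.0049,
           "70D": 0.00416, "D5300": 0.0039}

for cam, pitch in sensors.items():
    row = ", ".join(f"{ap}: {spatial_res(spot, pitch):.1f}lp/mm"
                    for ap, spot in lens.items())
    print(f"{cam} -> {row}")
```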

Again, these are all theoretically diffraction limited apertures. Assuming such a case, small pixels, even the very small pixels of the D5300, are STILL resolving more detail at a diffraction limited f/8, which has a maximum theoretical resolution of 86lp/mm, than any of the full frame cameras with bigger pixels. There are diminishing returns, however the D5300 still enjoys roughly an 8.5% OUTPUT IMAGE resolution lead over the D800, a sizable 22% lead over the 5D III, and a whopping 29% lead over the 1D X.

This is with a DIFFRACTION LIMITED lens. The notion that a higher resolution sensor provides no benefit at fully diffraction limited, narrow apertures like f/8 is patently false. The notion that a lens that is not resolving more than the sensor "cannot see" the resolution of the sensor is patently false. The two, lens and sensor, WORK TOGETHER to produce the final output resolution. In actuality, the specifics are certainly more complicated. Lenses tend NOT to be diffraction limited at wider apertures, and optical aberrations, of which there are many that affect the convolution of the incoming wavefront in different ways, will limit resolution at wide apertures on many lenses. However the same rules apply: any given lens spot, regardless of whether it is limited by diffraction or by aberrations, is going to CONVOLVE with the sensor's. Higher resolution sensors, while they will eventually reach a point of diminishing returns, are STILL going to resolve more detail than lower resolution sensors.
