Show Posts


Messages - jrista

Pages: 1 ... 107 108 [109] 110 111 ... 278
1621
EOS Bodies / Re: Will Canon Answer the D4s? [CR2]
« on: January 15, 2014, 03:37:27 AM »
... Assuming the trend towards higher megapixel counts continues, that will only continue to diminish performance at ISO 6400, meaning any technological improvements to improve light sensitivity will only restore ISO 6400 noise performance to the level of sensors with fewer megapixels.

While I really like to read all your insights, I would still like to question your last sentence here a bit:

Has not the D800 sensor (+ electronics) proven that it is possible to improve on both targets - resolution AND image quality [at low and high ISO] ... at the same time? The D800 has dramatically more resolution than the preceding D700 and still performs better (less visible noise, more DR, better color fidelity) at ISO 6400.

While there obviously is some tradeoff between resolution, low-ISO IQ (DR, noise/banding) and hi-ISO IQ, it seems that with innovative approaches all 3 corners of the triangle can be pushed forward. Sony (+ Nikon) have been able to do this and get products to market that deliver the goods to customers, while Canon has been lagging for some time now. Canon has managed to push only 1 corner of this triangle: hi-ISO image quality - but not resolution and low-ISO IQ. And even in hi-ISO IQ, Canon's sensors are definitely not really superior to Nikon/Sony sensors: the 1DX sensor is really only marginally ahead of the D800 from ISO 3200 upwards, and probably not ahead of the D4.

As far as I am concerned, that is the main source of "relative discontent" with Canon compared to Nikon (/Sony) when discussing sensor performance.

My feeling is that Canon up to now really is UNABLE to match Nikon/Sony sensor performance, rather than UNWILLING to do so - or intentionally "holding back", saving improvements for future products.

The D800 is no better at high ISO than any other camera. My post was specifically addressing the notion that you could flatten the noise curve, or even eliminate noise. You can't. It is IMPOSSIBLE to simultaneously have ISO 100 and ISO 6400 perform the same. Cannot happen. If you reduce the quantity of light, the ratio of noise to signal will increase, simple as that...and that has nothing to do with the hardware, that is an attribute of gathering light.

The D800 did exactly what I said Canon could do: Reduce read noise. That's it. Sony Exmor has 3e- read noise at all ISO settings, which is why its LOW ISO settings are better. Canon has a non-linear read noise curve. In the case of the 5D III, for example, it is 33e- at ISO 100, 18.2e- at ISO 200, 10.6e- at ISO 400, and 6e- at ISO 800. After that it drops to 3e- and below. Canon needs to eliminate their read noise, and they will gain low ISO DR as a result. But Canon can't really do anything to make ISO 6400 have the same amount of total noise as ISO 100, and they have very limited options to reduce high ISO noise in general (which is done by increasing signal strength, which allows for a reduction in gain, which is directly what impacts photon shot noise...but we are talking fractional changes here, nothing significant or particularly substantial.)
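
To put rough numbers on that, here's a quick Python sketch (the 68,000e- full well for the 5D III at ISO 100 is an assumed round figure, and halving the saturation capacity per stop of ISO is the usual approximation):

    import math

    # Per-ISO engineering DR in stops: log2(saturation / read noise).
    # Read noise values are the 5D III figures quoted above; the 68,000 e-
    # full well at ISO 100 is an assumption for illustration only.
    read_noise = {100: 33.0, 200: 18.2, 400: 10.6, 800: 6.0, 1600: 3.0}
    for iso in sorted(read_noise):
        saturation = 68000 / (iso / 100)   # effective full well at this ISO
        print(f"ISO {iso:5d}: {math.log2(saturation / read_noise[iso]):.1f} stops")

That non-linear read noise curve is why the low ISO numbers hover around 11 stops. Flatten read noise to ~3e- at ISO 100, as Exmor does, and the same math gives over 14 stops (ADC permitting)...which is exactly the low ISO advantage in question.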

All cameras currently on the market that have pixels that fall into certain size ranges are going to have similar high ISO performance. There isn't even the option of making them significantly different, because it is a matter of physical limitations, not technological ones. Canon has a lead when it comes to high ISO performance over SoNikon right now, however it is marginal. The primary reason Canon has the lead is generally larger pixels, and better CDS (Canon's high ISO read noise can be less than 2e-, whereas with SoNikon sensors it is usually closer to 3e-.) The D800 has visibly worse high ISO performance than any Canon sensor (especially the 6D, and excepting the old 18mp APS-C) due to its use of smaller pixels. And that is entirely expected by the theory...high ISO is limited by physics, and how that limit affects noise is ultimately guided by FWC...of which the D800 has 33% less than the 5D III (and 41% less than the 6D).

1622
EOS Bodies / Re: A New Rebel for CP+? [CR1]
« on: January 14, 2014, 11:26:50 PM »
Blah. Another Rebel. I REBEL! RISE IN REVOLT OF THE REBEL!! Muahahahahaa!!  ::)  :-*

I hate to say it, but Rebels are the least exciting thing Canon ever releases. They are ubiquitous, boring, and always the same. I really can't wait until Canon releases something exciting again...7D II, UberMP with HyperDR, SOMETHING EXCITING!! Meh.  ???

1623
Animal Kingdom / Re: Show your Bird Portraits
« on: January 14, 2014, 11:20:31 PM »
Had the "Invasion of the Bushtits!" today. Huge horde of them blasted through a few times, giving me some opportunities to photograph these hilarious little fuzzballs. The light was terrible, as it was overcast and snowing...and the 7D utterly sucks in this kind of light. But, the birds were fun!





Read the whole story of the invasion at my blog.

1624
EOS Bodies / Re: Will Canon Answer the D4s? [CR2]
« on: January 14, 2014, 11:18:24 PM »
I think the bigger question is - Does Canon Even Need to Answer the D4s...
(btw: if the naming convention holds up, the D4s will not bring any significant increase in MP, if any at all, so it is not going to be a 'pro-body' D800 - just improvements Nikon feels are worthy of a different model name)

If we look at Canon's current lineup of top tier bodies (1Dx, 5DmkIII, 6D, 70D, T5i), they all outsell their competitors' offerings. While Canon surely wants to advance the technology as much as the next brand, it comes down to dollars and cents in the end...and it is there that Canon continues to hold the lead. So other than bragging rights to some ambiguous scoring service, there is actually little reason for Canon to worry about a specific brand or model they already outsell...

Not everyone will agree with this, but as for the other argument - DR! - well, it is not as significant or as necessary as many believe. By that I mean that while a generous range can provide flexibility and creativity in certain situations, it is of limited use. A properly or creatively exposed shot can relieve the need for 14-stop post processing, as there would be no 'need' for it to begin with. To give an example: if Canon/Nikon/Sony/etc were to develop a sensor that captured every scene with well lit shadows, exaggerated colors, exaggerated contrast, a complete dream-like scene with 50 stops of DR, and no more leeway for processing because the sensor has already captured and reproduced everything there is to be seen - people would still complain about the lack of stops they have in post as a must-have, must-design, etc.

Isn't photography the art/science/creativity of capturing light and shadow?? If the scene being captured has shadows you cannot see into with your eye, then there is no 'need' to remove the shadows in development/print; actually, doing so tends to ruin the feel of the scene most of the time. It is true that there is an interest/intrigue in an image that looks the way you see the world in a dream - where everything is lit by some magical indirect lighting coming from every direction - but as it is not the way we see the world in real life, it will always be a method of processing, a fad, an interest that comes and goes... It is not something a camera manufacturer needs to redesign their products around. It is simply not something 'needed' to the point that people fret over 1, 2, 3 stops of difference between this model or that brand.

We would all be better served by a sensor that produces noise-free images throughout the ISO range, even if it did not capture any more than 11 stops of DR - and I mean without downsampling, without film-like grain, and without 3x the post processing. I personally would not care if Canon ever squeaked out another stop of DR, as long as they worked towards ISO 100 performance at ISO 6400 and beyond...


You clearly don't understand the primary source of noise. It is impossible to have ISO 100 performance at ISO 6400, while still having comparable sensor resolution to sensors of today. "Noise" is a general term that refers to ALL noise in an image. NOT all noise in an image is from the camera's electronics. Noise caused by camera electronics is called read noise, however read noise only affects the deep shadows, and it is generally only present to a relatively significant degree at lower ISO settings. You are also missing the fact that dynamic range is relative to noise. Eliminate noise, and you effectively have infinite dynamic range (or, in the case of a digitized result, you gain the maximum dynamic range up to your bit depth...whatever that may be...14bits/14stops, 16bits/16stops, 1024bits/1024stops.)

The primary source of noise, by a very significant margin, is photon shot noise. This noise is present in the analog image signal itself, and has absolutely NOTHING to do with the camera. The amount of photon shot noise is approximated by SQRT(Signal), so as signal strength drops (which is what happens when you crank up ISO), the ratio of noise to signal increases. Canon cannot fix that. Canon, as well as every other camera and sensor manufacturer on the planet, has absolutely no control over that. You will never have ISO 100 performance at very high ISO settings like ISO 6400. For that matter, even ISO 100 has noise...it's less, but noise is always present in every signal, regardless of what the ISO setting is.
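
If it helps to see it, the SQRT(Signal) relationship takes only a few lines of Python (the 40,000e- signal at ISO 100 is just an assumed round number; the signal halves with each stop of ISO):

    import math

    # Shot noise ~ SQRT(signal), so shot-noise-limited SNR = sqrt(signal).
    # The 40,000 e- signal at ISO 100 is an assumed figure for illustration.
    for stop in range(7):                    # ISO 100 through ISO 6400
        iso = 100 * 2 ** stop
        signal = 40000 / 2 ** stop           # photoelectrons captured
        print(f"ISO {iso:5d}: SNR = {math.sqrt(signal):5.1f}:1")

Six stops of ISO cost you three stops of SNR (200:1 down to 25:1), no matter whose sensor it is.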

Even if Canon reduced megapixel count to 1mp, and greatly increased pixel size, you are STILL not going to have ISO 100-like performance...it'll be a lot better, but it will still be noisy relative to the image size, because the total signal strength at ISO 6400 is still the same, just spread out across fewer pixels with larger capacities. Reducing megapixel count is largely no different than downsampling. You gather more photons per pixel...you lose detail (in the case of a 1mp sensor, a LOT of detail), but you have less apparent noise. ISO 100 noise levels drop right along with ISO 6400 noise levels, so even though ISO 6400 is better, ISO 100 is that much better, too! If you take this concept to its ultimate conclusion, you eventually arrive at a one-pixel sensor of infinite size...that would be the only way to actually eliminate noise at all ISO settings...but it's entirely impractical and implausible.
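
Here's that tradeoff simulated with pure Poisson shot noise (illustrative numbers only):

    import numpy as np

    # Simulate an ISO-6400-ish mean signal of 625 e-/pixel with pure
    # Poisson shot noise, then bin 2x2: averaging 4 pixels improves
    # per-pixel SNR by sqrt(4) = 2x (one stop), at the cost of 4x fewer
    # output pixels.
    rng = np.random.default_rng(0)
    img = rng.poisson(625, size=(1000, 1000)).astype(float)
    binned = img.reshape(500, 2, 500, 2).mean(axis=(1, 3))
    print(f"native SNR: {img.mean() / img.std():.1f}:1")           # ~25:1
    print(f"2x2 binned SNR: {binned.mean() / binned.std():.1f}:1")  # ~50:1

Feed the same code a 40,000e- mean (an ISO 100-like signal) and binning buys exactly the same 2x factor...which is the point: downsampling never closes the gap between ISO settings.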

You MUST make a tradeoff. More megapixels, more per-pixel noise; fewer megapixels, less per-pixel noise. It doesn't matter if you reduce sensor pixel count or downsample in post; either way, it's the same tradeoff. Assuming we stick with current technology, we could DOUBLE quantum efficiency for all sensors that have less than 50% (which isn't that many these days, as most sensors are at least 49-51% Q.E. now). If we double quantum efficiency, and leave pixel size the same, that only means that ISO 6400 is now as good as ISO 3200. At the same time, ISO 100 got a stop better as well! It is now as good as a native ISO 50 would have been on a sensor with a Q.E. of 50%. Once we're at 100% Q.E. (also technically infeasible...at best we could get somewhere around 90% or so with extreme cooling to -80°C), that's it...we cannot improve quantum efficiency any more. ISO 6400, for that sensor resolution, is the best it's going to be without taking some radical departure from standard sensor designs.

Some patents exist, like color splitting filters, as an alternative to color filter arrays. This might get you another half stop or so. So, ISO 6400 might look as good as ISO 2500. Maybe we employ some kind of layered photodiode...it's been done in Foveon-type sensors, however it's tricky and the impact on noise is minimal in practice. We might gain another half stop...so ISO 6400 now looks like ISO 1600. Back-illuminated sensor design might get us a small fraction of a stop for pixel sizes as big as they are in DSLRs...it wouldn't be worth the added fabrication costs. That about exhausts the extreme measures we could take to improve ISO 6400. Our sensors would now probably cost a good three to four times as much as they did if we employed all of these techniques...all for two stops of ISO improvement. We're still a long, long way from ISO 6400 looking anything remotely as good as ISO 100, however...and there is still that nagging little fact that every time we improve ISO 6400, we also improve ISO 100. We never actually achieve the goal of normalizing noise at all ISO settings, because anything we do to make ISO 6400 better makes all the other ISO settings better as well.

It doesn't matter what you do, there is no normalizing the noise levels of different ISO settings. There will always be noise at all ISO settings, and the ratio of noise to signal will grow as ISO is increased, because noise scales as the square root of the signal...that's simply how the physics works. It is impossible to have the same levels of noise at all ISO settings. It's a matter of physics, not technology. We can't break the laws of physics. And they are already bent pretty far with current sensor technology...it is nothing short of amazing that we get the kind of IQ we currently do out of small form factor sensors with 1100nm pixels...that is as small as a wavelength of deep infrared light!!

Now, contrary to the issues above with eliminating all noise at all ISO settings, camera manufacturers DO have control over how much noise their electronics generate. They don't have total control, some things are still beyond their control...for example, we cannot completely eliminate dark current noise, but we can reduce it with CDS (Correlated Double Sampling), and we can greatly reduce it even more by cooling sensor circuitry to temperatures well below zero (-80°C is the sweet spot for power vs. dark current reduction). We can reorganize circuitry, move high frequency components into isolated areas on the die, increase parallelism and reduce operating frequency, and probably a whole host of other things that are currently being discovered or have yet to be discovered that give us control over read noise.

By controlling read noise, we reduce the thing that is actually eating away at dynamic range at lower ISO settings. Canon sensors are not limited to 11 stops of DR. Actually, according to some older studies done by, I believe, Roger Clark of Clarkvision, when we ignore downstream sources of read noise, Canon's current sensors are likely capable of over 15 stops of dynamic range in analog space. That dynamic range is REDUCED by read noise, which includes noise from dark current as well as noise from high frequency components downstream of the sensor. Canon technically has a lot of options when it comes to reducing this source of noise...hence the reason low ISO dynamic range is a highly contentious point with Canon users. Many manufacturers in the CIS industry have started moving past the 11-12 stop "barrier" that used to be the limit throughout the first half or so of the last decade. Several manufacturers are achieving more than 12 stops of DR at ISO 100, and one has achieved over 13 stops of DR at ISO 100. All of them have achieved that by reducing read noise.

So sorry...but Canon cannot eliminate all noise. Simply not possible. Canon CAN reduce the noise their camera electronics are introducing into the low end of the image signal, however that will do little to affect ISO 6400 performance. Even assuming we employ all the best known options for improving literal light sensitivity on the sensor, outside of GREATLY reducing pixel count (by a factor of two or more, which is really just the same as downsampling), we MIGHT get another two stops of noise performance before we hit an impenetrable brick wall. That still leaves us at least four stops away from having ISO 100 performance at ISO 6400. Assuming the trend towards higher megapixel counts continues, that will only continue to diminish performance at ISO 6400, meaning any technological improvements to improve light sensitivity will only restore ISO 6400 noise performance to the level of sensors with fewer megapixels. 

1625
EOS Bodies / Re: Where are Canons innovation?
« on: January 14, 2014, 06:01:06 PM »
Sorry, I do not think Aptina has anything to do with the Nikon D4; it is a Renesas-made sensor, as in the D3.

And what information do you have to back that up? There is no mention of CMOS Image Sensors on Renesas' site. I found one article that mentioned Nikon uses Renesas microcomputer parts for controller chips in some of their cameras, but that is not the same thing as a CMOS Image Sensor.

Chipworks has a breakdown of the D4 sensor, however you have to pay (a hefty price) for it. The only other mention of the D4 sensor is on Nikon Rumors, and last I remember, it was there that I saw the D4 sensor and Aptina mentioned together. Chipworks also did a breakdown of the D3 sensor, and they mentioned Renesas as a possibility, but only because they had prior ties to Nikon (for the controllers); however, they also noted that Renesas has no history of actually manufacturing CIS parts (despite apparently having a couple of patents for CIS technology.)

The Renesas idea has been brought up before, but there is absolutely no hard evidence to suggest they actually fabbed any Nikon sensors. The only concrete link between Nikon and Renesas is for controllers. There were numerous debates on DPR about Renesas and Nikon; even there, there has never been any conclusion that Renesas has ever manufactured a CIS part, let alone any D3/D3s/D4 sensors.

Aptina, on the other hand, most definitely has a very powerful and strong presence in the CIS world. If any third party, other than Toshiba and Sony, has ties with Nikon to manufacture sensors, it would be Aptina.

1626
EOS Bodies / Re: 7D Mark II on Cameraegg
« on: January 14, 2014, 03:40:25 PM »
Don't worry, Moore's law hasn't run out yet, and 10TB hard drives are in the works.

HDDs are too slow.  I'll wait for the 10 TB SSDs to come out.  :P

I've been amazed at the recent prices of larger SSDs. 500GB SSDs hit as low as $320 last month, and 1TB SSDs hit as low as $640. Kind of exciting, seeing the larger ones start to come down in price (especially when it still costs over $200 for a 200GB SSD, and $150 for a 128GB SSD... O_o ). I wonder how long it will be before 2TB SSDs drop below the $1000 mark.

1627
EOS Bodies / Re: Where are Canons innovation?
« on: January 14, 2014, 03:31:53 PM »
Nikon doesn't make a fraction of the revenue Canon does on their photography division. That doesn't bode well for future Nikon innovation. As it stands, the bulk of the innovation in Nikon's most recent camera bodies came from other companies, like Sony. That is a precarious position to be in...relying on other companies so much. If any one of them faltered or failed, Nikon could be dragged right down with them.
This is a point so important that it is staggering! There are not a lot of companies out there producing large quantities of imaging sensors that could go into DSLRs... What happens if Sony fails, or at the least, gets rid of the portion of its business that makes the sensors for Nikon? Hopefully, someone will buy that division and the production will continue, but if it doesn't, Nikon will be out of business until someone else can set up a production line and get up to speed... a process that will take years....

I think Nikon could probably fabricate their own sensors. They used to in years past. Their management thought it would be more profitable to stop investing money in their own fabrication, and to buy their sensors and the like from third parties.

Perhaps I'm wrong, but doesn't Nikon fab the D4 sensor?

They designed it, or at least had a hand in its design, much like the D800 sensor. I am not sure they actually manufactured it...I thought Aptina did the actual fabrication.

Could be. This is what I found: http://nikonrumors.com/2012/08/22/the-sensor-inside-the-d4-is-made-by-nikon.aspx/
but it is hardly conclusive. I have worked for companies which sell circuit cards. They usually went outside for fab, but included their logo in a silkscreen layer.

Either way, it's not particularly relevant. Even if they build sensors for the D4, D3200, etc., Sony halting their line would be a huge speedbump for Nikon.

Yeah, kind of hard to be conclusive about it. Some of Canon's parts have Canon's name on them, but fabrication markings from other fabs. For example, DIGIC5 has Canon's name, because they designed it (along with TI), but it was fabbed by UMC (http://www.chipworks.com/en/technical-competitive-analysis/resources/blog/inside-the-canon-rebel-t4i-dslr/). I know Nikon has had a hand in designing most if not all of the parts that are included in their cameras. I honestly don't know when they last fabbed their own sensor, though.

As far as Canon goes, I am very curious to see Chipworks tear apart whatever new sensor finds its way into the 7D II. If it has some radical changes, especially a die shrink, I'd be very curious to know if it was fabbed by Canon, or fabbed elsewhere.

1628
EOS Bodies / Re: Where are Canons innovation?
« on: January 14, 2014, 12:05:39 PM »
Nikon doesn't make a fraction of the revenue Canon does on their photography division. That doesn't bode well for future Nikon innovation. As it stands, the bulk of the innovation in Nikon's most recent camera bodies came from other companies, like Sony. That is a precarious position to be in...relying on other companies so much. If any one of them faltered or failed, Nikon could be dragged right down with them.
This is a point so important that it is staggering! There are not a lot of companies out there producing large quantities of imaging sensors that could go into DSLRs... What happens if Sony fails, or at the least, gets rid of the portion of its business that makes the sensors for Nikon? Hopefully, someone will buy that division and the production will continue, but if it doesn't, Nikon will be out of business until someone else can set up a production line and get up to speed... a process that will take years....

I think Nikon could probably fabricate their own sensors. They used to in years past. Their management thought it would be more profitable to stop investing money in their own fabrication, and to buy their sensors and the like from third parties.

Perhaps I'm wrong, but doesn't Nikon fab the D4 sensor?

They designed it, or at least had a hand in its design, much like the D800 sensor. I am not sure they actually manufactured it...I thought Aptina did the actual fabrication.

1629
EOS Bodies / Re: 7D Mark II on Cameraegg
« on: January 14, 2014, 10:36:42 AM »
I am hoping the 7D2 is to the 7D as the 5D3 was to the 5D2, and is compelling enough to make me want to upgrade. I am specifically looking and hoping for sizable improvements in DR, high ISO performance, and the 5D3 AF system or similar.

I am so disappointed with the 7D by today's standards that any improvement will be worth the upgrade.  Just show me where to sign...

The problem Canon has with the 7DII is that it will certainly rob sales from the 1Dx. If 10 fps @ 18mp with 61 point AF is true, then it really will be a 1.6x cropped 1Dx. I suspect that Canon will leave the IQ much the same as the 70D; it's got to hold back on something for this camera.
Naturally we all want a camera with fewer video compromises (the 7D has a really strong AA filter), a far better noise handling threshold, and DR to match the current Sony Exmor CMOS sensors. But I seriously doubt that Canon will invest in sensor tech for this camera, and I think they will bring in the new tech for the 1Dx replacement or its high MP cousin.

I dunno...if people have the dough for a 1D X, they will be getting a 1D X. There are so many other features of the 1D X that will trounce any variant of the 7D II, it's still worth the cash. For everyone else, well, they weren't going to be buying a 1D X in the first place due to cost, so the 7D II won't really be stealing anything...it would actually boost total sales for Canon well above any potential "conversion loss"...millions could buy the 7D II over the next few years, so even if Canon loses a few 1D X sales a year, it simply won't matter.

Regarding the AA filter...there have been patents and papers about electromagnetically attenuated AA filters that can be adjusted (either automatically, based on scene preview analysis or camera mode detection (still or video), or by a user-controllable setting.) A user-attenuated AA filter would be the ultimate solution, IMO. You could attenuate it down for, say, landscapes, and crank it up for video. Or simply tune the AA blur factor according to the detail levels in the scene...if you have a lot of high frequency detail, jack it up...if not, drop it to its minimum setting. That would make my day...hell, that would make my year, if Canon included something like that in one of their cameras.

1630
EOS Bodies / Re: Will Canon Answer the D4s? [CR2]
« on: January 14, 2014, 09:10:36 AM »
That would be post-demosaic
(snip)
If you've ever tried to push a 16-bit TIFF around the same way you push a RAW around,

But isn't everything on a Bayer sensor post-demosaic?  You can use your slider settings as parameters to your demosaic algorithm, but you still have to demosaic the raw file to have an "image" rather than "data."  E.g. a very intense, pure-green light will show as (near) zero on a red-filtered photosite.

No. When you edit a RAW, most of the exposure settings, white balance, and a number of other things are applied to the RAW data. Demosaicing converts the results of that initial processing on the RAW into an image that can be displayed on screen. Other settings, such as sharpening, occur after demosaicing (which is partly why sharpening in a RAW editor can have such a great impact on increasing noise.) Some RAW editors apply wavelet denoise algorithms on the RAW, and a couple of tools like Topaz DeNoise apply some pre- and some post-demosaic denoising if you apply them to the RAW.

Most RAW settings are applied pre- or during demosaic, and some are applied after. Demosaicing, during RAW editing, is near the end of the pipeline that repeatedly renders a RAW image on screen in real time. If you export to a TIFF, this same pipeline is used to render to a file rather than to the screen. Any edits on the TIFF, from that point on, are obviously on RGB data.
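
In toy Python, the ordering looks something like this (hypothetical stand-ins for every stage; real converters are vastly more involved):

    import numpy as np

    # Toy raw pipeline: white balance and exposure act on the mosaic,
    # demosaic converts to RGB, and sharpening would act on the RGB.
    def render_raw(mosaic, exposure_stops=0.0, wb_gains=(2.0, 1.0, 1.5)):
        data = mosaic.astype(float)
        data[0::2, 0::2] *= wb_gains[0]      # pre-demosaic: R sites (RGGB)
        data[1::2, 1::2] *= wb_gains[2]      # pre-demosaic: B sites
        data *= 2.0 ** exposure_stops        # pre-demosaic: exposure push
        r = data[0::2, 0::2]                 # demosaic: crude 2x2 binning
        g = (data[0::2, 1::2] + data[1::2, 0::2]) / 2
        b = data[1::2, 1::2]                 # (real algorithms interpolate)
        rgb = np.stack([r, g, b], axis=-1)
        # Post-demosaic: sharpening etc. would operate on rgb from here on,
        # which is why it amplifies whatever noise survived the raw stages.
        return np.clip(rgb, 0, 16383)

    rgb = render_raw(np.random.poisson(1000, (16, 16)), exposure_stops=1.0)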

Quote
downsample

I get the downsample thing, and I don't dispute that.

Quote
how much you lose by converting to an RGB image...you lose a LOT.

Seems to me this could have more to do with round-off error than raw vs. TIFF.  I'm not, by any measure, an expert in Photoshop, but maybe I'll try "pushing around" a TIFF file using adjustment layers vs. old-style hard edits.

It doesn't really matter if you use old style edits or adjustment layers. Once demosaiced, you lose your editing latitude. It's like using sRAW or mRAW in Canon cameras...you can push exposure around a bit, but you lose a lot of editing latitude. Where before you might be able to shift exposure up or down by five stops without encountering attenuation limitations or artifacts, with sRAW, you can only push exposure around a couple stops before you run into problems. TIFF is the same deal...you can push levels and curves around a bit, but attenuate too much, and you run into problems.

Quote
In the statement of mine you quoted, I was speaking about the hardware aspects of a camera. The RAW, as it comes straight out of a DSLR, is limited in terms of dynamic range by the bit depth of the ADC.

I guess this is another item I don't quite get: I'd assume "hardware" DR would be limited by the fwc and read noise, rather than the ADC.  You could always have an ADC whose digital output does not fall on EV boundaries, being either coarser or finer.  E.g., the 20D has fwc of about 51,400e (per Clarkvision) but has 12-bit ADC.  The 7D has fwc of about 24,800e (per Clarkvision) but has a 14-bit ADC.  If I understand correctly, this should give the 7D finer gradation, but not necessarily more hardware DR.  (all dependent on noise, of course)

Well, it depends on what you consider hardware. If you are thinking purely of the sensor, then yes, DR is limited by FWC and read noise. According to some of Roger Clark's older work with Canon gear, their sensors themselves have actually been capable of around 15 stops of dynamic range for a while. The intrinsic read noise component from dark current in a Canon sensor is pretty low, only a couple of electrons. Canon's CDS technology, built into their sensors, is very good. However, the more significant component of read noise in Canon cameras comes from downstream sources. One of those sources is their high frequency off-die ADC units, which probably consume about a stop or so worth of DR alone. Canon also has a downstream amplifier, used in a variety of situations, which also has the potential to add read noise. Shipping an analog signal along a bus is another opportunity for noise to be introduced. Total read noise is an amalgamation of noise from all these sources, not just the sensor itself.
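
Since independent noise sources add in quadrature, the amalgamation works like this (the three figures below are made-up illustrative values, not measurements of any camera):

    import math

    # Independent noise sources combine as the square root of the sum of
    # squares; a noisy downstream ADC easily dominates a quiet sensor.
    sources = {"sensor dark current (post-CDS)": 1.5,
               "bus + downstream amp": 2.0,
               "high frequency off-die ADC": 3.0}
    total = math.sqrt(sum(v ** 2 for v in sources.values()))
    print(f"total read noise ~ {total:.1f} e-")   # ~3.9 e-

Which is why attacking the downstream/ADC contribution buys far more DR than polishing an already quiet sensor.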

When I refer to hardware DR, I am referring to this entire processing pipeline in the camera. I'd say sensor DR if I meant just the sensor, but hardware DR is what you get out of the sensor + bus + downstream junk + ADC + DSP. Since the ADC is 14 bits, and the output is digital rather than analog, you're implicitly limited to 14 stops, regardless of what the sensor itself might actually be capable of. You're quantizing the analog sensor information, with a cap on the maximum quantization value. If Canon can solve their downstream noise problems, and if their intrinsic dark current noise from the sensor really is as low as 1.5-2e- after CDS, then I think Canon could actually benefit from a 16-bit ADC. We might be able to get as much as 15.6 stops of DR out of Canon's current sensor tech.
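
As a sketch (again, the full well and read noise figures are illustrative assumptions, not measurements):

    import math

    # Engineering DR in stops is log2(max signal / noise floor), but the
    # digitized output can never exceed the ADC's bit depth.
    def hardware_dr(full_well_e, read_noise_e, adc_bits):
        return min(math.log2(full_well_e / read_noise_e), adc_bits)

    print(hardware_dr(68000, 1.5, 14))   # ~15.5 stops analog, clamped to 14
    print(hardware_dr(68000, 1.5, 16))   # a 16-bit ADC lets ~15.5 through
    print(hardware_dr(68000, 33.0, 14))  # 33 e- read noise: ~11, ADC is moot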

1631
EOS Bodies / Re: Will Canon Answer the D4s? [CR2]
« on: January 13, 2014, 10:16:43 PM »
It is impossible to have more than 14 stops of actual real-world DR with DSLRs because we only have 14-bit AD converters.

At the risk of looking like an idiot, I confess I don't quite follow this.   I've never examined de-mosaic algorithms, but I haven't assumed that they are limited to weighted averages of adjacent photosites.  I've always assumed that a de-mosaic algorithm could choose to include some "addition."  In other words, that the resultant pixel RGB values could be scaled up to accommodate adjacent photosites that are (nearly) maxed out or nearly-zero.  By analogy, it's the difference between rolling two dice and averaging, vs. rolling two dice and adding.

Certainly an individual photosite can't have more DR than permitted by its physical characteristics, but I don't see why the resultant post-demosaic scale can't, as it were, "go up to 11."

That would be post-demosaic, though. I mean, we can downsample images and gain DR as well...but again, post-demosaic. Our editing latitude in a tool like Lightroom comes from editing the RAW image. The RAW is effectively a digital signal, and we push that signal around with the exposure, highlight and shadow sliders, the tone curve, etc. Once you "rasterize" that digital signal, your editing latitude disappears. If you've ever tried to push a 16-bit TIFF around the same way you push a RAW around, you would understand how much you lose by converting to an RGB image...you lose a LOT. So, technically speaking, you could gain a stop or more of dynamic range simply by downsampling...however in order to downsample, you have to demosaic the RAW...and you lose your editing latitude. You have less noise, but you don't have quite the editability you once did.

In the statement of mine you quoted, I was speaking about the hardware aspects of a camera. The RAW, as it comes straight out of a DSLR, is limited in terms of dynamic range by the bit depth of the ADC. If you have a 14-bit ADC, you can't have more than 14 stops of DR...and to actually achieve 14 stops of DR, you would require a zero noise floor. Not even Sony Exmor has a zero noise floor, which is why DXO's "Screen DR" measure says 13.2 stops, rather than 14.4 stops, of dynamic range. DXO Screen DR is effectively a direct measure of the hardware capabilities...Print DR is a measure of a post-processed image (and, to be quite frank, we don't know exactly what kind of processing is used to produce the 12mp normalized "print" image they measure Print DR from...so it is, IMO, a sketchy measure.)

For all intents and purposes, until we have 16-bit ADC, the theoretical maximum hardware DR of a camera is 14 stops, and due to noise, the actual realizable dynamic range is going to be less than 14 stops.

1632
Lenses / Re: Get a 300mm or 600mm? Oh the agony...
« on: January 13, 2014, 10:05:38 PM »
jrista, thanks for the hints and encouragement.  I'm seriously thinking of buying the iOptron 3302B SkyTracker Camera Mount so that I can have more fun with the night sky.  Any thoughts on that idea?

Jack

Jack,

Regarding astrophotography, I would recommend that you research a little before you invest any money into something like this. I'm not saying that it's a bad thing, just that you should know before you spend. This is a good site to start with: http://www.astropix.com/INDEX.HTM

I'm sure that there are many ways of categorizing astrophotography, but off the top of my head, here are a few different types:
  • Lunar imaging - you likely won't need a tracking mount, just a good tripod and a good telephoto lens (+ teleconverter if you have it). The moon is bright enough that your shutter speeds will be relatively fast. This is a great place to start.
  • Wide angle sky images (e.g. the Milky Way) - This is where the iOptron could be useful, but even this can be successfully done with just a good tripod. See the link that I posted earlier for some great advice on how to do it. This is also a great place to experiment.
  • Planetary imaging (notably Jupiter, Saturn, and Mars) - a DSLR is not the best camera for this. You need loooong focal length (> 1000mm) to magnify the planets enough to see detail, and a webcam or similar device is better than a DSLR. That is because you'd be cropping out 95% of the DSLR image, and you'd want to stack at least dozens, preferably hundreds, of images.
  • Images of galaxies, nebula, clusters, etc - This absolutely needs a tracking mount, but it would definitely push the limits of the iOptron, depending on what you are looking to get out of your photos. To get a really sharp image like you see some of the advanced folks getting, you're talking at least the $1k range just for the mount. Yes, it *can* be done for less, but you'd need to put a fair amount of sweat and tears into your effort.
  • Solar imaging - the key requirement is to get a specialized solar filter to go on the FRONT of your lens/scope. If you're talking sunspot images, it isn't too bad. If you want beautiful pictures of solar flares, you're talking very specialized gear.

Don't get me wrong, I strongly encourage you to give astrophotography a try, but I would see what you can do with your existing equipment first, then decide if you want to continue before spending money on any specialized equipment.

Finally, if you're thinking about getting a 600mm for other reasons (e.g. birding), then you've already invested 80-90% of the money you need for some good deep-sky imaging. However, you're only 10-20% of the way through the learning curve (but that's half the fun, right?  ;) ).

Dave

You can do a lot more than just the Milky Way with an iOptron. People have been using devices like that to get pretty darn good Messier "deep sky" results...larger galaxies and nebula, open clusters, etc. A 100mm f/2.8 macro lens and an iOptron could get you some pretty phenomenal results of, say, Orion's Belt and Sword, which contains at least 5 nebula. Slap on a 135mm or 200mm lens (so long as the whole setup is under the 8lb weight limit), and you could zero in on, say, just the Orion Nebula. At 200mm, periodic error in the iOptron might limit how long you can expose, but exposing for a few seconds and excessive stacking can still get you some pretty phenomenal results. It can be quite a useful tool...I was actually planning to buy one not more than two months ago, when I decided instead to save my money for more ambitious goals (i.e. a Celestron EdgeHD 1100 DX.)
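
For what it's worth, the stacking math is simple (the 3:1 per-sub SNR is an arbitrary stand-in for a faint target):

    import math

    # Averaging N tracked subs improves SNR by sqrt(N), which is why many
    # short exposures can stand in for one long one (ignoring per-frame
    # read noise, which is what limits how short each sub can be).
    per_sub_snr = 3.0
    for n in (1, 16, 64, 256):
        print(f"{n:4d} subs: stacked SNR ~ {per_sub_snr * math.sqrt(n):4.0f}:1")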

For $300-500, things like iOptron's devices are a good way to get started in both wide field and deep sky astrophotography without spending thousands of dollars. Now, Jack should be aware, there is no way in hell that little device is going to hold his 300mm lens...just in case he was thinking he'd slap on the 2x TC and do some hard core imaging of deep sky objects. You need a much sturdier mount with much more accurate tracking and much lower periodic error (and, probably, some autoguiding as well...and all of that mounts up to considerable cost...don't expect to get away with less than a $5000 investment.)

1633
EOS Bodies / Re: Will Canon ditch the AA Filter?
« on: January 13, 2014, 09:53:43 PM »
why should we assume that suddenly, sensors are going to rapidly outpace lenses in terms of resolving power?

This would be where we see things differently; I just don't see peak resolution going up significantly in lens reviews. I guess most reviews aren't using a 20MP+ 1.6x crop sensor, but with numbers usually hitting 50lp/mm at the most, and at not particularly wide apertures, I don't see lenses at f2.8 approaching anything near 100lp/mm any time soon. Most of the improvement, both on the Sigma and Zeiss lenses, seems to be in bringing f1.4 up from being barely usable to being good (30lp/mm or below to near 40lp/mm), not from good to great.
If lens companies actually are capable of making 200lp/mm capable lenses I would love to see some.
I have to wonder if we aren't seeing resolving power of that sort on the high end P&S cameras though, maybe the cost of doing something like that on 35mm is just prohibitive.

You are misinterpreting the results of "lens" reviews. Lens reviews are not lens reviews, really...they are camera reviews. Actually, they are camera image reviews. The 50lp/mm is the OUTPUT resolution; it is not generally indicative of either lens or sensor resolution. As either lens or sensor resolution goes up, output resolution increases up to the limit imposed by the lowest resolution component of the system. We can use SQRT(LensBlur^2 + SensorBlur^2) to approximate system blur, which can then be turned into lp/mm. If we assume a 123lp/mm diffraction limited f/5.6 lens, and a 100lp/mm sensor, the actual output resolution would be (1/SQRT(0.00407^2 + 0.005^2))/2, or 77.5lp/mm. The output resolution, 77.5lp/mm, is considerably lower than either the 123lp/mm of the lens or the 100lp/mm of the sensor. Even a diffraction limited lens can suffer from additional losses in contrast due to transmission loss...actually, most do, at least around a percent or so. So, instead of 77.5lp/mm, we are probably closer to 65-70lp/mm. If you test a lens that is not diffraction limited at f/5.6 on that 100lp/mm sensor, you lose even more...50-60lp/mm.

A high end lens, like the 600mm f/4 L IS II, offers exceptional resolution wide open, has high transmission (thanks to things like nanocoating, which reduces transmission losses to the 0.1% or less range), and is truly diffraction limited by f/5.6, with even higher resolution at f/4 (even though it is not diffraction limited there). Assuming we test a 600/4 II on our 100lp/mm sensor, and assuming the lens resolves around 150lp/mm at f/4, the output resolution by the same formula is going to be about 83lp/mm.
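
Here is that math as a quick Python sketch (blur spot approximated as 1/(2 x lp/mm) and combined in quadrature, per the formula above...an approximation, not a full MTF convolution):

    import math

    # Combine lens and sensor blur spots (1 / (2 * lp/mm)) in quadrature
    # to approximate the output resolution of the whole system.
    def output_lpmm(lens_lpmm, sensor_lpmm):
        lens_blur = 1 / (2 * lens_lpmm)      # blur spot in mm
        sensor_blur = 1 / (2 * sensor_lpmm)
        return 1 / (2 * math.hypot(lens_blur, sensor_blur))

    print(output_lpmm(123, 100))   # diffraction limited f/5.6 lens: ~77.5
    print(output_lpmm(150, 100))   # 600/4-class lens at f/4: ~83
    print(output_lpmm(1e9, 50))    # "perfect" lens on a 50lp/mm sensor: ~50

Note the last line: no matter how good the glass, the output only creeps up on the sensor's own limit asymptotically.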

It should also be noted that there are not many sensors on the market that actually achieve a true 100lp/mm spatial resolution. When you factor in AA filters, IR cut filters, bayer CFA design, and even the microscopic amount of shake caused by the shutter, a sensor that has a 5µm pixel pitch is actually maybe going to achieve around 50-65lp/mm. Maximum output resolution is limited, on the upper end, by the lowest common denominator...so if the sensor can only resolve 50lp/mm, the result of a lens test with that sensor, regardless of how good the lens actually is, will NEVER be higher than 49.999...lp/mm. System resolution, output resolution, is asymptotically related to the lowest common denominator of the components involved in producing the output.

This is the real fundamental problem with lens tests...the lowest common denominator in the vast majority of them (in ALL cases where an actual camera is used and the lens is diffraction limited at around f/8 or faster) is the sensor. You could, theoretically, have a lens with infinite resolution. If your sensor only achieves 50lp/mm, then the "lens" resolution, as a result of testing the output image, would be 49.999... (repeating, ad infinitum) lp/mm.

Lens tests are really not lens tests. They are camera tests. Actually, to be quite frank...they are 100% bogus. They don't tell you jack about the lens. They tell you a whole lot about the camera, but they don't really tell you anything about the lens, because the lens is sensor-limited (hmm, I tink I jus coined a new term.) They don't tell you all that much about the actual resolving power of the lens, because the actual analysis is performed on the image that comes out of the camera.  Which is why it isn't surprising to see low end lenses in the 20-35lp/mm range, and high end lenses in the 45-50lp/mm range when tested with the average sensor, and rarely higher than 70lp/mm or so when tested with even the highest DSLR resolution sensors on the market (like Sony's 24mp APS-C sensors.)

It would have been hard to guess that sensor blur would be that bad. Just looking at the numbers one would assume they shouldn't be the limiting end.
I really hope the Megapixel wars get back into full swing.
(It will also be really interesting if Sony actually does produce a black and white sensor.)

Again, thanks for going into such detail.

Welcome.

I am not sure what the numbers say, however I do know what my eyes tell me. My eyes tell me that every X-Trans image I've seen always felt a bit soft. Now, they also tell me that the X-Trans performs pretty well at high ISO (for its pixel pitch and sensor dimensions, everything is relative here, so still not quite as well as a FF sensor), and that would be thanks to a greater degree of pixel averaging, which we all know reduces noise.


Lens tests are really not lens tests. They are camera tests. Actually, to be quite frank...they are 100% bogus. They don't tell you jack about the lens. They tell you a whole lot about the camera, but they don't really tell you anything about the lens, because the lens is sensor-limited (hmm, I tink I jus coined a new term.) They don't tell you all that much about the actual resolving power of the lens, because the actual analysis is performed on the image that comes out of the camera.

Calling lens tests 100% bogus is a bit harsh, IMO.  Granted, as absolute measures of lens performance, they're useless (that's one reason Roger/Lensrentals got an optical bench).  But they can be quite useful as relative measures of lens performance, assuming the sensor used as a benchmark is the same for the lenses being compared, and doesn't have substantially fewer MP than the camera with which you intend to use the lenses being compared.

Ahh, so there is something to give an objective measurement. I've been wondering why people don't just test the lens separately from any sensor. You would think that more review sites would want to invest in something like that, especially since it would make reading the results a whole lot easier.
And then all we need is tests for sensor blur and we could calculate the resulting sharpness of any system!

An optical lens test bench is a pretty expensive piece of equipment. I'd say well out of range for the vast majority of individuals and organizations that have taken up "lens testing" on the net in one fashion or another.

1634
EOS Bodies / Re: Patent: Microadjustment Automated
« on: January 13, 2014, 09:49:01 PM »
I don't think CDAF's lack of predetermined directionality would matter in this case. All that is necessary is for CDAF to achieve focus as a reference point. Once a reference point is attained (and CDAF CAN indeed achieve very good focus once it's done going through all its gyrations), you save the focus group position of the lens, and then all you need to do is test PDAF at a distribution of AFMA settings until you zero in on the one that most closely matches the CDAF position, and ensure that setting produces repeatable results.

The PDAF cycles aren't required. Once CDAF has found the optimum contrast point, the AFMA value can be determined almost instantly by evaluating the phase differential and then applying the appropriate AFMA adjust value to bring that differential to its minimum. This is one method outlined in Canon's patent.
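
The search variant is trivial to sketch in Python (a toy model only...this has nothing to do with the actual implementation in Canon's patent):

    # Toy sketch of AFMA-by-search: pick the AFMA offset whose PDAF focus
    # position lands closest to the CDAF reference. pdaf_position_at is a
    # callable mapping an AFMA value to the resulting focus position.
    def calibrate_afma(cdaf_position, pdaf_position_at, afma_range=range(-20, 21)):
        return min(afma_range,
                   key=lambda a: abs(pdaf_position_at(a) - cdaf_position))

    # Pretend this body/lens combo needs a +7 correction; the search finds it:
    print(calibrate_afma(0.0, lambda a: (a - 7) * 0.1))   # -> 7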

Well there you go then. Even better. Wasn't there a thread recently about the lack of Canon innovation? I think this patent would debunk that notion right off.

1635
EOS Bodies / Re: Will Canon ditch the AA Filter?
« on: January 13, 2014, 09:41:56 PM »

From what I've seen from the X-Trans, its sharpness doesn't come close to a lot of other offerings, even smaller form factors like Olympus, in many cases. That's kind of the tradeoff...you never really have moire (the moire you see in video is likely due to line skipping or something like that), but you don't get to actually utilize the sensor's full potential from a resolution standpoint, and there is clearly some overlap of larger blocks of pixels which is going to soften things up a bit.


I'm glad you wrote that.  I've been playing around for a couple of weeks with a Fuji X-E1, mainly to use with "legacy" manual lenses, based in part on the reports of remarkable sharpness that I've seen in various reviews.  Maybe the one I bought is defective, or maybe the kit lens that came with it is (or both), but almost none of the photos I've taken outdoors in good light with the kit lens (which also receives very high praise in some quarters for sharpness) looks really sharp to me.  At first I thought it was the result of inaccurate AF, but while that occurs far more than it should (a well-known problem with these cameras, it seems), many photos simply look soft (this seems to get worse as the distance from the subject increases), both in the camera's JPEGs and in raw files via Lightroom and Photo Ninja (the latter supposedly the best at dealing with the quirks of Fuji's raw files - DxO doesn't even try).  I don't have this problem with any other camera I own or have owned or have recently tried, so I don't think it's just my ineptitude.

I have no insider technical knowledge, let alone the sort of technical knowledge you and others here have, but there seems to be something going on with either the sensor or how Fuji cameras create and process raw files that results in a degree of softening and may explain in part the remarkable low noise performance of these cameras at high ISOs.  You can see this if you use the dpreview comparison tool - Fuji raw files without noise reduction seem very smooth compared to any other APSC-size sensor, and even seem to compare favorably in that regard to some FF cameras, but there's less detail/contrast/punch - they look as though they've already been subjected to a heavy dose of noise reduction.

I guess the moral of the whole story is: so long as digital sensors use regular patterns to lay out pixels, there really isn't any "free" way to eliminate moire. One way or another, it involves blurring the image. X-Trans is interesting, and if it actually used a larger grid of pixels in a MUCH higher resolution sensor to produce much lower resolution output (say 10mp, interpolated from say 100mp), then I think it would be extremely intriguing. You would have no moire, better color fidelity, richer luminance detail, the whole nine yards. By interpolating more pixels without reducing output size, you are effectively achieving the same thing as a low pass filter, however you are actually NOT being as discerning...you're blurring image detail regardless of frequency, whereas a low pass filter blurs WITH regard to frequency. A low pass filter is more discerning, and therefore it preserves more image detail while very effectively mitigating aliasing and moire.
