February 27, 2015, 08:26:37 AM

Show Posts



Messages - jrista

Pages: [1] 2 3 ... 333
1
EOS Bodies / Re: Possible Canon EOS 5D Mark IV Spec Talk [CR2]
« on: February 26, 2015, 09:42:05 PM »
This is the way I see it.  And of course Lee Jay and PBD can add/correct me if needed.

To me, increasing high ISO performance means increasing DR in high ISO/light-limited situations.  You need to increase FWC, max signal per pixel, or QE (however you look at it), and you need to lower read noise.  HOW you do those two things I'm not really commenting on, but if you can do both you can increase S/N at high ISO.  You already have less read noise with smaller pixels, so why can't you increase the size or efficiency of the photodiode in the pixel?  I've been in discussions about shrinking parts in and around the pixel to make way for larger photodiodes, for instance.  I can also see where FWC could be more important than read noise: you can overcome higher read noise by adding more signal, because signal is additive whereas noise adds in quadrature (as the square root of the sum of squares).  So in that case, larger pixels might still win.  This is all at equal sensor size and equal technology, of course.


Increasing FWC (probably a poor term to use; max saturation is probably better) is definitely a way to improve DR at higher ISO. Reducing read noise can certainly help, but at higher ISO read noise is already quite low, usually 3e- or less these days, and it's tough to complain about that. The saturation point at higher ISOs is usually only a couple thousand e-, sometimes as little as a few hundred e-, so increasing the charge capacity of each pixel is probably the better way to improve SNR at high ISO.

You aren't thinking about this correctly.

At high ISO, FWC is limited artificially by all the gain.  The cells are fully capable of holding a lot more charge, and they do, but the A-D saturates because of all the analog gain.

Assuming you are at high ISO because you are light-limited (not a bad assumption, IMHO), then the ONLY way to increase DR is to reduce read noise (assuming Bayer dyes, same QE, etc.).


I understand it perfectly.

I agree that if you are light limited (a possible use case for high ISO), then you would need lower read noise to increase DR. I disagree that's the only use case for high ISO. I frequently shoot at high ISO when there is plenty of light, easily enough to saturate the entire sensor in a fraction of a second, because I need motion-stopping shutter speeds for very fast motion (just watch a Chickadee or Bushtit sometime...those things NEVER stop moving, and they make these ultra-fast micro-moves that blur at shutter speeds slower than around 1/2000s or so). Increased EQE and increased FWC would result in greater IQ at the higher ISO settings I often need to use for these birds. Personally, I would rather not go with larger pixels to achieve that higher FWC, though...I want my resolution.

So, I disagree that the only way to increase DR at high ISO is to reduce read noise. Just look at the A7s...that sucker has a MASSIVE FWC (true full well capacity, the base ISO maximum charge capacity of the photodiode) of 155557e-! Its saturation point at, say, ISO 12800 is 1298e-. That's thanks to having a greater fill factor...more total light-sensitive surface area in the sensor, and per pixel, because the pixels are huge. The saturation point at the same ISO for the 6D is 604e-. Both cameras have similar read noise at that ISO, 1.8e- and 1.6e- respectively, but one has 9.7 stops DR and the other has 8.4 stops. Why does one have more DR than the other, if the only way to improve DR at high ISO is to reduce read noise? The camera with the higher DR actually has higher read noise! The 6D has a lower fill factor, less total light-sensitive surface area.

The capacity of a photodiode is primarily limited by its area, and its sensitivity is limited by area and EQE. Sensor sensitivity is affected by the total light-sensitive area of the sensor. So if Canon had made the 6D with the 65nm process Samsung is using, they could have increased the photodiode area. I don't have time at the moment to calculate how much area increase Canon could achieve without changing pixel size, but suffice it to say they could increase light gathering capacity and sensitivity by increasing photodiode area. There are still EQE losses due to the use of microlenses and the CFA. Improve the microlenses (aspheric lenses have been researched to better focus off-axis light onto photodiodes), replace the CFA with color splitters, or move to BSI and gain nearly the entire surface area of the sensor as light-sensitive area, and you increase the amount of light reaching each photodiode. That makes them saturate faster, utilizing the increased capacity and therefore allowing you to reduce gain further (which reduces the amplification of everything, noise included). That improves IQ, even at high ISO.

2
Landscape / Re: Deep Sky Astrophotography
« on: February 26, 2015, 09:07:52 PM »
Nice work, Schmave. :) Glad you gave it a try. I would be willing to bet, with a little more tolerance for some noise, that your data has more information in it than you think. It's a classic beginner "mistake" to try and make the background totally black. Thing is, space isn't black (very, very few areas of the sky actually have a black background, and none of the areas with emission nebulae, which are all along the Milky Way, have a black background at all...there are tons of faint dust and filaments of emission nebulae scattered all about the Milky Way.)


I would work on keeping your background level above black...maybe 20-30 levels. You should be able to bring out more of the nebula then. You will have more noise, but noise is just a fact of life with astrophotography...we work with such weak signals. ;) Keep it up! You've just started a very long journey. The next step would be to get some kind of tracking mount...be it a Polarie or AstroTrac, or something larger and more expensive. Once you can track, you'll see your ability to get deeper exposures increase dramatically.

3
EOS Bodies / Re: Possible Canon EOS 5D Mark IV Spec Talk [CR2]
« on: February 25, 2015, 02:58:15 PM »
This is the way I see it.  And of course Lee Jay and PBD can add/correct me if needed.

To me, increasing high ISO performance means increasing DR in high ISO/light-limited situations.  You need to increase FWC, max signal per pixel, or QE (however you look at it), and you need to lower read noise.  HOW you do those two things I'm not really commenting on, but if you can do both you can increase S/N at high ISO.  You already have less read noise with smaller pixels, so why can't you increase the size or efficiency of the photodiode in the pixel?  I've been in discussions about shrinking parts in and around the pixel to make way for larger photodiodes, for instance.  I can also see where FWC could be more important than read noise: you can overcome higher read noise by adding more signal, because signal is additive whereas noise adds in quadrature (as the square root of the sum of squares).  So in that case, larger pixels might still win.  This is all at equal sensor size and equal technology, of course.


Increasing FWC (probably a poor term to use; max saturation is probably better) is definitely a way to improve DR at higher ISO. Reducing read noise can certainly help, but at higher ISO read noise is already quite low, usually 3e- or less these days, and it's tough to complain about that. The saturation point at higher ISOs is usually only a couple thousand e-, sometimes as little as a few hundred e-, so increasing the charge capacity of each pixel is probably the better way to improve SNR at high ISO. There is also the simple fact that you really want to improve the signal, and reducing noise doesn't exactly do that per se...only increasing the charge capacity, and the charge accumulated per unit time, does. Increasing Q.E. can certainly help, but at very high ISO you suffer from clipping problems (so while high Q.E. might give you the necessary sensitivity, if it isn't paired with a capacity increase it may be a wasted improvement.)


Increasing photodiode size is certainly one way to improve charge capacity, but there have been other recent innovations (usually for super small pixel sizes) that use layered photodiodes to capture deeper-penetrating photons and convert them to charge as well (not for the purposes of color, just increased charge capacity). There have also been innovations in photodiode design...charge accumulates at a layer at some particular depth inside the PD, around where the N-type and P-type silicon interface. I've read a couple of patents that change the curvature of that interface layer to increase capacity without increasing PD size, or that change the structure of that layer. A curved or shaped layer has more surface area, and thus more room for electrons and electron holes to accumulate.


If you could double your signal with the same read noise, you could certainly gain some DR. For example, if you have 300e- saturation at ISO 12800 and 3e- RN, you have ~6.64 stops DR. Increase saturation to 600e- and you have ~7.64 stops of DR. You gained a stop. Doubling the charge in a photodiode without increasing its size could be tough, so you're unlikely to see quite that much of a change without some other technological innovations: a reduction in process size, use of a curved N/P interface layer in the PD, use of BSI, etc.
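The arithmetic is just DR = log2(saturation / read noise); a quick sketch using the hypothetical numbers from the example:

```python
import math

def dr_stops(saturation_e, read_noise_e):
    # Engineering dynamic range in stops: log2(max signal / noise floor)
    return math.log2(saturation_e / read_noise_e)

before = dr_stops(300, 3)  # ~6.64 stops at 300e- saturation, 3e- read noise
after = dr_stops(600, 3)   # ~7.64 stops -- doubling saturation gains exactly one stop
print(f"{before:.2f} -> {after:.2f}")
```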

4
Landscape / Re: Deep Sky Astrophotography
« on: February 22, 2015, 05:13:35 PM »
Here is another. I just had a run of six clear nights...something I've never seen before...and got a ton of data on several targets. Most were galaxies, the one nebula was Rosette. This is an 11 hour integration (164x240s subs).

5D III + 600mm f/4 + 1.4x (840mm 1.55"/px) on Atlas mount
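For reference, that image scale follows from the plate-scale formula, arcsec/px = 206.265 × pixel pitch (µm) / focal length (mm). A quick sketch, assuming a 5D III pixel pitch of roughly 6.25µm:

```python
# Plate scale: arcseconds per pixel = 206.265 * pixel pitch (um) / focal length (mm)
def image_scale_arcsec(pixel_pitch_um, focal_length_mm):
    return 206.265 * pixel_pitch_um / focal_length_mm

# Assumed 5D III pixel pitch of ~6.25um; 600mm f/4 + 1.4x extender = 840mm
scale = image_scale_arcsec(6.25, 840)
print(round(scale, 2))  # ~1.53 "/px, in line with the quoted 1.55 "/px
```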

Two versions, one "narrow band" like and one "natural color":

This is my longest integration to date, at 11 hours. I did this from my back yard with an IDAS LPS-P2 light pollution filter. That's replacing my Astronomik CLS filter, and it's actually quite amazing. Not entirely dark site quality data, but quite good data nevertheless.

John, 

Outstanding details!  This must look great at 1:1.  One comment: on the second image, the blue halos around the stars are a bit distracting.  Cleaning those up, and a little color correction (less blue) in the stars, would really strengthen that image. I'm guessing these are left over from your light pollution filter.   Excellent work!


Actually, the first image is more what I got straight out of camera. The IDAS LPS-P2 actually produces great stars with great color balance, but it doesn't handle the deeper pinks of emission nebulae as well. The blue halos are due to the processing, which I am working on. I need to figure out how to properly reduce the stars in both the luminance and RGB images before combining them. I am also working on doing some pre-stretch masking on the stars in the second color version to prevent them from becoming overly blue (a more neutral, white-based star color would be ideal.)

5
EOS Bodies / Re: Possible Canon EOS 5D Mark IV Spec Talk [CR2]
« on: February 22, 2015, 05:09:25 PM »

Not sure why no one else has suggested this, but:


18mp x3 layer sensor?


That would make it 54 million photodiodes, but spatially an 18 million pixel sensor.

Seems a bit odd, but it is after all a test prototype. I don't see Canon reducing the MP in a 5-series unless they plan on using the 20.2 MP sensor from the 6D but with DPAF. 12FPS would be nice, but would they really bring the frame rate into the 1DX realm? The current 5D3 is 6FPS; 8-10FPS seems more realistic in a final product.  A "quantum leap" in DR and FPS in one body would be fantastic, not that the current model is a slouch whatsoever. And no, you don't need mirrorless to do it. None of them do it now. 20FPS doesn't do a lick of good without a reliable AF system, and mirrorless isn't there yet.


They might reduce it if it is a layered sensor. If it is 18mp x3, then it would have 54 million photodiodes. If Canon is pairing the camera with a DIGIC 7...then they should have the necessary throughput (each DIGIC 7 chip would need to handle around 600MiB/s throughput, which is a little more than double the current DIGIC 6 chips.)


It seems more logical to me that it would be 10fps, in which case each DIGIC 7 (or whatever they call it) would end up handling 500MiB/s throughput, or exactly double the current DIGIC 6.
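As a back-of-the-envelope sketch (the ~6% masked-border-pixel overhead and the two-chip split are assumptions here, not published figures), the math works out to roughly those per-chip rates:

```python
# Hypothetical 54MP (18mp x3) layered sensor, 14-bit readout, two DIGIC chips.
# The ~6% factor accounts for masked border pixels read alongside image pixels.
def per_chip_throughput_mb(effective_px, fps, bit_depth=14,
                           mask_overhead=1.06, n_chips=2):
    total_px = effective_px * mask_overhead
    bytes_per_frame = total_px * bit_depth / 8
    return bytes_per_frame * fps / n_chips / 1e6  # MB/s

print(round(per_chip_throughput_mb(54e6, 10)))  # ~501 MB/s per chip at 10fps
print(round(per_chip_throughput_mb(54e6, 12)))  # ~601 MB/s per chip at 12fps
```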

6
EOS Bodies - For Stills / Re: Coincidence or?? MP/SEC
« on: February 17, 2015, 03:25:29 PM »
Hmm, yeah, calibration with the bias signal might benefit from a higher bit depth. You would more accurately represent the differences in each column of the bias... Interesting.


Regarding the 7D II, as far as I know the dual pixel read for AF is a literal dual-pixel read; that would be 40.2mp in that case. I wonder if there are two separate reads, though...one for one half of each pixel, then another for the other half? It is also possible that each AF read is converted to only 7 bits. Dunno. The specifics might be contained within the DPAF patents, though...

7
EOS Bodies - For Stills / Re: Coincidence or?? MP/SEC
« on: February 16, 2015, 07:43:30 PM »
Output data at 14fps would be JPEG, but the input data from the sensor would still be 14-bit. I don't believe Canon does any kind of downgrading on the bit depth of the ADC units (I've never found any information indicating as much anyway), so I don't believe there is a 12-bit 14fps read mode. The output data rate for JPEG (when writing to the memory card) would certainly be lower, but the input rate into the DIGIC processor would still be 14-bit.

It's difficult for anyone outside of the development team to do any more than speculate, but I'd argue that if the camera can really handle 14 bit readout at 14 FPS, then why force JPEG only on users? The buffer is still there, and if the readout and JPEG engine can keep up with 14 bits, the buffer should be able to too - so instead of forcing JPEGs on the user, why not give them a choice of a smaller buffer depth and raw? Card speeds only become relevant once the buffer fills up, and the target audience of the 1D X should in Canon's eyes be capable of deciding which trade off to choose.

I'm under the impression (falsely or not) that it's JPEG only as the readout doesn't support 14 bit CR2 files at 14 FPS.


The readout is simply transferring charge and converting it to digital numbers. JPEG doesn't come into play until that native signal information has already entered the DIGIC chip, since it is DIGIC that is actually performing the conversion to JPEG. The sensor isn't going to natively spit out 8-bit JPEG data, as that requires processing, like color space conversion, compression and encoding, etc.


Why does Canon limit 14fps to JPEG? I don't know the answer to that. I am just quite certain that neither the sensor nor the off-die ADC units spit out JPEG data natively. It is possible the ADC units switch to 12-bit mode at 14fps; Samsung certainly does it at 15fps. Why Canon even bothers with a 14-bit ADC in the first place, when their data barely supports more than 11 bits worth of information because of system noise levels, is beyond me. If a 12-bit ADC would allow them to process 17.5fps without any loss in DR, I don't know why they wouldn't use one. That said, I've never encountered any information anywhere, including in patents, indicating that Canon switches to 12-bit output from their ADC units. There could very well be a different reason they had to limit 14fps to JPEG.

8
Canon General / Re: Decline in DSLR sales explained
« on: February 16, 2015, 03:54:01 PM »
Totally ridiculous that Canon (and Nikon) continue to refuse to put touch screens on top of the line DSLRs. I defy anyone who has used a 70D to argue that higher end cameras would not benefit from touch screens. Yet, we have this ridiculous concept that they aren't "professional."


I think that's the inverse of the question that needs to be asked. Do professionals WANT that stuff? Personally, I don't really care whether Canon puts a touch screen on their xD lines...it might be useful in the menus, but the way I actually use the camera, the LCD is off and black. I use the buttons and dials for everything, frequently without ever removing my eye from the viewfinder.


Adding more technology is fine, but why expend the resources doing it if the statistics (whatever they may be, just making a case here) show that most of your buyers wouldn't actually care about or use the feature much of the time, if any of the time? In the professional (or semi-pro or avid enthusiast) world, I think a lot of Canon's sales come from return buyers who are looking for something familiar. I also think that most focus more on the traits that actually affect IQ, vs. a new (and certainly potentially useful) way of interacting with certain features of the camera. If someone has all the necessary procedural memory to control Canon DSLRs because Canon has stuck with the same key button layout for each line for a while now, a touch UI suddenly becomes a backburner item.


I think the consumer grade lines are totally different. The xxD and xxxD/xxxxD Rebel lines are catering to an entirely different audience. Things like touch screens, or LTE and WiFi, internet accessibility, maybe some built-in app features like Instagram or Facebook, etc. are not only useful features, but as the video was getting at, essential features for that segment of the DSLR market to survive. But that's an entirely different market segment, at least the way I look at it. It's the segment that isn't catering to return customers who want familiarity and the ability to instantly access critical functionality at the press of a button, because having that procedural-memory/muscle-memory speed and accuracy when using the device isn't the most critical thing for consumers. The most critical thing for consumers is (apparently, these days) the social aspects of photography.


In that respect...even IF Canon and Nikon and the rest manage to somehow meld smartphone and DSLR into a usable device...are people actually going to give up the convenience of their existing and easily accessible pocket cameras (smartphones) for something larger, bulkier, and still more complex (interchangeable lenses, at the very least)? Smartphones are like the disposable camera of the modern age: Tough to beat for the average consumer.

9
EOS Bodies - For Stills / Re: Coincidence or?? MP/SEC
« on: February 16, 2015, 03:02:53 PM »
At full tilt, many Nikon bodies reduce the data per frame by dropping from 14 bit to 12 bit raw.

Canon don't offer that, but the 1D X does have compromises at 14 FPS other than the mirror staying up (leading to viewfinder blackout and no AF tracking) - the output is JPEG only. That's likely due to the sensor output dropping to 12 bit as a workaround for the high data rate. If that's the case, the sensor readout and processing has gone from 12 FPS @ 14 bits to 14 FPS @ 12 bits - which is an identical data throughput.

Add bit depth into your calculations, and the 5Ds works out at ~443 MB/s.
The 1D X tops out at ~380 MB/s at both 12 and 14 FPS.

The 5Ds is no speed demon purely in terms of FPS, but that is an unprecedented data throughput for a Canon stills camera.


Output data at 14fps would be JPEG, but the input data from the sensor would still be 14-bit. I don't believe Canon does any kind of downgrading on the bit depth of the ADC units (I've never found any information indicating as much anyway), so I don't believe there is a 12-bit 14fps read mode. The output data rate for JPEG (when writing to the memory card) would certainly be lower, but the input rate into the DIGIC processor would still be 14-bit. That would mean the input throughput for the 1D X is basically the same as for the 5Ds, around 465MB/s plus any additional overhead.


I still believe that DIGIC5+ and DIGIC6 are 250MB/s input throughput per chip, or 500MB/s maximum total input throughput.

10
EOS Bodies - For Stills / Re: Coincidence or?? MP/SEC
« on: February 16, 2015, 02:12:18 PM »
This is a ridiculous thought..

Canon 5DS / 5DSR - 50.6mp - max 5/sec burst = 253mp per sec
Canon 1DX - 18.1mp - max 14/sec burst = 253mp per sec

Two highest - when summed - figures from the Canon line-up in terms of image size.

As they use different processors is there another limiting factor? or is this just stupid?

My vote is stupid, but I wanted to share it nonetheless.


Remember, every frame read includes the masked border pixels for calibration purposes. That increases the megapixel count above the number of pixels that actually end up in your final images, usually by around 4-6%. The 1DX actually has 19 million pixels. The 5Ds is likely to have close to 53 million pixels. Total pixels, masked ones included. So total megapixel counts per second are more like 266mp.


Every pixel is converted to a 14-bit number, and that whole chunk of data is read 14 times per second for the 1Dx, or five times per second for the 5Ds. The data throughput of the DIGIC chips MUST be able to handle the RAW PIXEL INPUT throughput, which is:


Canon 1D X: 19,000,000px * 14bit/px / 8bit/byte * 14fps = 465,500,000byte/s
5Ds: 53,000,000px * 14bit/px / 8bit/byte * 5fps = 463,750,000byte/s
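Those two figures can be sanity-checked in a couple of lines:

```python
def input_throughput_bytes(pixels, fps, bit_depth=14):
    # Raw pixel input rate into the image processors, in bytes per second
    return pixels * bit_depth // 8 * fps

throughput_1dx = input_throughput_bytes(19_000_000, 14)
throughput_5ds = input_throughput_bytes(53_000_000, 5)
print(throughput_1dx, throughput_5ds)  # 465500000 463750000
```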


When you throw in overhead, both cameras are very likely using DIGIC processors capable of handling an input throughput of 250MB/s per chip (since two DIGIC chips are used).


The big difference between DIGIC5 and DIGIC6 is clearly not throughput. The 7D II requires less than 400MB/s total, so it only needs 200MB/s throughput for its pair of DIGICs. The big difference with DIGIC6 is that the chips do a lot more onboard processing of each pixel than the DIGIC5 chips did, reducing noise and whatnot. Overall processing power increased, even if data throughput did not. Supposedly that increased processing results in better images. Ignoring DR, I think that is the case...OOC images from the 7D II and 5Ds definitely look better. The DIGIC 6 processors don't seem to help at all in the area of read noise (and one wouldn't expect them to), but that doesn't change the fact that these chips are doing more processing than the DIGIC 5 chips.

11
Canon EF Prime Lenses / Re: Canon EF 600mm f/4L IS II USM
« on: February 16, 2015, 12:03:27 AM »
Yeah, the Atlas Pro is basically a ready-to-go version. It's about $500 more expensive, whereas the belt mod itself is $200. If you have mechanical skill, I would say get the original Atlas and the belt mod, save the $300. If not, the Atlas Pro is a good mount.


I'm also not really trying to discourage, just make sure you know exactly what's involved. If you are up for it, it's an awesome hobby, and it is most definitely rewarding.

12
Canon EF Prime Lenses / Re: Canon EF 600mm f/4L IS II USM
« on: February 15, 2015, 10:05:24 PM »
Candc, that is indeed the mount I have. My mount has also been upgraded with the Rowan Engineering belt mod, which eliminates a number of gears in favor of pulleys and belts. That eliminates a good deal of backlash, makes the mount more responsive, makes it quieter, etc. The belt mod is another $200. I've also hypertuned it, which improves performance. You can get a kit for doing that for $200, or buy the various parts you require yourself for less than $60. Hypertuning requires mechanical skill, a VERY careful hand, and a LOT of patience and time...it's a slow, methodical process, and I don't recommend it unless you really know what you are doing.


Both the hypertune and belt mod gave me the performance I needed to get these images. Before them, my tracking was around 2-3" RMS, while my image scale was 1.5"/px. Combined with seeing, before modding my stars were usually 6" or so in size, whereas after modding they are about 2.8" in size. That really matters once you get down to it...tight stars are a key factor in a quality image. I recommend the belt mod regardless of your mechanical skill...it greatly improves the performance of the mount.


Regarding getting started in astrophotography, I don't know of any books. I'm a self-starter; I have taught myself everything I know my entire life. My knowledge is all based on theory I knew, theory I learned (usually just researching on the internet), practical experience, and trial and error. I am happy to help when I can, but I don't own any astrophotography books, so I can't really help there. I do know a guy who sells CDs for beginners: Jerry Lodriguss, astropix.com. You could start there.

Astrophotography is the most complex form of photography on the planet, by far. I love to see more people getting interested in it...but before you drop a lot of cash on the hobby, make sure you have the patience for it. The Rosette images above? That was five solid nights of imaging, four hours a night on Rosette (and the other four on other targets). That is 20 hours of just image acquisition. Another few hours to gather darks and flats (I needed new darks...usually I use a library for those). Then over five hours of pre-processing: nearly two for frame analysis and rejection (based on a variety of technical factors in each frame...FWHM (Full Width Half Maximum, a measure of star diameter), eccentricity (a measure of star roundness), noise and noise support, etc.), and over three hours of actual image integration work. That was all done in PixInsight with the SubframeSelector and Batch PreProcessing scripts and the PI integration tools. After that came about nine hours of extensive and detailed processing. That involved a lot of heavy work in PixInsight to remove background gradients (gradients are a bitch, and you end up with lots of them in the city because of LP) and to calibrate color. I extracted an artificial luminance channel and processed it with deconvolution, noise reduction, star reduction, more noise reduction, stretching, and finally contrast tuning. After processing the lum, I went back to the original image and started processing the color (I noticed a marked change in color after one step, made a copy at the previous step, and then split the processing into two paths to get both images above). The color processing involved extensive, heavy noise reduction (though you have to be careful with that to avoid star bloat and color fringing in overly softened detail), star reduction, stretching, color contrast, and color toning.
Finally I had to combine the luminance with both sets of RGB data, and did further processing on those to again enhance contrast and bring out detail without exacerbating noise. That was over eight hours of processing right there. After that, I started working on all the exports. First I exported to Photoshop for final processing (vertical banding NR), then back into PixInsight for cropping, resizing, and export of multiple versions of the data at multiple sizes, including full size, 50%, and small web size, in both lum-only and RGB versions. I spent a little extra time creating these two images with detail crops as well:







All in all, it was over nine hours of processing before I was finally finished, and about 14 hours of total processing time. The entire process from start to finish including image acquisition was over 35 hours. On ONE object.

I'm not trying to show off here...I just want you guys to understand the level of effort required to make images like these. It is extremely time consuming, requires very dedicated effort, expensive equipment, and both diligence and patience to get through the first six to eight months (which are usually very difficult, as you struggle with all the mechanical and optical and electronic aspects of your gear before you finally learn how it all works, figure out how to keep it all balanced and operating smoothly and guided well and all that). I spent the last year (literally, I started doing astrophotography Feb. 12, 2014) learning how to do all of this, and the images I've shared here are indicative of both the experience I gained over the last year, as well as all the base theoretical knowledge I had going into things.

I LOVE to see new people get interested in astrophotography, for sure. But it's expensive. Excluding my 600mm lens (which is a big part of the reason I was able to progress so fast...it is a flat fielded 150mm aperture f/4 "telescope"...those things cost at least $10,000 in the astro world anyway, and often significantly more; most people start out with something like an 80mm APO Triplet at f/6 or so), I've put over three grand into mount, mount upgrades, guide equipment, filters, large capacity deep cycle batteries, laptop, software, and a wide range of accessories. All of that is very low end equipment, and the hypertuning/belt mod barely gets the mount up to snuff (and a lot of the time, it simply isn't...and my IQ suffers considerably...I can share some examples of how.)

If you love this kind of stuff, love tinkering and fiddling, and want to learn a highly technical form of art, then you won't have any problems. On the other hand, if you think it just takes a night of pointing a scope at the sky and a couple of hours of processing to create images like Rosette, or the Orion images, I recommend you at least learn more about the hobby before you spend money. Even a very low end mount is $1200-$1500, and that is the bare minimum. I would rather you guys be educated enough to know what you're getting into than recommend you just dive in and spend a lot of money on a hobby you may find you don't want to, or don't have the time to, invest a lot of effort into. There are simpler ways of starting as well. You can get a Polarie or AstroTrac and use a DSLR with smaller, wider-field lenses to get amazing images for about a grand. That's a much cheaper way to start, and while it still requires an investment of time, it's not nearly as much as it took to produce the Rosette images I just shared.

13
EOS Bodies - For Stills / Re: Why not 16 bit files?
« on: February 15, 2015, 04:40:44 PM »
Canon's banding is absolutely due to their hardware. This is a master bias frame from my Canon 5D III:



Note the vertical banding? Here is a better view of it:



This is a superbias frame, where the original master has been run through a special noise reduction algorithm to leave behind just the base bias signal itself. Note...bias SIGNAL. This is IN the sensor, due to the voltage applied to each column of pixels.

Every sensor has this. There are ways of reducing this banding, such as per-column ADC with adaptive adjustment to balance the offsets, producing a flat bias.

The bias signal itself is easy enough to eliminate...you simply offset. The problem with Canon's bias is that it varies a lot...so you cannot offset the entire thing. Each column can often deviate significantly from others, and when they do deviate significantly, that leads to the fixed banding that you can see in photographs.
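The standard fix for the fixed component is master-bias calibration: average a stack of zero-light bias frames so the random read noise averages out, then subtract the result. A minimal sketch with synthetic numpy data (all offsets and noise levels here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sensor: each column carries a fixed bias offset (the vertical
# banding), plus fresh random read noise on every read.
height, width, n_bias = 64, 64, 32
column_bias = rng.normal(100, 5, size=width)

def read_frame(signal=0.0):
    """One sensor read: signal + fixed column bias + random read noise."""
    return signal + column_bias + rng.normal(0, 2, size=(height, width))

# Master bias: average many zero-light frames so the random noise averages
# out, leaving (mostly) just the fixed column pattern.
master_bias = np.mean([read_frame() for _ in range(n_bias)], axis=0)

# Subtracting the master bias from a light frame removes the fixed banding.
light = read_frame(signal=50.0)
calibrated = light - master_bias

print(np.std(light.mean(axis=0)))       # column-to-column spread before: ~5 levels
print(np.std(calibrated.mean(axis=0)))  # spread after: well under 1 level
```

With real data the same subtraction happens on raw frames before dark and flat calibration.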

This is just one source of banding, but it is the source of fixed banding. It's not the most pronounced banding, a lot of the horizontal and vertical banding that shows up in Canon cameras is largely random, and has a significantly larger standard deviation. These other sources of banding come from the row and column drivers, from readout electronics, and from signal interference.

Canon cameras can also still have problems with amplifier glow. For example, a master dark frame from my 5D III:



The amp glow is obvious. The horizontal banding is more obvious in this image as well, as I have subtracted the superbias frame from it, leaving behind only thermal signals and read noise.

As for why not 16-bit: Canon cameras, due to their noise levels, can't even make full use of 12 bits of data, let alone 14 or 16 (or more). Again using my 5D III as an example, the full well capacity at ISO 100 is (according to sensorgen) 68151e-. The read noise at ISO 100 is 33.6e-. The number of discretely discernible tonal levels is 68151/33.6, which comes out to 2,028.3, or 2028 discrete tonal levels. Noise affects the entire signal...not just the shadows. A 12-bit number can represent everything from 0 through 2^12-1, or 4095. Since 2028 < 4095, Canon is technically wasting bits by using a 14-bit ADC. At best, they are representing the deviations caused by noise more accurately, but they haven't actually increased tonal resolution to any meaningful degree. They are just wasting two bits. If they went to a 16-bit ADC, they would be wasting four bits.
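That calculation in a couple of lines, using the sensorgen figures quoted above:

```python
import math

full_well_e = 68151    # 5D III ISO 100 full well capacity (sensorgen figure)
read_noise_e = 33.6    # ISO 100 read noise

levels = full_well_e / read_noise_e        # distinct tonal levels above the noise
bits_needed = math.ceil(math.log2(levels)) # smallest ADC bit depth that covers them
print(round(levels), bits_needed)          # 2028 11
```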

The above should also be fairly obvious from the limited dynamic range of Canon cameras. Canon DSLRs thus far have around 11 stops and change of dynamic range. Measured dynamic range is capped by the bit depth of the ADC, so if Canon's sensors delivered significantly more than 14 stops (i.e. enough to actually make use of a 16-bit ADC), every single Canon camera would be pinned at the 14-bit ADC's ceiling of ~13.99 stops. Since most Canon cameras have 10.9-11.4 stops of DR, it is safe to conclude that they are unable to effectively use the full 14 bits of data their RAW format supports. When Canon cameras are all delivering 13.9 stops of DR at ISO 100, then we can start to wonder why they aren't using a 16-bit ADC. :P

14
Landscape / Re: Deep Sky Astrophotography
« on: February 15, 2015, 01:45:45 PM »
Here is another. I just had a run of six clear nights...something I've never seen before...and got a ton of data on several targets. Most were galaxies, the one nebula was Rosette. This is an 11 hour integration (164x240s subs).

5D III + 600mm f/4 + 1.4x (840mm 1.55"/px) on Atlas mount

Two versions, one "narrow band" like and one "natural color":






This is my longest integration to date, at 11 hours. I did this from my back yard with an IDAS LPS-P2 light pollution filter. That's replacing my Astronomik CLS filter, and it's actually quite amazing. Not entirely dark site quality data, but quite good data nevertheless.

15
Canon EF Prime Lenses / Re: Canon EF 600mm f/4L IS II USM
« on: February 15, 2015, 01:44:19 PM »
Here is another. I just had a run of six clear nights...something I've never seen before...and got a ton of data on several targets. Most were galaxies, the one nebula was Rosette. This is an 11 hour integration (164x240s subs).


5D III + 600mm f/4 + 1.4x (840mm 1.55"/px) on Atlas mount


Two versions, one "narrow band" like and one "natural color":








This is my longest integration to date, at 11 hours. I did this from my back yard with an IDAS LPS-P2 light pollution filter. That's replacing my Astronomik CLS filter, and it's actually quite amazing. Not entirely dark site quality data, but quite good data nevertheless.
