October 24, 2014, 11:17:40 PM

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - jrista

1006
Animal Kingdom / Re: Show your Bird Portraits
« on: June 05, 2014, 09:44:26 PM »
Wild Turkey...

Great shot! Love it!

1007
Third Party Manufacturers / Re: Nikon's D800E 30% sharper than D800
« on: June 05, 2014, 08:53:43 PM »
That would be because scoring lenses, sensors, or any other aspect of a camera is a ridiculous idea that completely decimates the consumer's ability to make an EDUCATED choice about their camera purchasing decisions. Scoring hides all the details, and in the process throws away a lot of relevant information that is CRITICAL to making those decisions. DXO's lens scores are only valid within the context of DXO...they have no meaning in any other context, and are therefore valueless in a store when you're holding two actual lenses in your hands. DXO's lens scores are obviously biased, as they "score" the ridiculously cheap low end 50mm f/1.4 higher than the ludicrously expensive ultra high end 600mm f/4 L II. Sensor scores completely ignore the rest of the complex systems that cameras actually are, making no allowance for anything like AF functionality, metering functionality, or even the aesthetic aspects of camera construction...ergonomics and menu systems.

No one has tried to replicate what DXO does...because what DXO does is inane.

Strange that I don't see any significant criticism anywhere except forums dominated by Canon owners. If DxO was so irrelevant then the disdain would be more widespread.

Oh, it's not just here. It's rather mild here...you should see the stuff people say about DXO over on DPR Forums! :D Them ppls is crazy!

Quote
It isn't a matter of predicting it. We know exactly why it happens and how it happens.

Obviously not everyone shares the same defeatist attitude as you do; they've put R&D effort into understanding and combating it.

These concepts are very well understood. Deconvolution, waveform interference, aliasing and moire, etc. have all been researched heavily for decades. Most of the reason we haven't seen software tools for certain things is that we haven't had the computing power in the past. Deconvolution algorithms are complex, some highly complex. Specialized software used to be the only way to get advanced tools like this, and it was usually very targeted (i.e. ONLY performing one or two types of denoising or deconvolution), and usually so slow that the algorithms could take minutes or even much longer to complete.

Today we have significantly more computing power, such that it takes seconds or less to apply these complex algorithms to increasingly large images. A lot of that is also due to optimizations made to the algorithms, gaining speed, often at the cost of precision (most photographers don't need scientific accuracy.) So, things like moire removal are now showing up. It isn't because we suddenly learned something about moire. No...we already knew all about it. We've known about interference patterns for hundreds of years. We just have more computing power now. It's easier to make tools that address, or try to address, some of these kinds of problems now, because compute cycles in the billions per second exist on people's desktops as a matter of course these days.

There isn't a lot left to discover here. Not with bodies of research spanning decades. I guarantee you, Nikon won't be producing any miracle-working demoire algorithms any time soon. They may be able to get slightly better results with color moire removal (which, BTW, is currently what all moire algorithms are...they only remove the color aspect of the patterns, they don't actually remove the interference patterns.) They may be able to better identify which pixels and colors are primarily aliased, and only affect them, rather than desaturating and blurring pixels that aren't primarily aliased. That would certainly be an improvement, but it doesn't really change much in terms of how much of the moire pattern itself is removed.



The big question is...when there is already a highly effective means of preventing moire, and a means that introduces a PREDICTABLE and DECONVOLVABLE pattern of blurring that can EASILY be reduced in post with a basic sharpening algorithm...why spend so much time, effort, and money removing AA filters, and even more trying to solve the problems moire creates in post?

It's a total, utter waste. Moire is a destructive, unpredictable form of image artifact that is unevenly distributed throughout your image. Those kinds of artifacts are the worst to deal with. They are the most complex, and the ones you really want to PREVENT, not REACT to. Its identification and effective removal are highly difficult in post. This is in contrast to simple high frequency blurring, which is highly predictable, evenly distributed, and extremely easy to counteract in post with either basic sharpening (effective, but not ideal) or more advanced deconvolution (wavelet deconv. or something similar, far more effective).

People are too concerned with sharpness, especially OOC sharpness. Sharpness is something we understand exceptionally well, and something for which we have some truly exceptionally powerful tools to enhance (just check out PixInsight...some of the tools in there are simply mindblowingly good.) Why obsess over OOC sharpness, then have to DEAL with moire and other aliasing in post and WAIT for new and more advanced, power-hungry algorithms to come out that...don't really solve the problem (not without being similarly destructive in other ways...such as blurring detail or by introducing OTHER kinds of artifacts)?

Optical low pass filters eliminate moire, stop it before it occurs, and your images can be sharpened in post to be just as good as a camera without a filter. Again...inanity. It's totally inane to obsess over OOC sharpness and removal of AA filters when the problem was already solved!!  ::)
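To make the "predictable and deconvolvable" point concrete, here's a toy sketch: a small Gaussian PSF stands in for the OLPF blur (an assumption of mine, not an actual camera's filter), and a plain NumPy Wiener filter inverts it.

```python
import numpy as np

def gaussian_psf(size, sigma):
    """Small normalized Gaussian kernel, standing in for the OLPF blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Frequency-domain Wiener filter using the known, fixed PSF."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + k) * G))

# A smooth, low-frequency test pattern (stand-in for image detail).
y, x = np.mgrid[0:64, 0:64]
image = np.sin(2 * np.pi * 3 * x / 64) * np.cos(2 * np.pi * 2 * y / 64)

psf = gaussian_psf(9, sigma=1.2)
# Blur via FFT (circular convolution keeps the example dependency-free).
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
restored = wiener_deconvolve(blurred, psf)

print(np.abs(restored - image).max() < 0.05)  # blur largely undone
```

Because the PSF is known and fixed, the filter recovers the pattern almost exactly. Moire offers no equivalent known kernel to invert, which is the whole point.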

1008
Third Party Manufacturers / Re: Nikon's D800E 30% sharper than D800
« on: June 05, 2014, 07:04:29 PM »
...
Of course, their formula for the score is secret.  They might give 25 points if the letter N or Z is in the name, for all we know.
 
There is one advantage to this: they avoid people discovering errors in their scoring methods by keeping it a secret.

And they make it harder for competition ... as yet I haven't seen anyone else come up with a better method to "score" lenses.

That would be because scoring lenses, sensors, or any other aspect of a camera is a ridiculous idea that completely decimates the consumer's ability to make an EDUCATED choice about their camera purchasing decisions. Scoring hides all the details, and in the process throws away a lot of relevant information that is CRITICAL to making those decisions. DXO's lens scores are only valid within the context of DXO...they have no meaning in any other context, and are therefore valueless in a store when you're holding two actual lenses in your hands. DXO's lens scores are obviously biased, as they "score" the ridiculously cheap low end 50mm f/1.4 higher than the ludicrously expensive ultra high end 600mm f/4 L II. Sensor scores completely ignore the rest of the complex systems that cameras actually are, making no allowance for anything like AF functionality, metering functionality, or even the aesthetic aspects of camera construction...ergonomics and menu systems.

No one has tried to replicate what DXO does...because what DXO does is inane.

Quote

It is not surprising that the same camera without an AA filter has more resolution; it also has more moiré, which can make an image very sharp and totally unusable.  How many points for sharp moiré??  This is a huge issue for wedding photography, where pinstripe suits and lace are prevalent.  It totally ruins images.  The same issue for fashion photography: fine stripes or lace, etc., ruin the image.  While it's a fine camera, it is also pretty limited for money making usage.
 
I'm wondering about the new rumored version.  Is Nikon thinking they can repeal Nyquist, or are they counting on DXO to tell everyone how sharp it is and give it a high score, even though real-world usage is limited?

Nikon has supposedly developed or is developing better algorithms to mitigate moire. Given that it occurs due to physical properties of light and materials, there should be a method through which it can be predicted and thus countered.

It isn't a matter of predicting it. We know exactly why it happens and how it happens. The problem isn't knowledge. The problem is that moire is an artifact that gets PERMANENTLY BAKED IN once the image is digitized. You took two patterns, interfered them, then saved the INTERFERENCE of the two...not the originals. The signal is permanently changed, and changed in a completely destructive way. The thing about deconvolution is, it's literally impossible to reconstruct the original signal, regardless of how well you know and understand the sources of convolution, due to the nature of error rate in the kinds of mathematical calculations involved in performing that deconvolution. The tiniest error perturbs the entire calculation, limiting how far you can push it, such that pushing it too far results in significantly worse artifacts than you started with.

That is why deblurring algorithms, for example, work exceptionally well at small scales. You can deblur an image a small amount, and it can look fantastic, as though you never misfocused. However, try to deblur an image more than a few pixels, and the algorithm itself will start introducing horrible, destructive artifacts that look completely unnatural and entirely unacceptable for art (there may be practical applications for utilitarian purposes...say, police deblurring blurry license plate photos, but the artifacts don't matter there, because the photo isn't about art, it's about information.)

There is no post-processing solution to moire. The only way to avoid it...is to avoid it up front.
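A five-line sketch of why the aliasing is baked in (toy numbers, 1D for simplicity; nothing here is specific to any sensor):

```python
import numpy as np

fs = 10.0                                    # 10 samples/unit -> Nyquist limit of 5 cycles/unit
t = np.arange(0, 2, 1 / fs)
above_nyquist = np.sin(2 * np.pi * 9 * t)    # a 9 cycles/unit pattern, undersampled
alias = np.sin(2 * np.pi * (9 - fs) * t)     # folds down to -1 cycle/unit

# The stored samples are identical: once digitized, no algorithm can
# tell which of the two originals produced them.
print(np.allclose(above_nyquist, alias))     # True
```

The two distinct input signals produce the exact same sample values, so the information needed to separate them is gone before any software ever runs.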

1009
EOS Bodies - For Stills / Re: The Answer to Everyone's Complaints
« on: June 05, 2014, 06:34:16 PM »
This has to be Jrista's shortest post.

HAH! You're probably right. :P

1010
EOS Bodies - For Stills / Re: The Answer to Everyone's Complaints
« on: June 05, 2014, 06:14:29 PM »
Interesting. So are people always satisfied with Nikon or do they just move on if not satisfied?

Nikon has its own issues.  A lot of people question the execution of the Df, oil/sensor issues, green tinted displays, liveview deficiencies, etc.

Nikon just set aside $17.7 million to address D600 issues. That is not the way any company would choose to spend that money. That coupled with overall market conditions would give any investor pause.

Absolutely.

Quote
People say that now is the golden age of photography and seeing the exponential increase in number of photos taken, it's hard to disagree with that.  However, a larger market is not always a good thing for manufacturers.  Competition can be more intense and margins/total profitability can fall.  It may be a boon to one that does not heavily invest in any company's products, but for those that do, a company's long term profitability can be a concern.

Absolutely. I lived through the last "golden age" of photography – the late 60s, early 70s SLR boom. That's a very big reason why I would hesitate to invest a lot in any system other than Canon or Nikon. Pentax, Mamiya, Konica, Yashica, Contax...all were hot SLR brands during the last golden age. Some exist today, but only because the brand name was sold off to other companies. Ten years from now, I'm pretty sure Canon and Nikon will still be making bodies. Sony, maybe. But, I'm not willing to take that risk.

Ditto!

1011
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 04, 2014, 07:23:33 PM »
Quote from: neuroanatomist
...Testing anything but the 'entire chain' would have no real-world utility (except fueling speculation about what might or might not be possible at some unspecified future time).
I agree. Testing without pushing to the limits is no testing. By definition the CMOS includes the ADC, whether on- or off-die. And usually there is more than one in the chain, serially. Additionally, the dark current is important too.

Have you ever tried to shoot video with the 550D? AFAIK its CMOS gets heated, which makes me believe that there still must be some kind of ADC for each pixel.

You're confusing the amplifier with the ADC. The amplifiers are per-pixel in a CMOS sensor, but the ADC is a separate unit. CCD sensors push everything off-die, including the amplifier.

1012
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 04, 2014, 03:26:47 PM »
...
Who says?  FWIW I don't believe that the flat spot has anything to do with the sensor at all -- it is due to the noise being dominated by that of the ADC which follows the sensor in the Canon designs.  If you really dig into things you will find that the Canon sensors are actually the class leaders in terms of dynamic range, they just foul it up in terms of their system architecture.  The answer to the question "can Canon deliver a class leading sensor" is yes... they already do.  Take a look at the data from Sensorgen for the 5DIII for example:  Min read noise 2.4e-, FWC=67531 for a DR of 14.7 stops.

And Nikon's D4 would whip everything from Canon quite nicely then.

Quote
If they were to put an ADC on-chip similar to what Sony does, they would have equivalent performance maybe even better.

What Canon needs is a "class leading" system architecture and ADC.  This is a nit, I know but it is important because Canon DOES have class leading sensors and has had them for quite some time.

If you mean "system architecture" to mean "sensor architecture", then yes because the ADC is an integral part of the sensor.
When I say system architecture, I mean exactly that -- how the system is architected.  In Sony's case they have optimized the architecture to minimize pattern noise and overall system read noise -- it is an excellent design in that respect.  They achieved this through the use of a distributed ADC structure that they were able to implement on-chip.  It is a clever and innovative approach (originally invented by IBM, BTW).   

Canon has chosen to design their system in a more traditional way, using an ADC off-chip.  This has consequences, since it requires a higher speed, more complex pipelined ADC, which will not yield the same effective number of bits (ENOB) as the Sony approach.

This is a classic study in system architecture: you can design two radio receivers using the same high performance front end LNA (low noise amplifier), but without optimizing your noise lineup for DR, one doesn't work as well as the other.  In this case it isn't a satellite receiver but a camera, but the principles are the same.

FWIW, the ADC that Canon is using is state of the art; they have a 14-bit converter that is probably getting about 12.5 ENOB at 40 MSPS.  That is about as good as it gets for a pipelined ADC running that fast.  Both companies have state of the art sensors which appear to yield pretty much the same performance; Sony has done a better job on system design by optimizing the system noise lineup for this one particular parameter.

If you are going to talk about sensors, you need to talk about systems: the sensor is only one part of the overall system, and it is this overall system implementation that determines the noise figure and dynamic range of the camera.  In the case of Canon, the sensor and ADC are separate chips.
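For what it's worth, that ~12.5 ENOB figure can be sanity-checked with the standard ideal-quantizer rule of thumb (a textbook relation, not anything specific to Canon's converter):

```python
import math

def enob_to_snr_db(enob):
    """Textbook ideal-quantizer relation: SNR(dB) = 6.02 * ENOB + 1.76."""
    return 6.02 * enob + 1.76

def db_to_stops(db):
    """One photographic stop is a factor of two: 20*log10(2) = ~6.02 dB."""
    return db / (20 * math.log10(2))

snr_db = enob_to_snr_db(12.5)   # the ~12.5 ENOB figure quoted above
print(round(snr_db, 2), round(db_to_stops(snr_db), 1))   # 77.01 dB, ~12.8 stops
```

So ~12.5 effective bits corresponds to roughly 12.8 stops of ADC-limited dynamic range, which lines up with the flattened low-ISO DR curves discussed elsewhere in this thread.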

Aah! That's a breath of fresh air! Thanks for the post! :D

1013
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 04, 2014, 03:22:37 PM »
There is always noise.

Yeah, but there will be a lot less of it once we have fuel cell-powered Peltier cooled sensors….  :D

Indeed! :D I'm just counting the days...can't be that far off... ;)

1014
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 04, 2014, 03:22:01 PM »
...
Who says?  FWIW I don't believe that the flat spot has anything to do with the sensor at all -- it is due to the noise being dominated by that of the ADC which follows the sensor in the Canon designs.  If you really dig into things you will find that the Canon sensors are actually the class leaders in terms of dynamic range, they just foul it up in terms of their system architecture.  The answer to the question "can Canon deliver a class leading sensor" is yes... they already do.  Take a look at the data from Sensorgen for the 5DIII for example:  Min read noise 2.4e-, FWC=67531 for a DR of 14.7 stops.

And Nikon's D4 would whip everything from Canon quite nicely then.

Only at lower ISO, and only marginally. In every other respect, the Canon 1D X trounces the D4. In the most important aspects, frame rate and AF system (arguably the critical traits of such cameras), the 1D X has the technological and performance lead.

Quote
If they were to put an ADC on-chip similar to what Sony does, they would have equivalent performance maybe even better.

What Canon needs is a "class leading" system architecture and ADC.  This is a nit, I know but it is important because Canon DOES have class leading sensors and has had them for quite some time.

If you mean "system architecture" to mean "sensor architecture", then yes because the ADC is an integral part of the sensor.

Completely incorrect. In Canon "systems", the ADC is OFF-DIE. I don't know how many times I've written that when answering you...but that means it is NOT an integral part of the sensor. Canon's ADC units are in their off-die digital signal processors...the DIGIC chips. That, as it turns out, is actually how MOST CMOS image sensors are designed. Sony Exmor is currently somewhat unique in that it has a column-parallel ADC integrated right into the sensor. Canon has patents for that kind of technology...they just haven't employed it in a commercial product yet.

So for the time being, David is correct...it's "system architecture", not "sensor architecture".

1015
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 04, 2014, 07:28:57 AM »
So, while I understand what you're trying to say...I think it's a misnomer. Dynamic range is itself defining what's usable...so saying that you can only use part of what's usable... Redundant and inaccurate. And, incorrect. ;P

Ok, thanks for explaining, I'm really not much of a tech geek even after all this time on CR :-p ... so for my education and to avoid further confusion: With which word(s) would you label the dynamic range starting from the shadows when no banding is visible anymore? Because that's what I called "usable" as fpn is what makes your shot "unusable".

That is dynamic range. The definition, in mathematical terms, is 20 * log10(MaxSignal/NoiseRMS), to get dynamic range in decibels. The NoiseRMS is the Root Mean Square of the noise...basically, if you sampled the actual noise in each pixel and computed the RMS from that, that would be your noise floor. Dynamic range describes the usable potential signal range from the noise floor to the maximum signal (the saturation point; at base ISO, that is also the Full Well Capacity, FWC).
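In runnable form, with illustrative figures (the FWC and read noise numbers here are the 6D-ish values that come up later in the thread, used purely as an example):

```python
import math

def dynamic_range(max_signal_e, noise_rms_e):
    """DR = 20 * log10(MaxSignal / NoiseRMS) in dB; log2 of the same ratio gives stops."""
    ratio = max_signal_e / noise_rms_e
    return 20 * math.log10(ratio), math.log2(ratio)

# Example: ~76606 e- full well capacity with ~26.8 e- read noise
db, stops = dynamic_range(76606, 26.8)
print(round(db, 1), round(stops, 2))   # ~69.1 dB, ~11.48 stops
```

Note that dB and stops are just two units for the same ratio: one stop is ~6.02 dB, since both are built on factors of two.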

Maximum signal strength defines the range from "zero to"...if you had no noise, you would have the full range, say from 0.000...1e- to 76606e- (in the case of the 6D). If you could use the full potential signal range, the formula above becomes 20 * log10(MaxSignal/0)...but you can't do that, it's divide by zero, which is undefined (undefined because the value becomes infinite...but infinity is also technically undefined). So, if you had no noise, you would have infinite dynamic range. Because noise exists, dynamic range must therefore be something less than infinity.

You can keep lowering the noise floor, though. In astrophotography, we use supercooled CCD cameras that have extremely low dark current (while a DSLR may have as much as 20e-/s/px, a cooled CCD can have as little as 0.002e-/s/px), extremely high Q.E. (77% with Sony's new ICX CCDs is not uncommon, and some of the older Kodak KAF sensors (now owned by TrueSense Imaging, TSI) had as much as 90% Q.E.), and very low read noise at optimal gain. Further, we use more extreme noise reduction techniques, such as bias and dark calibration, which eliminate fixed sensor pattern noise as well as hot pixel noise, leaving us with what is effectively a pure gaussian read noise and photon shot noise signal. We then average together dozens of individual light frames to reduce noise even more (reduction is SQRT(subCount), so if we stack 100 subs, we get 1/10th the noise).

In astrophotography, the dynamic range of a final integration can be 25-30 stops or more (and we do "stretching", basically shadow lifting, that is so extreme it would make the D800 cry at how utterly sucky it is! :P). But it's still not infinite DR, because there is still noise. There is always noise. :P
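The SQRT(subCount) reduction is easy to verify with a quick Monte Carlo (synthetic frames with an assumed pure Gaussian read noise, which is roughly what's left after bias/dark calibration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_subs, n_px = 100, 100_000

# Each sub-frame: a constant 100 e- signal plus 5 e- RMS Gaussian read noise
subs = 100.0 + rng.normal(0.0, 5.0, size=(n_subs, n_px))
stacked = subs.mean(axis=0)

# Averaging N frames cuts the noise by sqrt(N): 100 subs -> ~10x less noise
print(round(subs[0].std() / stacked.std()))   # ~10
```

The measured ratio comes out within a percent or so of the predicted sqrt(100) = 10, which is exactly the 1/10th-the-noise claim above.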

1016
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 03, 2014, 06:16:27 PM »
almost the full theoretical dr is actually usable
This is not true. The 6D does NOT have the "full theoretical DR", not even close.

Talking of correct quoting :-) ... but thanks for the explanation anyway, always great to read your in-depth posts. You did misunderstand me in this case though: the "usable dynamic range" was related to pattern noise, not to the theoretical limits of a Canon sensor if you glue it inside a Nikon camera body.

I think you may be misunderstanding what dynamic range is. Dynamic range defines the "usable" range of signal, so it becomes a bit redundant to say "usable dynamic range". The entire dynamic range is usable, because it's the range from the noise floor to the maximum signal...dynamic range implicitly excludes the "unusable" part of the signal. In an "ideal" system, one without noise, dynamic range would (theoretically) be infinite (the formula 20 * log10(FWC/NoiseRMS) fails if NoiseRMS == 0...that's effectively a vertical and instantaneous asymptotic explosion to infinity. :P). That would mean that you have an infinite range of USABLE signal.

So, while I understand what you're trying to say...I think it's a misnomer. Dynamic range is itself defining what's usable...so saying that you can only use part of what's usable... Redundant and inaccurate. And, incorrect. ;P

however the ADC bit depth limits it to 14, and downstream electronics (namely the ADC) introduce so much noise that it flattens the curve, rather than leaving it in its linear state.

If you're interested in how the Magic Lantern people improve the ADC chain, see their forum (I posted the link above)


I've read some of their stuff. I haven't read it all. From what I have read, a lot of what they do benefits from Canon's downstream secondary amplifier.

It doesn't matter if the 6D ISO 100 noise doesn't have banding...the problem is that it still has a ton of noise.

Noise doesn't worry me that much, as the 5d3/1dx/6d manage to make it look like film noise, so that elevates it way above my 60d (and as far as I see the 5d2) concerning "destructive noise". My eye is not a technical instrument to measure snr.

I totally agree that the NATURE of noise can make it acceptable or not. Horizontal and vertical bands are horrid and completely unnatural. The 6D is indeed a massive improvement there. The noise is quite aesthetically pleasing and random. Canon has certainly made strides here.

1017
EOS Bodies / Re: Can Canon deliver a FF sensor that is class leading?
« on: June 03, 2014, 03:00:51 PM »
And if Canon can do that, can it finally deliver a FF sensor that is also class leading? By class leading, I'm referring to noise control and DR

Canon already has a class-leading sensor, it's in the 6d: nearly zero (esp. vertical) banding (better than 1dx @iso100) means almost the full theoretical dr is actually usable. It has good dr @base iso (boosted +1/3ev by ML and =15ev with dual_iso) and superior dynamic range at high iso: http://sensorgen.info/CanonEOS_6D.html

This is not true. The 6D does NOT have the "full theoretical DR", not even close. The 6D has the same problem as every other Canon camera: a flattened DR curve at low ISO. Canon's DR tops out at around 12 stops. Their sensors are capable of more than that; according to Roger Clark, the sensor itself is probably capable of a little over 15 stops of DR natively (in analog space, before digitization). However, the ADC bit depth limits it to 14, and downstream electronics (namely the ADC) introduce so much noise that it flattens the curve, rather than leaving it in its linear state. It doesn't matter if the 6D ISO 100 noise doesn't have banding...the problem is that it still has a ton of noise. It has 26.8e- worth of read noise, which, while less than the 1D X's 38.2e-, is paired with a lower FWC, so its dynamic range is roughly the same.

The problem isn't the sensor. Canon's sensors are very good. Canon's problem is their high frequency off-die ADC units housed in the DIGIC chips. They are just plain noisy.

Roger Clark has evaluated a lot of Canon sensors. His work finds the lowest noise level in the sensor itself, which would be intrinsic sensor noise, devoid of actual read noise. That's dark current noise in the sensor, along with whatever noise the per-pixel amplifiers might introduce. In Canon sensors, that noise level is around 2e-. In the case of the 6D, the sensor's analog dynamic range would be 20*log10(76606/2) dB, or about 91.7 dB, which in terms of stops is about 15.2. In the case of the 1D X, which also has ~2e- intrinsic sensor noise, the maximum possible dynamic range would be 93.1 dB, or about 15.5 stops. These levels aren't realizable due to the amount of read noise at ISO 100. If Canon can get their read noise under control, and get their ISO 100 noise levels down to 3-4e-, their dynamic range would be ~14-14.5 stops. Throw in a little bit of quantization error and PRNU and a 14-bit ADC, and Canon's DR jumps up to the level of Sony Exmors. Throw in a 16-bit ADC, and Canon should be able to achieve 14.5 stops of DR pretty easily. If they can lower their read noise levels even more, they could achieve well more than 15 stops of DR.
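Here's that arithmetic in runnable form (using the ~2e- floor and 6D FWC figures quoted above; the last line just converts the quoted 1D X dB figure to stops):

```python
import math

def analog_dr(fwc_e, floor_noise_e):
    """20 * log10(FWC / noise) in dB, and the same ratio expressed in stops."""
    ratio = fwc_e / floor_noise_e
    return 20 * math.log10(ratio), math.log2(ratio)

# 6D: ~76606 e- FWC over ~2 e- intrinsic sensor noise
db, stops = analog_dr(76606, 2.0)
print(round(db, 1), round(stops, 1))            # ~91.7 dB, ~15.2 stops

# 1D X: converting the quoted 93.1 dB figure to stops
print(round(93.1 / (20 * math.log10(2)), 2))    # ~15.46
```

The stop figures follow directly from the dB figures because one stop is ~6.02 dB; the two are interchangeable units for the same signal ratio.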

Their SENSORS are capable. The rest of their electronics are not. Canon's biggest problem is their approach of offloading the ADC into the DIGIC chip, and running them at very high frequency. Plus, their use of a downstream secondary amplifier doesn't help, but that only kicks in at higher ISO as far as I am aware.

1018
Third Party Manufacturers / Re: This Is How You Sell a Used Lens !!
« on: June 03, 2014, 02:52:36 PM »
Now DAT  guy is a SALESMAN! BOOM! EPIC! (And where DO you get Spaghetti-Ohs?)

1019
EOS Bodies / Re: New Full Frame Camera in Testing? [CR1]
« on: June 02, 2014, 11:31:43 PM »
The most anal people I know about image colour are flower photographers and ceramicists. Ever photograph a red flower and it not look anything like the flower did? Try deep blue, purple, and mauve flowers; they are very difficult to get accurate, and you have to use a camera profile specifically for the light you shot in.

A lot of trouble with flowers is even more that people seem to stick to sRGB which makes many flowers impossible to show correctly. A wide gamut monitor will give you a much better chance (of course it's true that the WB and profiles and all can still mess with things).

True, sRGB is too entrenched. It's really time we started moving towards larger gamuts. Even AdobeRGB isn't quite good enough, as most of the gain with AdobeRGB is in the greens. The deep reds and blues and violets, where a lot of flower color resides, don't really change much with AdobeRGB. ProPhotoRGB may not be the best either, as its extent goes even beyond that of human perception, but it still has the ability to map almost every color at the richest saturation the human eye can discern.
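A quick way to see a color fall out of sRGB: convert a chromaticity through the standard XYZ-to-linear-sRGB matrix and look for components outside [0, 1]. (The deep-red chromaticity here is one I picked for illustration, not a measured flower color.)

```python
import numpy as np

# Standard XYZ -> linear sRGB (D65) matrix
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyY_to_linear_srgb(x, y, Y):
    """Chromaticity (x, y) at luminance Y -> linear sRGB components."""
    X = x * Y / y
    Z = (1 - x - y) * Y / y
    return XYZ_TO_SRGB @ np.array([X, Y, Z])

# A deep red beyond the sRGB red primary (which sits at x = 0.64)
rgb = xyY_to_linear_srgb(0.70, 0.28, 0.2)
print(bool(np.any(rgb < 0) or np.any(rgb > 1)))   # True: not representable in sRGB
```

The negative green component is the telltale sign: sRGB would have to subtract green light to show this red, which no display can do, so the color clips and the tonality flattens.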

Sadly, even 10-bit screens with 14- and 16-bit 3D LUTs are still not quite good enough at showing reds. I have these Peonies that are just about to burst into color. I've tried photographing them in years past, and I've never been able to get the reds and pinks to come out right...they clip and there is little tonality. Bleh. It's such a pain. My roses have a similar problem, however most of those have a deeper red that actually does fall into gamut for AdobeRGB.

No question, though...rich saturated color, particularly in the non-greens, can be a real problem.

1020
Oh, no. Jrista what happened?? About a year ago, you had finally gotten down with the concept of normalization, but now you are back to your old game of normalization doesn't make sense again. :(


Well, it's no surprise that you buy into DXO's bull. There are two values on DXO's site for DR. One is a measure, as in something actually MEASURED from a REAL RAW file. The other is an EXTRAPOLATION. It isn't even a real extrapolation, it is just a number spit out by a simple mathematical formula...they don't actually even do what they say they are doing.

And since the formula is simple, it actually gives worse results, if anything, compared to fancy techniques, not better.

Quote
The first of these is Screen DR. Screen DR is the ONLY actual "measure" of dynamic range that DXO does. It is the SINGLE and SOLE value for DR that is actually based on the actual RAW data. In the case of the D800....do you know what Screen DR is? (My guess is not.)

The other of these is Print DR. Print DR is supposedly the dynamic range "taken" from a downsampled image. The image size is an 8x12" "print", or so DXO's charts say. As it actually happens to be, and this is even according to DXO themselves...Print DR is not a measure at all. It isn't a measurement taken from an actually downsampled image. You know what it is? It is an extremely simple MATHEMATICAL EXTRAPOLATION based on...what? Oh, yup...the only actual TRUE MEASURE of dynamic range that DXO has: Screen DR. Print DR is simply the formula DR + log2(SQRT(N/N0)), where DR is Screen DR, N is the actual image size, and N0 is the supposed downsampled size. The formula is rigged to guarantee that "Print DR" is higher than Screen DR...not even equal to, always higher. And, as it so happens, potentially 100% unrelated to reality, since it is not actually measured.
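Plugging real numbers into that formula (I'm assuming the 8x12" print means 3600x2400 pixels at 300 PPI; DXO's exact normalization size may differ slightly):

```python
import math

def print_dr(screen_dr, n_pixels, n0_pixels=3600 * 2400):
    """Print DR = Screen DR + log2(sqrt(N / N0)) -- the formula quoted above."""
    return screen_dr + math.log2(math.sqrt(n_pixels / n0_pixels))

# D600 figures from the post: Screen DR 13.4 stops at 24.3 MP
print(round(print_dr(13.4, 24_300_000), 2))   # ~14.15, close to the quoted 14.2
```

Since every modern sensor has N > N0, the log2(sqrt(N/N0)) term is always positive, which is why Print DR always comes out above Screen DR.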

Oh brother. It is not rigged! Why are you back to calling normalization rigged again???? Do you realize that 90% of modern tech and science wouldn't work out if what you say was true?

Quote
DXO doesn't even have the GUTS to ACTUALLY downsample real images and actually measure the dynamic range from those downsampled images. They just run a mathematical formula against Screen DR and ASSUME that the dynamic range of an image, IF they had downsampled it, would be the same as what that mathematical value says it should be.

Print DR is about as bogus as "camera measurement 'science'" can possibly get. It's a joke. It's a lie. It's bullshit. The D800 does not have 14.4 stops of DR, as DXO's Print DR would indicate. The Screen DR measure of the D800? Oh, yeah...it's LESS than 14 stops, as one would expect with a 14-bit output. It's 13.2 stops, over ONE FULL STOP less than Print DR. The D600? Says Print DR 14.2, but Screen DR is 13.4. D610? Print DR 14.36, but Screen DR 13.55. D5300? Print DR 13.8, but Screen DR 13. A7? Print DR 14, but Screen DR 13.2. A7s? Print DR 14, but Screen DR 13. NOT ONE SINGLE SENSOR with 14-bit ADC output has EVER actually MEASURED more than 14 stops of dynamic range. That's because it's impossible for a 14-bit ADC to output enough information to allow for more than 14 stops of dynamic range. There simply isn't enough room in the bit space to contain enough information to allow for more than 14 stops...not even 0.1 more stops.

Every stop is a doubling, just as every bit is a doubling. Bits and stops, in this context, are interchangeable terms. In the first bit you have two values. With the second bit, your "dynamic range" of number space doubles...you now have FOUR values. Third bit, eight values. Fourth bit, sixteen values. Fifth bit, thirty-two values. To begin using numeric space beyond what the 14th bit allows, which would be necessary to start using up some of the 15th stop of dynamic range, you need at least 15 bits of information. It's theoretically, technologically, and logically impossible for any camera that uses a 14-bit ADC to have more than 14 stops of dynamic range.
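The doubling argument in runnable form (a counting sketch only; it deliberately ignores dithering and sub-LSB noise effects):

```python
import math

# Each additional bit doubles the number of representable levels,
# just as each additional stop doubles the signal range.
levels = [2 ** b for b in range(1, 6)]
print(levels)   # [2, 4, 8, 16, 32]

# A 14-bit ADC has 2^14 - 1 usable steps above zero, so the deepest
# range it can encode by pure counting is just under 14 stops.
adc_bits = 14
print(math.log2(2 ** adc_bits - 1) < adc_bits)   # True
```

On this counting view, the ratio between the largest code and the smallest nonzero code is 2^14 - 1, whose log2 sits fractionally below 14, which is the ceiling the post is describing.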

No, no, no, and no.

Comparing noise at different energy scales as if the scales were the same is what would be totally bogus!
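For concreteness, here is a sketch of the normalization both sides are arguing about. The 8 MP reference size and the exact formula are assumptions about how DxO's "Print DR" is generally understood to work, not anything published as source code: downsampling k pixels into one averages noise down by roughly sqrt(k), which adds about 0.5·log2(k) stops on paper.

```python
import math

def print_dr(screen_dr_stops, sensor_mp, reference_mp=8.0):
    """Sketch of a DxO-style "Print DR" normalization (assumed, not official):
    downsampling k pixels into one averages noise down by ~sqrt(k), which
    adds 0.5 * log2(k) stops of *measured* dynamic range on paper. The
    14-bit file itself still only encodes 2**14 distinct levels."""
    k = sensor_mp / reference_mp          # pixels averaged per output pixel
    return screen_dr_stops + 0.5 * math.log2(k)

# D800: 13.2 stops measured Screen DR at 36.3 MP
print(round(print_dr(13.2, 36.3), 1))   # ~14.3, near DxO's published 14.4
```

Run the same arithmetic on the D600 (13.4 stops at 24.3 MP) and you land near 14.2, which is why the Screen and Print numbers quoted above track each other by roughly half a stop per doubling of megapixels.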

Quote
Here is another fact about dynamic range. Dynamic range, as most photographers think about it these days, is the number of stops of editing latitude you have. While it also has connotations of the amount of noise in an image, the biggest thing photographers think about when it comes to dynamic range is: how many stops can I lift this image?

We get editing latitude by editing RAW images. RAW. Not downsampled TIFFs or JPEGs or any other format. RAW images. How do we edit RAW images? Well...as RAW images. There IS NO DOWNSAMPLING when we edit a RAW image. Even if there were...who says we are all going to downsample our images to an 8x12" print size (3600x2400 pixels, or 8.6mp)? We edit RAW images at full size. It's the only possible way to edit a RAW image...otherwise it simply wouldn't be RAW; it would be the output of downsampling a RAW to a smaller file size...which probably means TIFF.

Have you ever tried to push the exposure of a TIFF image around the way you push a RAW file around? You don't get even remotely close to the kind of shadow lifting or highlight recovery capability editing a TIFF that you do with a RAW. Not even remotely close. And the editing latitude of JPEG? HAH! Don't even make me say it.

Therefore, the ONLY valid measure of dynamic range is the DIRECT measure: the measure taken from a RAW file itself, at original size, in the exact same form that photographers are actually going to edit. Screen DR is the sole valid measure of dynamic range from DXO. Print DR is 100% bogus, misleading, fake.

Please go to the library and check out a book on normalization and mathematics.



Quote
Their sensors may be good, but how Sony themselves are using their sensors is crap.

sometimes they do do some annoying things, that is true at least

I don't disagree with you. But you're missing my point. I tried to be clear about what I'm referring to. Noise is one thing, and noise in an image doesn't just come from read noise; it's the photon shot noise in the signal as well. And YES, downsampling normalizes results. I'm not debating that.

I'm specifically debating the notion that you actually have 14.4 stops' worth of EDITING LATITUDE with a D800, or 14.2 with a D600, etc. Because that's what everyone thinks about. That's what everyone is referring to when they bring up the DR difference. It's not a bad thing, and there is no question that Sony Exmor has more DR than a Canon sensor. The problem I have is the misleading impression that DXO's Print DR "results" have created in the community.

We don't push the exposure of downsampled TIFF files around, so it makes no sense to refer to 14.4 stops of DR in the context of, say, discussing the benefit of DR when working with landscapes. It really doesn't make any sense to refer to 14.4 stops of 8mp-image DR when discussing actual photographic editing in ANY context EXCEPT directly comparing cameras, and then only in a very neat and tidy context...such as when you're actually on the DXO web site. In all other contexts, the only legitimate measure is the one taken directly from the RAW...from the actual image we actually work with out in the actual world. In that context...the D800 has 13.2 stops of DR.

Does that make sense? As far as I'm concerned: Comparison Shmarison! :P I care about actual real-world editing latitude. Mathematically extrapolated, imaginary, downsampled fake "measurements" don't tell me jack about what I am ACTUALLY going to be able to do FOR REAL. Screen DR? It tells me exactly what I want to know. It tells every photographer what they want to know: how much can I lift my landscape photos?

Print DR is lying...it tells you you could lift more than you actually can. You don't edit RAW images downsampled, JPEGs don't even remotely cut it, and with TIFF images, because they store RGB triples rather than independent RAW digital signal values, you can't lift the shadows or compress the highlights the same way...not without significant artifacts after a push or pull of a couple of stops. (I.e., you may be able to lift shadows by two, maybe three stops without artifacts in a TIFF at "14.4" stops of DR, but you could easily lift a Nikon D800 RAW by six stops at 13.2 stops of DR.)
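That last parenthetical can be illustrated with a toy quantization model. This is a deliberate simplification (linear encoding, quantization only; real 8-bit files are gamma-encoded and real latitude is noise-limited), so treat it as a sketch of why deep pushes band in shallow files, not a measurement:

```python
def codes_surviving_push(bit_depth, stops_pushed):
    """Toy model: count the distinct linear code values that get stretched
    across the full output range by an exposure push of `stops_pushed`
    stops. Assumes linear encoding and ignores noise and gamma."""
    full_scale = 2 ** bit_depth
    # a push of s stops maps codes in [0, full_scale / 2**s) onto [0, full_scale)
    return full_scale // (2 ** stops_pushed)

print(codes_surviving_push(14, 6))  # 256 distinct levels left in a 14-bit RAW
print(codes_surviving_push(8, 6))   # 4 levels left in an 8-bit file -> banding
```

A six-stop push stretches the bottom 1/64th of the code range across the whole output, so a 14-bit file still has hundreds of tonal steps to work with there, while an 8-bit file has almost none.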

So I don't disagree. I agree. I am just working within a different context...the context from which I believe most photographers approach the subject of DR (based on the things they reference when they bring it up). To you and me, dynamic range means signal cleanliness across the entire band. To most everyone else, it means: how much can I lift without banding in the shadows? :P
