

Messages - jrista

1516
EOS Bodies / Re: Will Canon ditch the AA Filter?
« on: January 20, 2014, 02:18:33 AM »
The quality with which LR renders my 7D images only seems to get better and better with time and each subsequent version, so as Adobe optimizes their demosaicing implementation, any inherent error is clearly diminishing.

Thanks again for all your great posts on this! The problem with Adobe is that they seem to be very secretive about any improvements concerning ACR or LR; their official changelog only reflects a small part of the changes ... or is there any Adobe or 3rd-party documentation of their raw converter improvements over time?

There isn't any official information. I wouldn't expect any, either. The minutiae of the algorithmic details of Adobe's demosaicing algorithm would bore most people, and would probably be incomprehensible to anyone who didn't have a math or comp. sci. degree. The specifics aren't all that important. All I do know is that LR5, compared to LR3, produces clearer, sharper results...so Adobe certainly seems to be optimizing their algorithms. Optimization over the long term is to be expected. Demosaicing is the heart of what ACR and LR do...it is going to be the single biggest processing cost when rendering RAW images to screen (since every time you change a setting, they have to re-render, and that can be many times per second.)

I also remember a fairly significant jump in overall quality, edge definition, and sharpness between LR3 and 4. In LR3, sharp edges, hairs, bird feather barbs, etc. looked more like DPP (which, as you can see, clearly has a less effective algorithm, as it intrinsically blurs more yet still produces stair-stepping.) LR4 produces really crisp, clean, smooth edges, without blurring. They never mentioned any specific changes to ACR that would cause those changes, but they were clearly there. I don't really care that Adobe keeps the specifics under wraps...sometimes it isn't a good idea to expose too many details about one's technology, as it can invite awkward questions from customers when a customer decides something they are seeing in their results doesn't jibe with what the company is saying about their algorithms.

1517
EOS Bodies / Re: Will Canon ditch the AA Filter?
« on: January 20, 2014, 01:38:44 AM »
Removal of an AA filter is far from a high end feature. It is a gimmick for all but a very few niche photographer types whose work primarily involves photographing things with entirely random data that could not produce much aliasing regardless. For the vast majority of photographers, use of an AA filter is quite essential to producing BETTER image quality. Aliasing produces nonsense noise, useless detail. ANTI-aliasing restores that useless nonsense noise to a more accurate form.

On this I 100% agree with you though.

One side note: the Canon 7D maybe isn't the best body to use (I mean not in the equipment-and-taking-pictures sense, but as a testbed for comparing detail against other cameras in a lab sense), since it has those heavily split greens in the CFA array, so the demosaic routines have to do very tricky things, which tend to leave a bit of residual loss of micro-contrast behind (it's actually surprising that they manage not to leave behind major resolution loss; they must be doing some pretty sneaky stuff to handle the split greens and prevent mazing artifacts while hitting the resolution only a trace).

What do you mean by "heavily split greens"? The 7D has a pretty standard Bayer sensor as far as I know...they shouldn't need to do any special processing (and ACR/LR seem to handle 7D files just fine with their AHDD algorithm.)

You know how they use two greens for each red and blue, so you have G1 R G2 B? Well, with the 7D they decided to make G1 very much not the same as G2. It seemed like they wanted to sneak just a little bit more light to the sensor by making one of the greens even a bit more color-blind.

If you developed 7D files with ACR (or even DPP!) right when it first came out, you'd notice all sorts of maze patterns appearing. It was especially bad, if I recall correctly, in orangey-yellow blocks of color. Some of us were like, what the heck is with the artifacts in these 7D images, and then we noticed weird things where measuring the SNR of the G1 channels in the RAW data always gave different results than measuring the G2 channels.
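
Roughly like this, if you want to see it in code form (this is just numpy to show what I mean, not anything from the actual converters, and the raw array is assumed to come from whatever raw reader you have on hand):

Code:
import numpy as np

def green_channel_stats(raw):
    """Split an RGGB mosaic into its two green sub-planes and report mean/std/SNR.

    RGGB layout per 2x2 tile:   R   G1
                                G2  B
    """
    g1 = raw[0::2, 1::2].astype(np.float64)  # G1 sites: even rows, odd columns
    g2 = raw[1::2, 0::2].astype(np.float64)  # G2 sites: odd rows, even columns
    stats = {}
    for name, plane in (("G1", g1), ("G2", g2)):
        mean, std = plane.mean(), plane.std()
        stats[name] = (mean, std, mean / std if std else float("inf"))
    return stats

# On a flat gray patch, a camera with "split greens" shows consistently different
# means/SNRs for the two planes; a conventional Bayer CFA shows them matching closely.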

A few of us brought the complaints to the converter makers' and Canon's attention, and some even returned their initial 7D copies thinking that maybe they had the Bayer array somehow misaligned or something. Then a few weeks later Adobe released a new ACR, and I think shortly after that a new DPP came out. The early speculation was that getting around the split greens would cause major loss of resolution or noticeably remaining artifacts, but somehow the converter makers found a way to pretty much solve all the mazing artifacts while only barely hitting the resolution at all (if you still had the early ACR beta that supported the 7D and processed a file with it, you could get a touch more micro-contrast out of 7D files than with the later fixed beta and final releases). It's hard to say, but it seemed like the fix maybe effectively knocked 1-2 MP off the 18MP sensor. Not really a big deal; perhaps it made the files look a bit more filmlike.

And you can see references by Adobe to split green parameters in ACR. It's not just the 7D that needs them but a few other cameras as well, from some of the smaller players in the digital camera world. I think Canon is not splitting the greens so much again, and I'm not sure if Nikon ever did (I'm pretty unsure about this last bit though).

If you didn't buy a 7D within the first 2-3 weeks of their very first arrival in the U.S. you probably missed the whole mazing thing with ACR (and maybe 3-4 weeks for DPP and the others).

Hmm. I can't imagine that such a thing is a huge problem. It's not all that different from Sony's "Emerald Green" CFA that they introduced many years ago (they called it RGBE). Their "Emerald" had more blue in it than the standard green. Based on all the sample images at the time, it actually produced better color accuracy...but it would be basically the same thing as you're describing with the 7D.

There have been similar approaches in the past by other companies as well. Some simply do away with the second green and make it a "white". Fuji threw in an extra tiny little white pixel between all the primaries that were very widely spaced, but gathered extra luminance data. If what you say is correct, that would have caused even more problems for demosaicing, however it improved resolution a bit, and improved DR (although the DR improvement seemed minor, especially compared to what Sony did with Exmor.) Kodak, Sony, and Fuji all now have RGBW sensor designs, and Sony is even starting to patent non-square pixels (they have patents for triangular and hexagonal pixels now, and supposedly one of these pixel designs is going to be used in their forthcoming 54mp FF sensor.)

I also can't imagine that it would cause a loss in resolution. I mean, the crux of any Bayer demosaicing algorithm is interpolating the intersections between every (overlapping) set of 2x2 pixels. Because there is reuse of sensor pixels for multiple output pixels, there is an inherent blur radius. But it is extremely small, and it wouldn't grow or shrink if one of the pixel colors changed. You would still be interpolating the same basic amount of information within the same radius. I remember there being a small improvement in resolution with my 7D between LR 3 and 4, and things seem a bit crisper again moving from LR 4 to 5. I suspect any supposed loss in resolution with the 7D was due to the novelty of Adobe's implementation of support for the 7D, not anything related to having two slightly different colors for the green pixels. The quality with which LR renders my 7D images only seems to get better and better with time and each subsequent version, so as Adobe optimizes their demosaicing implementation, any inherent error is clearly diminishing.
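
To make the "overlapping 2x2 interpolation" idea concrete, here is a minimal bilinear demosaic sketch (Python/numpy/scipy). Real converters like ACR use far more sophisticated edge-directed algorithms, so treat this purely as an illustration of why the inherent blur radius is tiny and doesn't depend on the exact green filter colors:

Code:
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic_rggb(raw):
    """Naive bilinear demosaic of an RGGB Bayer mosaic (illustration only)."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Kernels that average the nearest same-colored neighbours around each site.
    k_g  = np.array([[0.0, 0.25, 0.0], [0.25, 1.0, 0.25], [0.0, 0.25, 0.0]])
    k_rb = np.array([[0.25, 0.5, 0.25], [0.5, 1.0, 0.5], [0.25, 0.5, 0.25]])

    def interpolate(mask, kernel):
        values  = convolve(raw * mask, kernel, mode="mirror")
        weights = convolve(mask, kernel, mode="mirror")
        return values / weights  # normalized convolution: average of the known neighbours

    r = interpolate(r_mask, k_rb)
    g = interpolate(g_mask, k_g)
    b = interpolate(b_mask, k_rb)
    return np.dstack([r, g, b])

Every output pixel only ever borrows from its immediate neighbours, which is why the blur footprint stays within a pixel or so regardless of what the green filters' exact colors are, and smarter algorithms only tighten that further.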

BTW, there is no way anything Adobe has ever done could possibly "knock off 1-2mp worth of resolution" from the 7D. The most basic demosaicing algorithms produce noisy, mazed, and stair-stepped results. Better demosaicing algorithms factor in more information from a greater area of pixels as necessary (i.e. in order to avoid maze artifacts); however, the amount of blur they introduce is fractional...not even close to diminishing resolution by another two megapixels over the Bayer design itself. You can clearly see this when comparing DPP to ACR/LR with something like a strand of hair. DPP will produce a fairly jagged result; ACR/LR produce a very clean result. Based on the sample below, ACR is actually sharper and supports even finer detail resolution:



Additionally, the output resolution of my 7D + EF 600/4 L II is WAY, WAY, WAY better than the 7D + 100-400 @ 400mm. The difference that a good lens makes with the 7D's resolving power would completely overwhelm any perceived difference that the ACR/LR demosaicing algorithm makes. That's entirely in line with theory as well, as output resolution is roughly what you get when the blur contributions of each component are combined in quadrature. With the 600/4 II, the 7D produces exceptionally sharp results, and is able to very finely delineate detail with a good lens that otherwise appears quite soft with the likes of the 100-400 L, 300/4 L, 70-200 L, etc. I mean, just look at the detail resolved on these birds (all 7D, all processed with Lightroom):






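For what it's worth, the "resolutions combine" rule of thumb I mentioned above is easy to play with. A quick sketch, using the common approximation that blur contributions add in quadrature (the numbers below are made up purely for illustration):

Code:
import math

def system_resolution(*component_resolutions):
    """Combine component resolutions (e.g. lens and sensor, in lp/mm) assuming the
    blur contributions add in quadrature: 1/R_sys^2 = sum(1/R_i^2)."""
    return 1.0 / math.sqrt(sum(1.0 / r ** 2 for r in component_resolutions))

sensor = 116  # roughly the Nyquist limit for ~4.3 micron pixels, in lp/mm (illustrative)
print(system_resolution(sensor, 120))  # sharp lens -> ~83 lp/mm at the system level
print(system_resolution(sensor, 60))   # soft lens  -> ~53 lp/mm at the system level

A sharp lens lets the system get much closer to the sensor's limit, which is why the lens difference swamps any tiny difference between demosaicing algorithms.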
1518
Is it normal for an image's colors to appear slightly different, on the same calibrated system, across what appear to be color-managed (or at least ICC-aware) browsers? (I'm in Windows 7, 64-bit.)

I've attached a cropped screen shot of how IE11 and Firefox 26 render an ICC-tagged test image that I found in another forum on this site.

Both browsers correctly show a yellow car, so as I understand the issue the browsers are ICC-aware (if they weren't the cars would appear purple, as I've demonstrated to myself in some non-color managed apps I've got.) But Firefox (and Chrome, it turns out) introduce a warm cast.

Is this just how it is? Or am I missing some variable in the color management / ICC chain?

Thanks.

Absolutely. IE is NOT fully color-managed (it simply converts any image tagged as wide gamut to sRGB, and that's the extent of its management, which actually makes things worse if you're running it on a monitor in wide gamut mode, since the wide gamut images may have looked at least vaguely correct before it turned them into sRGB). Firefox is color-managed, and if you use the plug-in to turn it on 100% it will even color-manage the text and background colors, every single last element on a webpage.

IE does not make any use of the monitor profile at all. So if you have an sRGB image but your monitor is set to gamma 2.2, it won't translate from the sRGB tone curve to gamma 2.2, and you'll get shadows that look a bit too contrasty and dark, as well as slight mistakes in the highlights. Also, if the primaries on your monitor are not set exactly to the sRGB coordinates, it won't apply the needed transformations for that, and if your profile is a complex LUT that tries to fix some of the color issues with your monitor, it won't use that either.

Firefox will do all of that.

Chrome does nothing ever.

(Actually, I haven't tested the very latest IE that just came out, but I thought I'd heard that it's the same old story, more or less not color-managed.)

I'm not sure that Chrome does nothing. At least, Chromium as used in Opera, based on some tests I ran a short while ago, seems to do color management. Firefox doesn't pass the ICC test; however, it does seem to properly render photos tagged AdobeRGB.

Browser color management seems like a quirky scenario. Only IE passes the ICC test; however, all browsers seem to do some kind of ICM on some images. Opera/Chromium, for example, doesn't seem to handle ProPhotoRGB images for whatever reason (they show up rather desaturated.)

I really wish this issue would be dealt with. The major operating systems offer built-in color management engines. Browsers wouldn't have to implement them on their own; they would just need to hook into the OS ICM and let it do its thing. Sounds rather simple, if you ask me...

IE passes the ICC test because it understands image profiles, both v2 and v4, but that ICC test only tests how browsers react to image profiles. It does nothing to tell you about whether the browser handles monitor profiles. IE doesn't (at least not in any version prior to the very latest, and I'm pretty sure even the latest still does not).

Firefox may fail the ICC test, but only on the v4-tagged images, of which there are likely only a vanishing few out there on the web; it passes everything else. It understands monitor profiles and can even handle text, backgrounds and so on, everything (well, actually not Adobe Flash stuff, since that renders itself and is not managed, so sadly Flash-based slide shows are 100% not color managed :( ). Some browsers handle most stuff but don't handle untagged elements, so all the images not tagged as sRGB (but that are sRGB) and all the web elements like various background and text colors are not managed, and all that stuff goes radioactive if you use those browsers on a wide gamut monitor set to wide gamut mode, even if all tagged images look perfect.

At least on Windows. Mac is different; there have been times that Mac versions of browsers have been color-managed but not the Windows ones, and even vice-versa, and in some cases the Mac OS itself tries to color manage, which Windows never does.

The browser shouldn't "handle" monitor profiles. It isn't the browser's job, technically speaking. It's the job of the Image Color Management (ICM) engine to handle that. You have multiple profiles in play at any given time. In the case of rendering an image to the screen, you have two key profiles: the image profile and the monitor profile. The monitor profile ONLY affects things rendered to the screen. That is completely abstract from the browser, or any other ICM-aware application. It isn't the application's responsibility to deal with the monitor.

For that matter, neither is it really the application's responsibility to actually deal with the image profile. All a browser really needs to do is determine what ICC profile an image is tagged with, and render it to the screen VIA ICM. The ICM engine will then deal with any and all other devices (and their profiles) in order to spit out color-corrected values for each pixel.

In the case of a print softproofing workflow, you would technically have three profiles in play: The image, the screen, and the printer. The ICC profile assigned to an image basically calibrates the image. You can't apply a monitor profile to an image, it wouldn't make any sense. Neither can you apply a print profile to an image. Each profile only applies to the thing it was designed for. ICM will convert from the image ICC profile to the printer ICC profile, and will then in turn render that output to screen with the monitor ICC profile.

So long as a browser properly uses the OS ICM engine, it doesn't actually need to do a bloody thing, other than render via ICM with the image's ICC profile (or assume sRGB if there is no profile tagged). The OS ICM engine does all the rest.
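
To illustrate that division of labor, here is a minimal sketch using Pillow's ImageCms module (a littleCMS wrapper) as a stand-in for the OS ICM engine. The profile file names are hypothetical examples; the point is that the application only names the profiles and lets the CMS do the conversions:

Code:
from PIL import Image, ImageCms

# Stand-ins for the profiles involved (file names are hypothetical examples).
image_profile   = ImageCms.createProfile("sRGB")          # or the profile embedded in the image
monitor_profile = ImageCms.getOpenProfile("monitor.icc")  # the display's calibration profile
printer_profile = ImageCms.getOpenProfile("printer.icc")  # only needed for softproofing

img = Image.open("photo.jpg").convert("RGB")

# Normal display path: image profile -> monitor profile.
to_screen = ImageCms.buildTransform(
    image_profile, monitor_profile, "RGB", "RGB",
    renderingIntent=ImageCms.INTENT_PERCEPTUAL)
on_screen = ImageCms.applyTransform(img, to_screen)

# Softproofing path: simulate the printer while still rendering through the monitor profile.
to_proof = ImageCms.buildProofTransform(
    image_profile, monitor_profile, printer_profile, "RGB", "RGB",
    renderingIntent=ImageCms.INTENT_PERCEPTUAL,
    proofRenderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
proofed = ImageCms.applyTransform(img, to_proof)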

In the case of Opera/Chrome, it appears they do not support the v4 profiles, like you say. That said, as far as I can tell, they do support full ICM as I've described above for v2 profiles. Is there something about IE that is different? Is there evidence that indicates it converts everything to sRGB in some sideband way first? (That doesn't seem logical to me, given that Windows has had its own ICM for years.)

1519
Is it normal for an image's colors to appear slightly different, on the same calibrated system, across what appear to be color-managed (or at least ICC-aware) browsers? (I'm in Windows 7, 64-bit.)

I've attached a cropped screen shot of how IE11 and Firefox 26 render an ICC-tagged test image that I found in another forum on this site.

Both browsers correctly show a yellow car, so as I understand the issue the browsers are ICC-aware (if they weren't the cars would appear purple, as I've demonstrated to myself in some non-color managed apps I've got.) But Firefox (and Chrome, it turns out) introduce a warm cast.

Is this just how it is? Or am I missing some variable in the color management / ICC chain?

Thanks.

Absolutely. IE is NOT fully color-managed (it simply converts any image tagged as wide gamut to sRGB, and that's the extent of its management, which actually makes things worse if you're running it on a monitor in wide gamut mode, since the wide gamut images may have looked at least vaguely correct before it turned them into sRGB). Firefox is color-managed, and if you use the plug-in to turn it on 100% it will even color-manage the text and background colors, every single last element on a webpage.

IE does not make any use of the monitor profile at all. So if you have an sRGB image but your monitor is set to gamma 2.2, it won't translate from the sRGB tone curve to gamma 2.2, and you'll get shadows that look a bit too contrasty and dark, as well as slight mistakes in the highlights. Also, if the primaries on your monitor are not set exactly to the sRGB coordinates, it won't apply the needed transformations for that, and if your profile is a complex LUT that tries to fix some of the color issues with your monitor, it won't use that either.

Firefox will do all of that.

Chrome does nothing ever.

(Actually, I haven't tested the very latest IE that just came out, but I thought I'd heard that it's the same old story, more or less not color-managed.)

I'm not sure that Chrome does nothing. At least, Chromium as used in Opera, based on some tests I ran a short while ago, seems to do color management. Firefox doesn't pass the ICC test; however, it does seem to properly render photos tagged AdobeRGB.

Browser color management seems like a quirky scenario. Only IE passes the ICC test; however, all browsers seem to do some kind of ICM on some images. Opera/Chromium, for example, doesn't seem to handle ProPhotoRGB images for whatever reason (they show up rather desaturated.)

I really wish this issue would be dealt with. The major operating systems offer built-in color management engines. Browsers wouldn't have to implement them on their own; they would just need to hook into the OS ICM and let it do its thing. Sounds rather simple, if you ask me...

1520
EOS Bodies / Re: Will Canon ditch the AA Filter?
« on: January 20, 2014, 12:18:09 AM »
Removal of an AA filter is far from a high end feature. It is a gimmick for all but a very few niche photographer types whose work primarily involves photographing things with entirely random data that could not produce much aliasing regardless. For the vast majority of photographers, use of an AA filter is quite essential to producing BETTER image quality. Aliasing produces nonsense noise, useless detail. ANTI-aliasing restores that useless nonsense noise to a more accurate form.

On this I 100% agree with you though.

One side note: the Canon 7D maybe isn't the best body to use (I mean not in the equipment-and-taking-pictures sense, but as a testbed for comparing detail against other cameras in a lab sense), since it has those heavily split greens in the CFA array, so the demosaic routines have to do very tricky things, which tend to leave a bit of residual loss of micro-contrast behind (it's actually surprising that they manage not to leave behind major resolution loss; they must be doing some pretty sneaky stuff to handle the split greens and prevent mazing artifacts while hitting the resolution only a trace).

What do you mean by "heavily split greens"? The 7D has a pretty standard Bayer sensor as far as I know...they shouldn't need to do any special processing (and ACR/LR seem to handle 7D files just fine with their AHDD algorithm.)

1521
I called bull on the last topic, after Gas am claimed Amazon had exchanged four copies of the same camera four times without question, and due to the lack of any sample images to back up the claims of problematic pixels. After that, it was locked.

I still don't think the discussion here can proceed without getting some visual evidence to clearly explain what the OP is talking about. I highly suspect Gas am is complaining about the very well known, expected phenomenon of hot pixels that occur during longer exposures. However, without any kind of visual evidence, it is hard to say for sure.

So, Gas am, PLEASE...post some sample 100% crops demonstrating the problem. We can't really help you without knowing exactly what it is you are talking about.

1522
EOS Bodies / Re: Is ARRI Canon's Biggest Obstacle in Professional Cinema?
« on: January 20, 2014, 12:11:23 AM »
ARRI is not Canon's biggest obstacle. CANON is Canon's biggest obstacle. There is still a huge demand for film and Canon don't make any film cameras. It's that simple.

The future isn't film, though...and digital is rapidly gaining ground on film. Canon would have to shift resources from digital cinema to film in order to build up a product line and gain a presence...in a market that will decline in the long term. Not really a wise move. I think Canon has it right, building up a presence in digital cinema.

1523
EOS Bodies / Re: Canon naming policy
« on: January 19, 2014, 11:13:27 PM »
Canon are lazy.  They are currently just using the same numbers they had in the 1980's and 1990's, but with a "D" tacked on the back.  Given that they are moving to EVFs in the next couple of years, I suspect that they'll go back to the start, but replace the "D" with an "E" (for EVF, Evolution etc).  We'll have "C", "D", and "E" cameras, and it will be immediately obvious what type of camera it is.

Disagree that Canon's marketing department is lazy.  It is wise to continue with a heritage; it helps build on brand equity.  If you think Canon should change, why not change the name "Canon" to something else while you're at it?

The biggest problem I see with the current name system is that in the future we may see names like 5D Mark 75 or the Canon Rebel T827i

At some point Canon will need to rationalize its naming convention and "reinvent" itself, like Adobe did with CS (Creative Suite).  Perhaps this is what the OP meant, though I don't think Canon is there yet.

Well, I'm OK with the 5D Mark 75. That sucker won't be rolling around for another 288 years, so I won't ever have to deal with ludicrous version numbers like that. :D For that matter, we won't even see the 5D X for another 28 years. :P I'll be retired by then, and X is a nice round Roman numeral. (Assumes a 4-year inter-version release period.)

1524
Ah, Windows. Well, I've already got ProRes-envy; now I've got this to worry about.

As a Mac user, I like not having to worry about this sort of thing directly. But, it's good to be reminded that people could be viewing my images through tinted Windows.   ;)

Yes, that test image was very helpful in puzzling this out. Thanks.

For what it's worth, IE 11 matches PS, LR, etc. It's also the only browser on my side that seems to be ICC-4 ready, based on the test page located here:

http://www.color.org/version4html.xalter

Hmm, very interesting. I had thought Chrome was ICM compatible, but I guess not. Opera (also a Chromium browser now) is also not compatible, and neither is Firefox. That would certainly explain why there are discrepancies...only IE is color managed!

1525
EOS Bodies / Re: 7D Mark II on Cameraegg
« on: January 19, 2014, 09:23:52 PM »
Single Digic 5+?
Single Digic 6?
Dual Digic 5+?
or
Dual Digic 6?

According to Wikipedia, the 1DX uses Dual Digic 5+ and a Separate Digic 4 for Intelligent Subject Analysis System. I don't own a 1DX so someone else can confirm.

Source: http://en.wikipedia.org/wiki/DIGIC

I want Dual Digic 6, but it hasn't been put in DSLRs yet - maybe it's specific to P&S?
I strongly doubt it will be a single Digic 5+ but then again...
I seem to remember that previous rumours that talked about specs were saying dual digic5+, but I would not be surprised if it were dual Digic6.... each iteration of the processor seems to be a good jump in speed.. Digic6 probably beats dual Digic5+ so I kind of hope for dual Digic5+...

And to those who say that it can't be Dual Digic6 because that's better than the 1DX has: how come a PowerShot has Digic6 and the 5D3 does not :)

I don't think that DIGIC 6 is what people think it is. DIGIC 6 is used in PowerShot because its new features were designed for the kind of consumer-grade features PowerShot offers. It supports 9.3fps average frame rate (12.2fps continuous up to 5 frames, after which the rate slows), but that is its minor feature. The big features are the way it handles highlight preservation, noise reduction at high ISO, etc. DIGIC 6 is about DSP image processing features, I don't think Canon has ever intended it to be the real replacement for DIGIC 5/5+.

I suspect the 7D II will use dual DIGIC 5+. It doesn't seem all that logical for Canon to create a new DIGIC 7 for the 7D II yet, as the DIGIC 5+ still offers plenty of data processing throughput. The throughput for a pair of DIGIC 5+ can be derived like so:

Code:
dataRate = (14 fps * 19,100,000 pixels * 14 bits) / (8 bits/byte) + overhead
dataRate = 467,950,000 B/s + overhead
dataRate = 468 MB/s + overhead

Assuming Canon didn't create the DIGIC 5+ with exactly 234 MB/s of throughput, it seems logical that each one is capable of 250 MB/s (leaving ~32 MB/s for overhead). At 24mp, we can derive the frame rate of the 7D II if we assume a 500 MB/s data rate (dual DIGIC 5+):

Code:
500,000,000 B/s = (fps * 25,200,000 px * 14 bits) / (8 bits/byte) + 32,050,000 B/s overhead
467,950,000 B/s = (fps * 25,200,000 px * 14 bits) / (8 bits/byte)
467,950,000 B/s * 8 bits/byte = fps * 25,200,000 px * 14 bits
3,743,600,000 bits/s / (25,200,000 px * 14 bits) = fps
fps = 3,743,600,000 bits/s / 352,800,000 bits/frame
fps = 10.611 fps

So, with a pair of DIGIC 5+, the 7D II with a 24mp APS-C sensor could easily reach 10fps, and have even more room left over for overhead than the 1D X. Unless Canon is intending to give the 7D II a 12fps frame rate, I don't see the need for a new DIGIC 6+ or DIGIC 7. Maybe some of the image processing features in the DIGIC 6 could be useful; however, I am not exactly sure what its data throughput rate is. I am pretty sure it isn't actually quite as good as a single DIGIC 5+ (based on what I've been able to derive from a couple of PowerShot megapixel counts and the frame rate for the first five frames, it seems like the DIGIC 6 is capable of 225 MB/s, so it falls short of the DIGIC 5+ by about 25 MB/s.)

It is possible that Canon might create a DIGIC 6+. If they did, and assuming they scale DIGIC 6+ the same way they scaled DIGIC 5+ over DIGIC 5, then a single DIGIC 6+ should be about 3x as powerful as a DIGIC 6. That would put its data throughput rate somewhere around 640 MB/s to 675 MB/s. That would mean that a single DIGIC 6+ would be enough to give the 7D II a 14fps frame rate.

For some reason, I don't really see that happening...not sure why, it just doesn't feel like Canon is ready to drop that particular improvement on us yet. I suspect such a new DIGIC 6+ chip (or maybe they'll call it DIGIC 7) will arrive with the big megapixel camera. A data throughput rate of 700 MB/s would be enough to support 8fps for a 46.7mp FF sensor at 14 bits, and even enough to support 7fps at a full 16 bits!
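
If anyone wants to play with the arithmetic, here is the same calculation wrapped up as a couple of small Python helpers. The throughput and overhead figures are my estimates from above, not anything published by Canon:

Code:
def data_rate_MBps(fps, megapixels, bit_depth=14):
    """Raw readout data rate in MB/s for a given frame rate, resolution and bit depth."""
    return fps * megapixels * 1e6 * bit_depth / 8 / 1e6

def max_fps(throughput_MBps, megapixels, bit_depth=14, overhead_MBps=32.05):
    """Frame rate a given processor throughput could sustain after reserving overhead."""
    usable_bits_per_second = (throughput_MBps - overhead_MBps) * 1e6 * 8
    bits_per_frame = megapixels * 1e6 * bit_depth
    return usable_bits_per_second / bits_per_frame

print(data_rate_MBps(14, 19.1))          # 1D X readout: ~468 MB/s for its dual DIGIC 5+
print(max_fps(500, 25.2))                # rumoured 7D II sensor, dual DIGIC 5+: ~10.6 fps
print(max_fps(700, 46.7))                # hypothetical big-MP body at 14 bit: ~8.2 fps
print(max_fps(700, 46.7, bit_depth=16))  # ...and at 16 bit: ~7.2 fps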

I'm confused; why call it a DIGIC 6 if it's not an improvement on the DIGIC 5+?
Isn't a dual-DIGIC 6 better than a single DIGIC 5+?

They used a DIGIC 5 in a t4i/650D...

At this point, anything that improves on a dual DIGIC 4 is better than no change...
Plus, even if it is a dual DIGIC 6 and it is less powerful than a dual DIGIC 5+, then won't it keep the 1DX supreme? It won't hurt its sales in terms of specs.

Better doesn't necessarily mean faster. DIGIC 6 offers much improved image processing (which is more useful for sensors with smaller pixels), but it isn't faster. It is technically still better, but not in the way a 7D II would necessarily need.

1526
I don't think it is a Windows thing. ... I suspect one or more of the browsers you are using is not using the Windows ICM engine, and is instead using its own.

Potato, potaah-to.  AFAIK, OS X manages the color and overrides any browser-specific color management.

That might be true. If it is, that could be considered a detractor...there are times I've found it extremely useful to use Photoshop's own ICM (as it actually seems more accurate). Anyway, the OP's problem occurs with Windows 7, so I think my answer is quite relevant to his issues.

1527
I don't think it is a Windows thing. It would depend on whether the browsers, or for that matter the tools used to create the images, use Windows ICM or implement their own ICM. Photoshop, for example, can be configured to use its own ICM or the platform ICM. The two do not produce the same results, so if you want Photoshop to render colors the same way browsers will render colors, you should change it from its default (using its own ICM engine) to using Windows ICM.

I suspect one or more of the browsers you are using is not using the Windows ICM engine, and is instead using its own. Anything that uses the same ICM engine to render color should be producing identical results. I guess the one caveat would be untagged JPEG images...it is possible that different browsers are using slightly different sRGB color profiles (ICC profiles) to represent the sRGB space. They may still use Windows ICM, but are actually supplying slightly different ICC profiles to guide color conversion. If that's the case (honestly not sure how you might figure that out), there probably isn't anything you can do about it.

1528
EOS Bodies / Re: 7D Mark II on Cameraegg
« on: January 19, 2014, 07:45:57 PM »
Single Digic 5+?
Single Digic 6?
Dual Digic 5+?
or
Dual Digic 6?

According to Wikipedia, the 1DX uses Dual Digic 5+ and a Separate Digic 4 for Intelligent Subject Analysis System. I don't own a 1DX so someone else can confirm.

Source: http://en.wikipedia.org/wiki/DIGIC

I want Dual Digic 6, but it hasn't been put in DSLRs yet - maybe it's specific to P&S?
I strongly doubt it will be a single Digic 5+ but then again...
I seem to remember that previous rumours that talked about specs were saying dual digic5+, but I would not be surprised if it were dual Digic6.... each iteration of the processor seems to be a good jump in speed.. Digic6 probably beats dual Digic5+ so I kind of hope for dual Digic5+...

And to those who say that it can't be Dual Digic6 because that's better than the 1DX has: how come a PowerShot has Digic6 and the 5D3 does not :)

I don't think that DIGIC 6 is what people think it is. DIGIC 6 is used in PowerShot because its new features were designed for the kind of consumer-grade features PowerShot offers. It supports 9.3fps average frame rate (12.2fps continuous up to 5 frames, after which the rate slows), but that is its minor feature. The big features are the way it handles highlight preservation, noise reduction at high ISO, etc. DIGIC 6 is about DSP image processing features, I don't think Canon has ever intended it to be the real replacement for DIGIC 5/5+.

I suspect the 7D II will use dual DIGIC 5+. It doesn't seem all that logical for Canon to create a new DIGIC 7 for the 7D II yet, as the DIGIC 5+ still offers plenty of data processing throughput. The throughput for a pair of DIGIC 5+ can be derived like so:

Code:
dataRate = (14 fps * 19,100,000 pixels * 14 bits) / (8 bits/byte) + overhead
dataRate = 467,950,000 B/s + overhead
dataRate = 468 MB/s + overhead

Assuming Canon didn't create the DIGIC 5+ with exactly 234 MB/s of throughput, it seems logical that each one is capable of 250 MB/s (leaving ~32 MB/s for overhead). At 24mp, we can derive the frame rate of the 7D II if we assume a 500 MB/s data rate (dual DIGIC 5+):

Code:
500,000,000 B/s = (fps * 25,200,000 px * 14 bits) / (8 bits/byte) + 32,050,000 B/s overhead
467,950,000 B/s = (fps * 25,200,000 px * 14 bits) / (8 bits/byte)
467,950,000 B/s * 8 bits/byte = fps * 25,200,000 px * 14 bits
3,743,600,000 bits/s / (25,200,000 px * 14 bits) = fps
fps = 3,743,600,000 bits/s / 352,800,000 bits/frame
fps = 10.611 fps

So, with a pair of DIGIC 5+, the 7D II with a 24mp APS-C sensor could easily reach 10fps, and have even more room left over for overhead than the 1D X. Unless Canon is intending to give the 7D II a 12fps frame rate, I don't see the need for a new DIGIC 6+ or DIGIC 7. Maybe some of the image processing features in the DIGIC 6 could be useful; however, I am not exactly sure what its data throughput rate is. I am pretty sure it isn't actually quite as good as a single DIGIC 5+ (based on what I've been able to derive from a couple of PowerShot megapixel counts and the frame rate for the first five frames, it seems like the DIGIC 6 is capable of 225 MB/s, so it falls short of the DIGIC 5+ by about 25 MB/s.)

It is possible that Canon might create a DIGIC 6+. If they did, and assuming they scale DIGIC 6+ the same way they scaled DIGIC 5+ over DIGIC 5, then a single DIGIC 6+ should be about 3x as powerful as a DIGIC 6. That would put its data throughput rate somewhere around 640 MB/s to 675 MB/s. That would mean that a single DIGIC 6+ would be enough to give the 7D II a 14fps frame rate.

For some reason, I don't really see that happening...not sure why, it just doesn't feel like Canon is ready to drop that particular improvement on us yet. I suspect such a new DIGIC 6+ chip (or maybe they'll call it DIGIC 7) will arrive with the big megapixel camera. A data throughput rate of 700 MB/s would be enough to support 8fps for a 46.7mp FF sensor at 14 bits, and even enough to support 7fps at a full 16 bits!

1529
EOS Bodies / Re: Will Canon Answer the D4s? [CR2]
« on: January 19, 2014, 07:08:13 PM »
This is the most advanced imaging device I know of:

http://spectrum.ieee.org/tech-talk/at-work/test-and-measurement/superconducting-video-camera-sees-the-universe-in-living-color

It uses true superconducting Titanium Nitride (at 0.1 Kelvin, basically absolute zero) to "detect time and energy (thus wavelength) of each photon in real time with zero intrinsic noise." Since it must operate at 0.1 K, it is WELL outside of the realm of consumer grade technology...its only application at the moment is for space telescopes. The intriguing thing about it is the fact that it detects photon energy, so it knows the exact wavelength and therefore the true color of each and every photon encountered during an exposure. It also detects EVERY photon, so it has 100% Q.E., and its design as a spectral power detector means there isn't any electronic noise (however, there would theoretically still be photon shot noise).

The readout mechanism uses a microwave frequency comb to "interrogate" each pixel 2500 times per second. This allows the sensor to be equally sensitive to color from about 100nm (deep ultraviolet) to 5000nm (very deep infrared.) Since the readout is basically achieved by multiple short interrogations, there is no reason that, for longer exposures, dynamic range couldn't effectively be infinite (for shorter exposures, dynamic range would become limited...however, I am unsure of what kind of signal strength this thing achieves for exposures at or below 1/2500th of a second. It would still offer more dynamic range than any current standard CMOS or CCD sensor, probably by several fold.)
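
As a side note, converting between the photon energies the detector measures and the wavelengths quoted above is just the standard λ = hc/E relation. A quick sketch:

Code:
# Photon energy <-> wavelength (lambda = h*c / E), constants in SI units.
h  = 6.62607015e-34    # Planck constant, J*s
c  = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19   # joules per electron-volt

def wavelength_nm(energy_eV):
    """Wavelength in nanometres for a photon of the given energy in electron-volts."""
    return h * c / (energy_eV * eV) * 1e9

print(wavelength_nm(12.4))   # ~100 nm  (deep ultraviolet)
print(wavelength_nm(0.248))  # ~5000 nm (very deep infrared)

So the 100nm-5000nm sensitivity range corresponds to photon energies from roughly 12.4 eV down to about 0.25 eV.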

If, at some point in the distant future, the ability to supercool electronics to near absolute zero becomes "easy", this would basically be the ultimate pinnacle of image sensor technology. We would have perfect color reproduction, perfect electronic current, near-infinite dynamic range (basically only limited by exposure time), etc. The energy requirements for maintaining a temperature of 0.1K would probably drain even a high capacity DSLR battery like that found in the 1D X in seconds, so I suspect this kind of technology would need an always-on power source (i.e. an outlet), or some kind of fuel cell that provided MASSIVE power.

Anyway...given the prior discussion, I remembered this sensor. Had to dig it up again, but it basically represents the ultimate in imaging sensor technology. I don't think you can get better than the ability to detect every single photon, its position on the sensor, its incident time, and its exact energy (frequency). I guess the only real improvement would be to increase the number of actual pixels in the device (the article uses a 2024-pixel (44x46) sensor for deep field astrophotography...that could probably be increased to megapixels.)

1530
EOS Bodies / Re: Patent: Microadjustment Automated
« on: January 19, 2014, 03:56:37 PM »
Automation of the AF adjustment

Magic Lantern's dot_tune module does this, cost: zero, availability: now ... there may be concerns over precision and whatnot, but I'm positive it's quick & worth a shot, esp. over manual attempts. Works for me.

The dot tune approach is not accurate. I've tried using it on multiple occasions with all of my lenses on the 7D, and dot tune always results in incorrect AFMA (sometimes wildly incorrect). Manual tuning and/or Focal are the best ways to tune so far. Automatic AFMA with CDAF+PDAF is probably the only way to really generate accurate AFMA settings.
