As for the idea that few people understand all the tech behind it, I agree - I'm one of them, trying to learn more. But I'm definitely not convinced that the progression to 16 bit is a bad idea. It needs to be implemented properly, and I wouldn't be surprised if it became something we all understood a bit better in the coming year or two...
Going to 16 bit wouldn't hurt anything in the imaging process, but it isn't going to help anything either. It's not going to record better color transitions, just extra random noise. All it would do is make RAW files larger, because the files would now contain extra information about the random noise which previously was not being recorded. Larger files without any actual benefit to the images are not something we should be asking for.
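To illustrate (a toy sketch with made-up numbers, not real sensor data): once the read noise spans several output counts, quantizing more finely than the noise recovers nothing extra about the signal.

```python
import random

# Toy illustration (made-up numbers, not real sensor data): one pixel
# level with 3 counts of read noise, digitized at a 14-bit step (1.0)
# and at a 16-bit step (0.25).
random.seed(0)

true_level = 1000.25   # underlying signal, in 14-bit ADU
read_noise = 3.0       # read noise sigma, also in 14-bit ADU

samples = [random.gauss(true_level, read_noise) for _ in range(100_000)]
q14 = [round(s) for s in samples]            # 14-bit quantization
q16 = [round(s * 4) / 4 for s in samples]    # 16-bit quantization

mean14 = sum(q14) / len(q14)
mean16 = sum(q16) / len(q16)

# Both quantizations recover the level equally well: the noise already
# dithers the signal, so the two extra bits only digitize that noise
# more finely.
print(mean14, mean16)
```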
Why are you so sure it will record just extra noise but no more useful information?
People have studied this issue. See the following technical explanation related to noise and bit depth: http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/noise-p3.html
Also, see this rather lengthy discussion: http://www.luminous-landscape.com/forum/index.php?topic=60672.0
In short, representatives of a medium format digital back dealer essentially concede that when the term "16 bit" is applied to MF sensors, it is marketing shorthand meant to convey to potential customers that the MF sensor will have better tonal qualities than DSLRs of equal megapixels, even though the actual reasons for those better tonal qualities lie elsewhere. This may be fine for MFDB buyers, since they are not being misled and their cameras do produce better tonal qualities. The problem which has resulted from this seemingly innocent bit of marketing, however, is that some people have been led to think that making DSLR sensors 16 bit will produce the tonal qualities of MF.
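Roughly, the article's argument boils down to a back-of-envelope calculation (the numbers below are illustrative, not measurements of any actual camera):

```python
import math

# Levels finer than the read noise can't be told apart, so the bit
# depth worth recording is roughly log2(full-well capacity / read
# noise), measured in electrons.
def useful_bits(full_well_e, read_noise_e):
    return math.log2(full_well_e / read_noise_e)

# Hypothetical DSLR-ish figures: 50,000 e- full well, 5 e- read noise.
print(useful_bits(50_000, 5))  # ~13.3 -> a 14-bit ADC already covers it
```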
I read this article from Chicago some time ago, and thanks - I've read it again :-) It's hard not to agree with its contents, written by Prof. Martinec (all the more so since I've only got master's and engineering degrees in computer science, not a professorship). But let me point out a few circumstances:
1. The examples showing no difference between the original image and the image with 2 bits clipped don't really apply to this discussion - they were meant to show that there's no visible difference on screen, while we are talking about the useful information available for later image manipulation. If we take one picture 14 bits deep and another 16 bits deep, convert them directly to 8-bit JPEGs and display them on screen, then most probably we won't see much difference - I'd say no difference at all. But if you manipulate them in PS, then depending on how far you push them, you'll see a difference sooner or later.
2. The long part of the article about noise is based on real values measured in real devices like the 1D3 or 40D and compared across devices. The read noise of the sensor plays an important part there. But what if it changed a little in the next generation of sensors? What if a new sensor used some other technology - say, a method of reading each pixel's value not once after exposure but sampling it at a frequency of, say, 1 MHz, which could eliminate some read noise and improve DR? I'm not saying such sampling would necessarily help; I'm only pointing out that some of the conclusions might no longer hold for such a new type of sensor. So what if those additional bits were not just recording more noise?
3. I don't think 16-bit RAWs would make my photos any better than 14-bit RAWs, because in most cases I wouldn't know what to do with the extra bits. At the same time, I suspect the guys at Adobe would :-)
4. I think everyone here has heard that there was once supposed to be a world market for no more than 5 computers, and that 640KB of computer memory would be enough :-) So why not 32-bit RAWs?
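Coming back to point 1, the posterization concern can be sketched like this (a deliberately noise-free toy example - whether real sensor noise dithers the banding away is exactly the point of disagreement):

```python
# Toy example: apply a strong exposure push to a *noise-free* dark
# gradient. Coarser quantization leaves fewer distinct output levels,
# which shows up as banding after heavy manipulation.
def distinct_levels_after_push(bits, stops=4, n=4096):
    full = 2 ** bits
    # a dark gradient covering the bottom 1/16 of the range
    ramp = [i / n * (full / 16) for i in range(n)]
    quantized = [round(v) for v in ramp]               # ADC step = 1 ADU
    pushed = [q * (2 ** stops) for q in quantized]     # +4 stop push
    return len(set(pushed))

# 14-bit keeps 1025 distinct levels in this range, 16-bit keeps 4096.
print(distinct_levels_after_push(14), distinct_levels_after_push(16))
```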
BTW: Back in the era of 386s@40MHz and 486s@50MHz, if I had told my professor that in 20 years there would be processors running at 2.4GHz and graphics memory clocked at 7GHz, it would have been the best joke he'd heard all week. In one cycle of a 2GHz clock, light (or any other electromagnetic wave) travels about 15cm in vacuum...
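And a quick sketch of the multiple-readout idea from point 2 (again just a toy model with independent Gaussian read noise per read - real, correlated noise wouldn't average down this cleanly):

```python
import random

# If each of N reads of a pixel carries independent read noise,
# averaging them shrinks the noise by sqrt(N) -- and only then would
# extra ADC bits start to carry signal rather than noise.
random.seed(1)

READ_NOISE = 3.0  # sigma of a single read, in ADU
N = 100           # reads per pixel

def stdev(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

single = [random.gauss(0.0, READ_NOISE) for _ in range(20_000)]
averaged = [sum(random.gauss(0.0, READ_NOISE) for _ in range(N)) / N
            for _ in range(2_000)]

print(stdev(single), stdev(averaged))  # ~3.0 vs ~0.3
```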