The problem is that on current cameras (DSLRs in live view and MILCs), the live image, the histogram, and the highlight warnings ('blinkies') are all based on the JPEG conversion. From a WYSIWYG standpoint that makes sense – the live image in the VF/LCD should reflect your picture style, HTP, etc. But for judging exposure it's not that helpful, because what appears blocked or blown in the JPEG may be completely recoverable from the RAW file. It's not a perfect solution, but Google 'Canon UniWB' for an interesting workaround.

If I were making an AE system for a mirrorless camera, I'd exploit the fact that the sensor sees the actual image, so you can measure how bright the brightest areas are and how dark the darkest are. If you're getting both zeros and max values, then by default try to equalize the number of pixels clipping off each end.
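That last idea could be sketched roughly like this – a hypothetical heuristic (not any camera's actual AE algorithm), assuming 14-bit raw values where 0 is blocked and 16383 is clipped, and an invented step size of 1/3 EV:

```python
def clip_balance_exposure_step(raw_pixels, max_value=16383, step_ev=1/3):
    """Hypothetical AE heuristic: compare the counts of raw pixels
    clipped at each end of the histogram and nudge exposure toward
    equalizing them. Returns a suggested exposure change in EV."""
    shadows = sum(1 for p in raw_pixels if p <= 0)          # blocked pixels
    highlights = sum(1 for p in raw_pixels if p >= max_value)  # blown pixels
    if highlights > shadows:
        return -step_ev   # more blown highlights: expose less
    if shadows > highlights:
        return +step_ev   # more blocked shadows: expose more
    return 0.0            # clipping roughly balanced: hold exposure

# Example: a frame with more blown highlights than blocked shadows
frame = [0, 0, 5000, 16383, 16383, 16383]
print(clip_balance_exposure_step(frame))  # negative: reduce exposure
```

In a real implementation you'd run this over a downsampled raw readout and probably weight the two ends differently, since blown highlights are usually less recoverable than deep shadows.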