I wouldn't call it HDR. HDR is a very misused term as it is. In its proper form, a High Dynamic Range image is an image with an EXCESSIVELY HIGH dynamic range, stored as 32-bit floating point numbers with extremely fine precision and a dynamic range that could potentially equal thousands of stops (i.e. it can represent numbers from a couple billion down to billionths).
HDR as it is commonly (mis)used simply refers to the mapping of tones into a limited dynamic range from a source file that might have a slightly higher dynamic range. What Canon is doing isn't exactly HDR; it is a specialized read process that will allow them to better utilize the dynamic range they already have access to, but which is otherwise being diminished by read noise.
I didn't mean to call this process HDR in the strict sense - you're right, it is a misused term.
Regardless of the exact meaning, I was thinking of the common understanding of HDR along the lines of:
...HDR compensates for this loss of detail by capturing multiple photographs at different exposure levels and combining them to produce a photograph representative of a broader tonal range... (wikipedia)
I could be wrong, but isn't that the same idea? Creating the equivalent of two different exposures by applying two different gain levels and then combining them. The difference is pushing it onto the sensor rather than post-processing in software, so there is no need to take multiple shots at different exposure.
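To make the idea concrete, here's a rough sketch (my own assumption of how such a merge might work, not Canon's actual process) of combining a low-gain and a high-gain read of the same exposure: use the high-gain data in the shadows where its read noise is lower, and fall back to the low-gain data where the high-gain read has clipped. The function name, the gain ratio, and the clip threshold are all made up for illustration.

```python
import numpy as np

def combine_dual_gain(low_gain, high_gain, gain_ratio=4.0, clip=0.95):
    """Merge two reads of one exposure taken at different gains.

    Both inputs are assumed normalized to [0, 1], where the high-gain
    read clips at 1.0. gain_ratio and clip are illustrative values.
    """
    # Scale the high-gain read down to the low-gain exposure scale.
    high_scaled = high_gain / gain_ratio
    # Prefer the (cleaner) high-gain data wherever it hasn't clipped;
    # fall back to the low-gain read in the highlights.
    return np.where(high_gain < clip, high_scaled, low_gain)
```

The merged result keeps the highlight headroom of the low-gain read and the shadow cleanliness of the high-gain read, which is the same trick a two-shot software HDR merge pulls off, just done per-pixel from a single exposure.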
Yeah, pretty much. I don't know exactly how they get the two reference signals, but in the end, the gain isn't huge. Canon sensors currently get around 11.5 stops on average. This could allow them to get ~13.5 stops on average unless they move to an ADC with a higher bit depth. If they do move beyond a 14-bit ADC, then it would definitely be a lot more like hardware HDR (imagine 15.5 stops or thereabouts for a 16-bit ADC).
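The back-of-envelope math behind those stop counts can be sketched like this (the numbers below are assumed round figures, not Canon specs): dynamic range in stops is log2 of full-well capacity over read noise, and an n-bit ADC can only encode about n stops, so the ADC is a hard ceiling no matter how clean the read gets.

```python
import math

def usable_stops(full_well_e, read_noise_e, adc_bits):
    """Rough engineering dynamic range in stops, capped by ADC bit depth.

    full_well_e and read_noise_e are in electrons; both values here are
    illustrative assumptions, not measured sensor specs.
    """
    sensor_stops = math.log2(full_well_e / read_noise_e)
    # An n-bit ADC can distinguish at most ~n stops of tonal range.
    return min(sensor_stops, adc_bits)

# e.g. a hypothetical 50,000 e- full well with 3 e- of read noise is
# ~14 stops of sensor DR, so a 14-bit ADC clamps it either way.
```

This is why lowering effective read noise (the dual-gain trick) helps right up until the sensor's native range exceeds the ADC, at which point a 16-bit converter would be needed to see further gains.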