A review on dpreview explains Fujifilm's on-sensor phase-detect AF, so it can be done: http://www.dpreview.com/news/1008/10080505fujifilmpd.asp
They claim an AF speed of 0.158 seconds, but DSLR AF is obviously much faster: shooting at 10 frames per second while autofocusing between frames means the whole AF cycle must take less than 0.1 s. So, at least in those Fuji compacts, that system is not as fast.
I'm afraid you may be confusing the DSLR with its AF; they are quite different concepts.
To be frank, 0.16 s is rather quick for a mechanism. But a DSLR uses an optical path reflected by the prism, which gives an accurate image position at the film plane through the finder.
Put another way, CCD or CMOS response is faster than the mechanical movement that drives the lens to the right position: one is electronic, the other is mechanical.
However, the lens has a large mass.
I assure you, the poster Meh has not confused the two.
I somewhat understand your point, though.
Meh is correct (great link, by the way!) that there is, perhaps, an essential problem for mirrorless cameras in phase-detection AF.
My speculation is that the Nikon V1 and J1 could tilt the sensor to induce the phase shift. Looking at the simple geometry of the tilted mirror, you'll see that substituting the actual image sensor for a mirror and a set of secondary sensors placed further away makes no real difference, assuming the slightly longer optical path of an SLR AF system is not a required minimum (and I can't see any reason to assume it must be). The only changes are that the distances and the length of the sensors would have to be computed slightly differently when setting up the system, and those figures applied to the portion of the image sensor being sampled.
I rather hope they figured out a better way to do it, though. The "hybrid" description, if true, suggests they're using the image sensor already to provide some contrast detection as well, so why not also sample a few pixels at a time and apply a phase detection routine to those as well?
Phase-detection AF is inherently only slightly faster, and definitely not more accurate than contrast detection. In practice even that speed edge may be irrelevant if your scene fools either system; phase detection can be fooled as well, if two coincidental peaks in the signals hitting the phase sensors don't correspond to the same point (though this seems a nearly freak situation, and I have a hard time envisioning how it could occur if the light hitting each phase-detect sensor comes from the same portion of the image, except perhaps at very close distances where some parallax error could occur).
Both systems have to physically move the lens in order to get a second point for comparison, unless we are already focused on a point sharp enough to satisfy the camera's AF system (of any type). As Meh's link demonstrates, phase detection doesn't work on an entire image at once; instead it compares two one-dimensional samples (two lines of values strung along the sensor).
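To make that one-dimensional comparison concrete, here is a minimal, hypothetical sketch in Python (all names and pixel values invented for illustration; no camera firmware actually looks like this): the two line sensors see the same scene detail displaced by an amount proportional to the focus error, and the camera finds that displacement by trying shifts and scoring the match.

```python
def best_shift(line_a, line_b, max_shift=4):
    """Return the shift of line_b (in sensor pixels) that best matches
    line_a, scored by sum of squared differences over the overlap.
    Real systems normalize for the overlap length; this toy version
    just keeps max_shift small so the detail stays in every overlap."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Overlapping regions of the two samples for this trial shift.
        a = line_a[max(0, -s):len(line_a) + min(0, -s)]
        b = line_b[max(0, s):len(line_b) + min(0, s)]
        err = sum((x - y) ** 2 for x, y in zip(a, b))
        if err < best_err:
            best, best_err = s, err
    return best

# The same bright edge seen by the two line sensors, displaced 3 pixels:
sensor_a = [0, 0, 0, 10, 50, 10, 0, 0, 0, 0, 0, 0]
sensor_b = [0, 0, 0, 0, 0, 0, 10, 50, 10, 0, 0, 0]
print(best_shift(sensor_a, sensor_b))  # -> 3
```

The point is that a single pair of samples yields both the direction and an estimate of the magnitude of the correction, which is why phase detection can drive the lens without hunting.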
Software contrast detection (or, better yet, fast edge detection, as seen in "focus peaking," seen here) on a full image, at least in theory, still allows the camera to immediately confirm correct focus without moving the lens elements at all. Even better, it gives the camera (assuming sufficient CPU resources and a fast enough EVF, OVF, or Live View display, as Cetalis mentioned) the ability to display this so you can see it in action yourself, whereas phase detect is a process hidden from the user (though that is often considered a good thing, because it happens while you are viewing the scene through the viewfinder).

Of course, as the Stanford graphics resource link says, phase detect works immediately in practice, but there is a chance (as with contrast detection) that the lens movement will initially be in the wrong direction or by the wrong amount, which I mentioned earlier. An additional problem is presented, as we all know, when there isn't a physical phase-detection sensor present where we want to focus.

A big potential benefit of contrast-detection AF is that it can theoretically be applied to as many pixels from the image sensor as you can capture quickly, though of course reading a full image sensor quickly is nowhere near as fast as using a phase-detection array. That would also help contrast detection overcome its potential to be misled by certain patterns in scenes. Cost savings for manufacturers are probably a big draw of contrast-detect AF as well: no separate AF array is required, which eliminates one fixed cost and one complex CMOS (or CCD) circuit from production.
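For comparison, a contrast-detect system only needs a focus metric it can maximize: nudge the lens, re-measure, and reverse direction when the score drops. A minimal sketch, with a 1-D row of pixels standing in for the focus region and all values invented for illustration:

```python
def contrast(pixels):
    """Focus metric: sum of squared neighbor differences. Sharper
    images have stronger local gradients, hence higher scores."""
    return sum((a - b) ** 2 for a, b in zip(pixels, pixels[1:]))

sharp  = [0, 0, 100, 100, 0, 0]    # a hard edge
blurry = [0, 25, 75, 75, 25, 0]    # the same edge, defocused

# The AF loop hill-climbs this score: step the lens, and if the score
# falls, reverse; stop when no step improves it.
print(contrast(sharp) > contrast(blurry))  # -> True
```

Note that, unlike the phase comparison, a single score says nothing about direction: the camera has to move the lens and measure again to learn which way to go, which is the hunting behavior the thread describes.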
Looking at the phase-detection chart, you'd think it would work immediately and accurately every time, but it assumes there is a bright (contrasty) point in the scene to lock focus onto. We all know from experience that even good AF systems coupled with good lenses can misfocus on a low-contrast target, so the essential problem of contrast detection holds true for phase detection as well. This can be seen clearly in the Java applet: the lines hitting the sensor do not merely represent a "point of sharp focus," but light that contrasts with the light hitting the other portions of the linear AF sensor.
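A toy illustration of that shared weakness, with invented numbers: score how well a low-contrast, gently repeating signal matches shifted copies of itself, the way a phase comparison would. Several shifts score identically or nearly so, so there is no unambiguous lock.

```python
def match_error(line, shift):
    """Sum of squared differences between the line and a copy of
    itself shifted by `shift` pixels (overlapping region only)."""
    a, b = line[:len(line) - shift], line[shift:]
    return sum((x - y) ** 2 for x, y in zip(a, b))

# A blank-ish, faintly repetitive surface: almost no detail to lock onto.
flat = [50, 50, 51, 50, 50, 51, 50, 50]

print([match_error(flat, s) for s in range(4)])
# -> [0, 4, 4, 0]: a shift of 3 scores as "perfect" as no shift at all
# (a false lock), and the shifts in between are barely worse.
```

With no distinct minimum to pick, the system either hunts or locks on the wrong offset, which is exactly the low-contrast misfocus the post describes.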
And yes, contrast detection does guess at the direction and magnitude of corrections: link