A friend called me randomly on Sunday to tell me a story. He said that a schoolkid had invented a system for cars which automatically started applying the brakes if you took your foot off the accelerator really quickly, implying that you were about to do an emergency stop. Because the brakes were applied sooner, by roughly the time it takes to move your foot from one pedal to the other, this change saved 50 yards of stopping distance at 70mph. And I agreed it was a great idea, and I wished I’d thought of it first :-)
But then he said: why can’t we do the same thing with web browsers? When someone mouses over a link, why can’t we start to prefetch it – either DNS prefetch (if we don’t automatically DNS prefetch all links on load; the docs aren’t quite clear on the policy) or even main HTML page prefetch? This would save the time between the mouse coming over the link and us registering the click (which happens at mouseup, not mousedown). I know I pause over links, so the speedup could be significant.
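To make the idea concrete, here’s a minimal sketch of what hover-triggered prefetching might look like if you did it in page script rather than in the browser itself – TypeScript against the DOM, injecting rel="dns-prefetch" and rel="prefetch" hints. The 100ms dwell time, the “skip URLs with query strings” guard and the de-duplication set are all my own assumptions about one plausible policy, not anything a browser actually mandates.

```typescript
// Hypothetical hover-prefetch sketch: on mouseover, hint the browser to
// resolve DNS for the link's host straight away, and prefetch the page
// itself only if the pointer lingers. Thresholds and guards are assumptions.

const PREFETCH_DWELL_MS = 100;          // assumed "really hovering" threshold
const prefetched = new Set<string>();   // avoid issuing the same hint twice

function addHint(rel: "dns-prefetch" | "prefetch", href: string): void {
  const link = document.createElement("link");
  link.rel = rel;
  link.href = href;
  document.head.appendChild(link);
}

document.addEventListener("mouseover", (event) => {
  const target = event.target as Element | null;
  const anchor = target?.closest?.("a[href]") as HTMLAnchorElement | null;
  if (!anchor) return;

  const url = new URL(anchor.href, location.href);
  if (url.protocol !== "http:" && url.protocol !== "https:") return;

  // Cheap win: resolve the hostname as soon as the pointer touches the link.
  if (!prefetched.has("dns:" + url.host)) {
    prefetched.add("dns:" + url.host);
    addHint("dns-prefetch", url.origin);
  }

  // Riskier win: prefetch the page, but only if the hover lasts a little
  // while and the URL looks side-effect free (no query string) – both
  // assumptions on my part about what a sane policy might be.
  const timer = window.setTimeout(() => {
    if (url.search === "" && !prefetched.has(url.href)) {
      prefetched.add(url.href);
      addHint("prefetch", url.href);
    }
  }, PREFETCH_DWELL_MS);

  anchor.addEventListener("mouseout", () => clearTimeout(timer), { once: true });
});
```

A browser doing this natively would obviously hook in below the DOM, but the ordering would be the same: resolve the name the moment the pointer arrives, and only fetch the page once the hover looks deliberate.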
Now, I know prefetching that isn’t explicitly requested by the page (e.g. via <link> tags) has had issues in the past. But FasterFox still seems to do it, so it can’t break all that much of the web. (He says, naïvely.) Is the problem solely one of bandwidth? Or can some web apps break if we prefetch a link the user doesn’t actually end up requesting? And are those apps just badly coded? Would this idea fly?