Microstuttering is, in fact, a real phenomenon. That particular bit of 'proof', on the other hand, is a crock of shite.
All that video shows is someone playing Crysis at WAY too high a detail level for their card to handle. Considering the fps count never breaks 20, the picture is far too choppy for microstuttering to even be a meaningful issue. Show me a Crysis demo that never drops below 40fps, or a Source game that never drops below 60fps, and THEN start looking for problems.
It's a brutally overblown problem these days, but it does actually exist. The best explanation I've seen is that the cards/cores receive work faster than they can deliver it evenly, so the rendered frames appear at irregular intervals instead of evenly spaced ones. In a system with highly-clocked processors and memory hubs, that imbalance is much easier to hit. With the latest generation of cards, the gap between the speed/power of the GPUs and the rest of the system is no longer as pronounced, making it less of an issue.
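If you want to see why an fps counter can hide this, here's a quick Python sketch. The timestamps are made-up numbers purely for illustration, not measurements from any real card:

```python
# Toy illustration of microstutter: two GPUs alternating frames can
# deliver them in uneven pairs. The AVERAGE looks fine, but the gap
# between consecutive frames is anything but regular.

# Hypothetical frame timestamps in milliseconds (illustrative only):
# a short gap, then a long one, repeating.
timestamps = [0, 8, 33, 41, 66, 74, 99, 107, 132]

# Frame-to-frame intervals between consecutive timestamps.
intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
avg_ms = sum(intervals) / len(intervals)

print("frame-to-frame intervals (ms):", intervals)   # [8, 25, 8, 25, ...]
print("average interval: %.1f ms -> %.1f fps" % (avg_ms, 1000 / avg_ms))
# The counter reports ~60 fps, but your eye sees the 25 ms gaps:
# the picture judders as if it were running at ~40 fps.
```

Same average fps, completely different feel: that's the whole argument in one terminal output.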
There are countless people who NEVER notice anything with their multi-GPU setups, and game happily ever after (until they find out there's a bigger card coming out, anyway). And there are a couple of us who wondered why our gaming experience was WAY out of whack with what FRAPS was claiming, and only learned why later on.
i7 2600K | ASUS Maximus IV GENE-Z | 580GTX | Corsair DDR3-2133
I can honestly say I've never experienced this, not with SLI'd 6800GS's, not with Crossfired HD3870's, and certainly not with the HD4870X2. Playing games at stupid resolutions doesn't prove there's a serious issue, it just magnifies a smaller one that the majority of people won't even notice.