“Taipei (Taiwan) – AMD pushed Fusion as one of the main reasons to justify its acquisition of ATI. Since then, AMD’s finances have changed colors and are now deep in the red, the top management has changed, and Fusion still isn’t anything AMD wants to discuss in detail. But there are always “industry sources” and these sources have told us that Fusion is likely to be introduced as a half-node chip.
It appears that AMD’s engineers in Dresden, Markham and Sunnyvale have been making lots of trips to the little island of Formosa lately - the home of contract manufacturer TSMC, which will be producing Fusion CPUs. Our sources indicated that both companies are quite busy laying out the production scenarios for AMD’s first CPU+GPU chip.
The first Fusion processor is code-named Shrike, which will, if our sources are right, consist of a dual-core Phenom CPU and an ATI RV800 GPU core. This news is actually a big surprise, as Shrike was originally rumored to debut as a combination of a dual-core Kuma CPU and an RV710-based graphics unit. A few more quarters of development time gave AMD the opportunity to continue working on a low-end RV800-based core to be integrated with Fusion. RV800 chips will be DirectX 10.1 compliant and are expected to deliver a bit more than just a 55 nm to 40 nm die shrink.
While Shrike will debut as a 40 nm chip, the processor is scheduled to transition to 32 nm at the beginning of 2010 - not much later than Intel will introduce 32 nm - and serve as a stop-gap before the next-gen CPU core, code-named “Bulldozer”, arrives. The Bulldozer-based chip, code-named “Falcon”, will debut on TSMC’s 32 nm SOI process, instead of the originally planned 45 nm.
As Fusion is shaping up right now, we should expect the chip to become the first half-node CPU (between 45 nm and 32 nm) in a very long time.”
I can see something like this going nicely in small portable computers and very low-end desktops, or in things being integrated into appliances (fridges, cars, etc.), but never in mid- to high-end desktops or portables.
That does not mean it will be bad though; it has its time and place.
Is there an advantage to combining the CPU and GPU that I'm missing? It just seems like a better version of integrated graphics.
Incorrect, this is far more advanced than any lowly integrated graphics. This is the combination of a high-end VPU with a high-end CPU. Granted, there will be some similarities. But this is still far from 'integrated', as with this 'fusion' technology you have the option to upgrade at your discretion; with integrated graphics, you can't do that. However, that's not the main reason here. The main reason for making such a CPU is that the CPU and VPU can interface more directly. If AMD/ATI executes this move in the same way that AMD executed the release of the dual core, the CPU(s) and VPU(s) will be able to communicate somewhat directly through a super-high-speed bus such as the memory controller rather than the Northbridge.
The other big reason for such a move is heat. Granted, AMD/ATI VPUs are steadily getting more power friendly (thus generating less heat), but the size of the heatsink you can put on such a chip is determined by the placement of the device to be cooled. VPU heatsinks must be relatively small compared to CPU heatsinks.
If executed properly (which it probably will be), we will see this as a very good move that Intel can't even come close to matching for a long while. Remember, nVidia is still quite selfish and self-centered, as they won't share their SLI technology with Intel. AMD, on the other hand, owns ATI and has ready access to Crossfire technology. The advantage here is quite obvious.
Last edited by Mysteryous; August 6, 2008 at 09:03 AM.