Crossfire X vs SLI

Apr 29, 2009, 15:49
Sandra Prior

Sandra Prior

New cards mean another new round in the ongoing multi-GPU battle.

With the launch of the HD4800 series, AMD is again forcing its competitor, NVIDIA, to go toe-to-toe on price rather than performance. The GT200 dominates everything that came before in a straight one-on-one battle, but the Radeons are once more ganging up and attacking en masse. AMD's argument is that many low-cost cards working together are better than one big, expensive one.

Viewed this way, CrossFire - as AMD's multi-card tech is known - makes more sense than the competing SLI. After all, why take two NVIDIA cards into the shower when one will do? A single GTX 280 will easily outperform anything else on the market without needing to be paired up. On the other hand, anyone who bought a GeForce 8800GT last year - and there were loads of us - will surely be watching as the price for a suitable partner tumbles.

While it's still being presented as a revolutionary idea, we're used to hardware zerging like this now. Indeed, it's a cheeky move by AMD to claim for its own the territory that NVIDIA first broached with SLI, all those years ago. The question is, with single CPUs getting ever more powerful and games engines standing relatively still, is this so much smoke and marketing mirrors?

Both companies use similar techniques to get their cards working together in harmony. Games are - wherever possible - profiled for the best possible performance increases. By default, the drivers use Alternate Frame Rendering (AFR), where one card renders the current frame while the other card prepares the next. In rarer cases, split-frame rendering - where pixels from a single frame are load balanced between the two cards - will make a game run faster.

Some competition gamers swear by split-frame rendering, arguing that the minor latency introduced in AFR can affect fast-paced games, but for most of us the drivers will simply select AFR and we won't be any the wiser. Indeed, with AMD's control panel you won't have any choice; but while you can customize profiles for NVIDIA cards, it's unlikely you'll ever need to.
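The difference between the two modes boils down to how work is divided. A minimal sketch of the idea follows; the function names and the even scanline split are invented for illustration - real drivers do this load balancing internally, and SFR split points shift per frame.

```python
# Illustrative sketch only: how work might be divided under Alternate
# Frame Rendering (AFR) versus split-frame rendering (SFR). Function
# names and the frame model are invented for this example.

def afr_schedule(num_frames, num_gpus=2):
    """AFR: whole frames alternate between cards - GPU 0 renders
    frame 0, GPU 1 renders frame 1, and so on, round-robin."""
    return [frame % num_gpus for frame in range(num_frames)]

def sfr_schedule(frame_height, num_gpus=2):
    """SFR: one frame's scanlines are divided between the cards.
    Real drivers rebalance the split point each frame; here we
    simply divide the rows evenly."""
    rows_per_gpu = frame_height // num_gpus
    split = []
    for gpu in range(num_gpus):
        start = gpu * rows_per_gpu
        end = frame_height if gpu == num_gpus - 1 else start + rows_per_gpu
        split.append((gpu, start, end))
    return split

# AFR: six frames alternate between two cards.
print(afr_schedule(6))     # [0, 1, 0, 1, 0, 1]
# SFR: each card takes half of a 1080-row frame.
print(sfr_schedule(1080))  # [(0, 0, 540), (1, 540, 1080)]
```

Note the trade-off the sketch makes visible: AFR keeps each card working on a complete frame (good throughput, but the finished frame is always one step behind), while SFR finishes each frame cooperatively, which is why some competitive players prefer it.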

Both companies, too, require a hardware bridge to connect the cards together using internal connectors inside the PC. This gives a direct point of communication between the cards independent of the potentially congested PCI Express bus, but isn't fast enough to carry all the data they need to share. So, are you better off going for the very best single GPU card you can lay your hands on, or should you look for a more arcane arrangement of graphics chips? And if you do, should you opt for SLI or CrossFire?

Back to Basics

A superficial glance back at the last 12 months and the answer would seem to favor multi-GPU arrays. NVIDIA's 9800GX2 - two G92 chips on one card - reigned supreme in the performance stakes up until the launch of the GTX 280. By coupling two GX2s together you got the Quake-like power-up of Quad SLI, and framerates that would make your eyes bleed.

AMD, meanwhile, stuck to its guns and released the HD3870X2, a dual-chip version of its highest-end card. In the same kind of performance league as a vanilla 9800GTX, it may not have been elegant but it was great value for money.

That's just the off-the-shelf packages. With the right motherboard, two, three or even - in AMD's case - four similar cards can be combined to create varying degrees of gaming power. AMD also had a paper advantage in that HD3850s and HD3870s could be mixed in configurations of up to four cards.

Both companies even went as far as to release Hybrid SLI and Hybrid CrossFire, matching a low-end integrated graphics chip with a low-end discrete graphics chip. The result in both was much less than the sum of their parts: two rubbish GPUs which, when combined, were still poor for gaming.

And right there, at the very bottom, is where the argument for multi-GPU graphics starts to steadily unravel. Despite all the time that's passed since SLI first reappeared, the law of diminishing returns on additional graphics cards remains. Unfortunately, two cards are not twice as fast as one card, and adding a third card will often increase performance by mere single figure percentages.
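Those diminishing returns are easy to put numbers on. The efficiency figures below are illustrative assumptions, not benchmarks from any particular game:

```python
# A toy model of multi-GPU scaling under diminishing returns.
# The gain percentages are illustrative, not measured results.

def effective_fps(base_fps, per_card_gain):
    """base_fps is the single-card framerate; per_card_gain[i] is the
    fractional speedup contributed by each additional card."""
    fps = base_fps
    for gain in per_card_gain:
        fps *= (1 + gain)
    return fps

# Suppose a second card adds 70% and a third adds only 8%:
print(round(effective_fps(60, [0.70])))        # 102
print(round(effective_fps(60, [0.70, 0.08])))  # 110
```

Under those assumptions, the second card earns its keep while the third adds a handful of frames for the full price of another GPU.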

That, of course, is if they work together at all. Even now, anyone going down the multiple graphics route is going to spend a lot more time installing and updating drivers to get performance increases in their favorite games. Most infamously, Crysis didn't support dual-GPUs until months after its release, and even then it still required a hefty patch from the developers to get two cards to play nicely together. It's now legend that the one game that could really have benefited from a couple of graphics cards refused point blank to make use of them.

That's very bad news for owners - or prospective owners - of GX2 or X2 graphics cards, which require SLI or CrossFire drivers; so another strike then for the single card. It would be churlish to say things haven't improved at all recently, but suffice to say that in the course of putting together this feature, we had to re-benchmark everything three times more than is normally necessary, because driver issues had thrown up curious results.

Before you even get to installing software, though, there's a bucket load of other considerations to take into account. First and foremost is your power supply: people looking to bag a bargain by linking together two lower-end cards will often find that they have to spend another $100 or so on a high-quality power supply - one that can not only put out enough juice for the whole system, but also has enough six- or eight-pin PCI-E power connectors for all the graphics cards.
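Sizing that power supply is simple arithmetic. The wattages below are placeholder figures for illustration - check your actual cards' specifications before buying:

```python
# Back-of-the-envelope PSU sizing for a multi-card setup. All the
# wattages used below are illustrative placeholders, not measured
# figures for any real card.

def psu_needed(card_watts, num_cards, rest_of_system_watts, headroom=0.3):
    """Total system draw plus a safety margin (30% headroom assumed)."""
    draw = card_watts * num_cards + rest_of_system_watts
    return draw * (1 + headroom)

# Two hypothetical 110 W cards plus a 250 W system:
print(round(psu_needed(110, 2, 250)))  # 611 -> shop for a quality ~650 W unit
```

The headroom matters as much as the headline number: a cheap unit that can barely meet the total on paper is exactly the sort that goes out in a blue flash.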

Many is the upgrader who's witnessed the dreaded 'blue flash of death', when the PC equivalent of St Elmo's Fire indicates that the $30 power supply you thought was a bargain is, in fact, destined for a quick trip to the recycling center.

Even more critical with the current generation of cards, though, is heat dissipation. All of AMD's HD4800 series can easily top 90°C under load, and a couple of cards in your PC will challenge any amount of airflow you've painstakingly designed for. To make matters even worse, many motherboards stack their PCI-Express ports so closely together the heatsinks are almost touching. The HD3850s are single slot cards, but that means they vent all their heat inside the case.

On the NVIDIA side of things, size is more of an issue. The new cards - both the GTX 260 and GTX 280 - are enormous. Even though they're theoretically able to pair up on existing motherboards, it's unlikely you'll find one with absolutely nothing at all - not even a capacitor - protruding between the CPU socket and the bottom of the board, because the merest jumper out of place will prevent these two from sitting comfortably.

If all this is beginning to sound a little cynical, let's point out that there have been bright developments in recent history. Most notable is last year's introduction of PCI Express 2.0, which means there are more motherboards out there with at least two PCI-E sockets running at the full 16x bandwidth, so it's easier to keep both cards fed full of game data at all times.