AMD’s New Space Heater…Err… HD 6990 Hits The Streets With a Thud

Stephen Fung | March 8, 2011 | Press Release

AMD’s new GPU is out on the streets, if you haven’t noticed. Actually, it’s kind of hard not to notice it. I mean, it’d be like missing the Kool-Aid Man as he walked by. Only the Kool-Aid Man is a cool cat, and this new GPU is probably going to keep things nice and toasty for the tail end of winter (but I guess they are both red?).

I sat in on a conference call for this GPU about a week ago and had already made up my mind on it, somewhat, based on what I saw from the slides. Given that a single GTX 580 can trounce an HD 6970 in most benchmarks, simply sandwiching two of them together and calling it an HD 6990 is only going to buy you a TINY head start until nVidia decides to sandwich two GTX 580s together (aka the GTX 590). Not to mention, SLI typically scales better than CrossFire.

I should have known when they said they couldn’t send good ole Futurelooks one due to “limited quantities” that this was basically a technology leadership statement. One that probably won’t sell very many units, so why bother making enough of them to matter? While Futurelooks would normally be celebrating a new GPU launch today with benchmarks, we aren’t. So instead, I’m going to give you a quick low-down of what this thing is made of.

It’s a BIG Radeon HD 6970 Sandwich

As speculated, the HD 6990 (code name: Antilles) is merely two HD 6970s sandwiched together with 4GB of GDDR5 memory (2GB per GPU). A PLX bridge chip links the two GPUs together (that silver thing to the left), allowing them to CrossFire in a single slot. The GDDR5 runs at 1250MHz, which translates to 320GB/s of total bandwidth across the two 256-bit buses, while the two cores are capable of pushing out 5.4 TFLOPS (1.27 TFLOPS double precision) of computing power. Both cores run at 880MHz stock, though I’m sure someone will attempt to overclock that.
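If you want to sanity-check AMD's quoted figures yourself, the arithmetic is straightforward. This is a back-of-the-envelope sketch; the 1536-shader count per Cayman GPU and the per-GPU 256-bit bus are my assumptions, not numbers from AMD's slides.

```python
# Back-of-the-envelope check of the HD 6990's quoted specs.
# Assumptions (not from the article): 1536 stream processors per GPU,
# one 256-bit memory bus per GPU, 2 FLOPs per shader per cycle (FMA).

MEM_CLOCK_MHZ = 1250   # GDDR5 base clock (from the article)
GDDR5_PUMP = 4         # GDDR5 moves 4 bits per pin per clock
BUS_BITS = 256         # per-GPU bus width (assumed)
NUM_GPUS = 2

# Per-GPU bandwidth, then the combined figure AMD quotes.
per_gpu_gbs = MEM_CLOCK_MHZ * 1e6 * GDDR5_PUMP * BUS_BITS / 8 / 1e9
total_bandwidth = per_gpu_gbs * NUM_GPUS
print(f"bandwidth: {total_bandwidth:.0f} GB/s")      # → 320 GB/s

SHADERS_PER_GPU = 1536   # assumed Cayman shader count
CORE_CLOCK_MHZ = 880     # stock clock (from the article)
FLOPS_PER_SHADER = 2     # fused multiply-add per cycle

tflops = (NUM_GPUS * SHADERS_PER_GPU * FLOPS_PER_SHADER
          * CORE_CLOCK_MHZ * 1e6 / 1e12)
print(f"single precision: {tflops:.1f} TFLOPS")      # → 5.4 TFLOPS
```

Both results line up with AMD's quoted 320GB/s and 5.4 TFLOPS, so at least the marketing math checks out.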
Of course, it supports DirectX 11 and everything before it, including OpenGL, and it supports AMD’s new Morphological Anti-Aliasing (MLAA), which makes things prettier by cheating a little (depending on how you look at it). But it really is all about the experience, and there’s nothing here that should ruin a gaming experience.

It’ll Suck Your Power Supply Dry

This GPU sucks up over 375 Watts of power under load. That’s just the GPU alone! That translates into a recommended PSU of 1000 watts. And that’s only the consumption we know of, because AMD’s new PowerTune technology starts throttling power consumption down to control thermals as things heat up. Higher quality power components like the ones pictured above were required to tame this beast and provide an adequate supply of juice. They even added a switch onto the PCB that increases voltage for overclocking, letting you push the card past its stock limits if you so choose. Now, this is probably going to be more power efficient than anything nVidia puts out, but their card will be faster. And if you want this level of performance, who cares about a few more watts?

It Would Burn Your House Down If It Wasn’t For This Cooler

AMD has designed a massive new cooler to control this beast of a card. According to AMD, each card’s cooling is as good as you’re going to get in terms of thermals. They’ve even said that the thermal interface compound is as good as it’s going to get, because it’s based on some military grade/space stuff. That means a lot of sites that yanked the heatsink off before their benchmarks may see some toastier temps. It wouldn’t do AMD any good to lie in this case, so if they say your Arctic Silver 5 isn’t going to beat their stuff, then it probably won’t. The cooling solution is rated for up to 450 Watts of heat dissipation. That means that once you hit 375 Watts maxing this thing out at stock clocks, overclocking will probably send it over the edge. That doesn’t sound very good.
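To make the PowerTune idea concrete, here is a hypothetical sketch of what power-capped clock throttling looks like in principle. The control loop, clock floor, and step size are my illustrative assumptions, not AMD's actual firmware logic; the only figures taken from the article are the 375W board power limit and the 880MHz stock clock.

```python
# Hypothetical PowerTune-style throttle loop. Names and numbers below
# (other than the 375 W cap and 880 MHz stock clock) are assumptions
# for illustration, not AMD's real implementation.

POWER_CAP_W = 375        # board power limit (from the article)
STOCK_CLOCK_MHZ = 880    # stock core clock (from the article)
MIN_CLOCK_MHZ = 500      # assumed throttle floor
STEP_MHZ = 10            # assumed clock step per control tick

def throttle_step(current_clock_mhz: int, measured_power_w: float) -> int:
    """Return the next core clock given the latest power reading."""
    if measured_power_w > POWER_CAP_W:
        # Over the cap: step the clock down to pull power back in line.
        return max(MIN_CLOCK_MHZ, current_clock_mhz - STEP_MHZ)
    # Under the cap: step back toward stock, but never past it.
    return min(STOCK_CLOCK_MHZ, current_clock_mhz + STEP_MHZ)

# A heavy load pushing past the cap forces the clock down...
print(throttle_step(880, measured_power_w=400))  # → 870
# ...and a lighter load lets it recover toward stock.
print(throttle_step(870, measured_power_w=300))  # → 880
```

The upshot for reviewers: the "375 Watts" you measure is already a governed number, so raw power draw alone doesn't tell you how hard the silicon is actually being pushed.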
And remember, AMD put a voltage increase switch on this thing too. You can already see the potential turbulence caused by the rotary blower style fan, so it will be no surprise if this thing sounds like a Dustbuster. Plus, the Volterra VRMs that this thing needs are known to pop and hiss at you. Damn.

EyeFinity is Back. Well, Five Out of 6 Ain’t Bad?

One thing that is well played on AMD’s part is the productivity angle, and EyeFinity is clearly one of my favourite features of AMD cards in general. Triple monitors are a huge productivity booster, and AMD GPUs generally have much sharper pictures than nVidia cards. You also only need one card for triple monitor support, versus having to buy two if you want it on nVidia GPUs. While the HD 6990 does support EyeFinity 3, it doesn’t do their flagship EyeFinity 6 out of the box. Stock cards will only have four miniDisplayPorts and a single Dual-Link DVI port, for a total of five monitors. This is kind of silly considering that most of AMD’s marketing is either “EyeFinity 3” or “EyeFinity 6”. There is no “EyeFinity 5”, last time I checked. Even certain reference HD 5870s, in their “E6” editions, supported this at launch, and this thing has TWO GPUs squished onto it and costs substantially more. Kinda lame not to have the top end features on their flagship card. And no, not a single DisplayPort hub was mentioned in the official press releases. That means no hooking this GPU up to a Samsung MD230X6 HEXA display for you. But then again, with only around 70 games out there that support EyeFinity, I guess it’s not that big of a deal. Still, that’s one less monitor to throw your spreadsheets, Twitter and email up on.

You’ll Need a Second Mortgage

The Radeon HD 6990 is a horrible value. It’s like someone decided that because it’s an HD 6990, they should price it at $699 US. Considering that two potentially more useful HD 6970s will cost you less, you’re not buying this for any reason. Why?
Because if you do, you’ll kick yourself when a dual GPU GTX 580 comes along. Don’t believe me? Check out the slim margins in the reviews of the HD 6990 we posted up earlier today from the peeps who already spent their time on it. I think we’re kinda glad you passed on firing one over, AMD, because it sounds like it would have been a colossal waste of our time. I give this product about a month, two at most, until it’s dethroned. So in a sense, I’m happy no one at Futurelooks had to review one and waste their time. And I think I want to bill back for that conference call now.