Picking up immediately from where we left off yesterday with our review of NVIDIA’s new GeForce GTX 580, we have a second GTX 580 in house courtesy of Asus, who sent over their ENGTX580. With a second card in hand we’re taking a look at GTX 580 SLI performance and more; we’ll also be examining the relationship between voltage and power consumption on the GTX 580, along with clock-normalized benchmarking to see just how much of the GTX 580’s improved performance is due to architecture and additional SMs, and how much is due to its clockspeed advantage.

  Asus ENGTX580 GTX 580 GTX 480 GTX 460 1GB
Stream Processors 512 512 480 336
Texture Address / Filtering 64/64 64/64 60/60 56/56
ROPs 48 48 48 32
Core Clock 782MHz 772MHz 700MHz 675MHz
Shader Clock 1564MHz 1544MHz 1401MHz 1350MHz
Memory Clock 1002MHz (4008MHz data rate) GDDR5 1002MHz (4008MHz data rate) GDDR5 924MHz (3696MHz data rate) GDDR5 900MHz (3600MHz data rate) GDDR5
Memory Bus Width 384-bit 384-bit 384-bit 256-bit
Frame Buffer 1.5GB 1.5GB 1.5GB 1GB
FP64 1/8 FP32 1/8 FP32 1/8 FP32 1/12 FP32
Transistor Count 3B 3B 3B 1.95B
Manufacturing Process TSMC 40nm TSMC 40nm TSMC 40nm TSMC 40nm
Price Point ~$510 $499 ~$420 ~$190
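
Before we get to the clock-normalized numbers, a quick back-of-the-envelope calculation based on the specs above (ours, not NVIDIA's) separates how much of the GTX 580's theoretical shader throughput advantage over the GTX 480 comes from the extra shader units versus the higher clocks:

```python
# Rough upper bound on the GTX 580's shader throughput advantage over the
# GTX 480, using the clocks and unit counts from the table above. Real
# games rarely scale perfectly with either factor, so treat this as a
# ceiling rather than a prediction.
gtx480 = {"shaders": 480, "shader_mhz": 1401}
gtx580 = {"shaders": 512, "shader_mhz": 1544}  # shader clock = 2x core on Fermi

unit_gain = gtx580["shaders"] / gtx480["shaders"]         # ~+6.7% from the extra CUDA cores
clock_gain = gtx580["shader_mhz"] / gtx480["shader_mhz"]  # ~+10.2% from clockspeed
combined = unit_gain * clock_gain                         # ~+17.6% combined

print(f"Extra shader units:  +{(unit_gain - 1) * 100:.1f}%")
print(f"Higher shader clock: +{(clock_gain - 1) * 100:.1f}%")
print(f"Theoretical ceiling: +{(combined - 1) * 100:.1f}%")
```

The clock-normalized testing is meant to show how much of the real-world gain tracks the clockspeed component of that ceiling versus everything else.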

As you may recall from our launch article yesterday, NVIDIA would only make a second GTX 580 available to us for SLI testing if we also accepted and reviewed a high-end gaming system, an offer which we declined. As a result we were unable to look at GTX 580 SLI performance right away. However, Asus quickly came to our aid and sent us one of their first GTX 580s, giving us a second card to work with both for SLI testing and as a second data point. Since yesterday afternoon we’ve been hard at work seeing what a pair of NVIDIA’s latest and greatest are capable of.

It shouldn’t come as any surprise that as a launch-day card, the ENGTX580 is a nearly identical copy of the GTX 580 reference design. Asus is using the reference PCB and cooler, and is differentiating the card through a very token 10MHz factory overclock and the possibility of a much greater overclock through voltage adjustment using their SmartDoctor utility (which we do not have in hand at this time). At this point the factory overclock has us scratching our heads, however, as this is the second Asus card we’ve received with such an overclock. We’re not ones to look a gift horse in the mouth when it comes to a free performance boost, but a 10MHz (1.3%) core overclock? It’s the very definition of a token overclock: not enough to make a measurable difference in performance. We’re still trying to get to the bottom of this one…
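
For the sake of argument, here is what that bump is worth under the generous assumption that performance scales perfectly with core clock (it doesn't, so treat this as an upper bound):

```python
# Upper-bound value of Asus's 10MHz factory overclock, assuming perfect
# scaling with core clock. Real games scale worse than this.
reference_mhz = 772  # stock GTX 580 core clock
asus_mhz = 782       # ENGTX580 core clock

gain = asus_mhz / reference_mhz - 1
print(f"Clock advantage: {gain * 100:.1f}%")             # ~1.3%
print(f"At 60 fps, that's at most {60 * gain:.1f} fps")  # ~0.8 fps, within run-to-run noise
```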

About Last Night

Prior to the actual launch of the GTX 580, we were concerned about what availability would be like. With NVIDIA engaging in such a quick development cycle for GF110 and being unwilling to discuss launch quantities, we didn’t think they could do it. We’re glad to report that we were wrong, and the GTX 580 has been in steady supply since the launch yesterday morning. Kudos to NVIDIA for proving us wrong here and hitting a hard launch; it’s the kind of follow-through that helps make up for the drawn-out launch of the GTX 480 and GTX 470.

Actually getting a GTX 580 is turning out to be a curious affair, however. When we first saw Newegg post their GTX 580s for sale our jaws dropped, as they were all $50-$80 over NVIDIA’s MSRP; the GTX 580 is already an expensive card, and selling it over MSRP isn’t doing NVIDIA any favors. However, after checking out MWave, Tiger Direct, the EVGA Store, and others, we saw at least one card at MSRP at each store. Were NVIDIA and their partners price gouging, or was it something else? The truth, as it turns out, lies somewhere in the middle.

At this point Newegg is the 800lb gorilla of computer parts retailers; they have the largest volume and, as far as we can figure, they get the bulk of the launch cards allocated to the United States. So what they’re doing is usually a good barometer of what pricing and availability are going to be like; this week is the exception. As it turns out Newegg is running a 10% sale on all video cards via a well-known promo code, and as best as we can tell, rather than excluding the GTX 580 from the sale they simply hiked up the price on all of their GTX 580 cards so that prices land at or around MSRP after the promo code is applied. The end result is that the cards look like they’re going well over MSRP when they’re not. Judging from pricing at Newegg and elsewhere it looks like there is some slight gouging going on (we can only turn up a couple of cards that are actually at $499 instead of $509/$519), but ultimately GTX 580 prices aren’t astronomical like they appeared at first glance. Between the promo-code pricing and everything else, this will probably go down as one of the weirder launches.
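
To illustrate what that looks like in practice, here is the arithmetic with a few representative list prices in the $50-$80-over-MSRP range we saw; these are illustrative figures rather than specific Newegg SKUs:

```python
# How Newegg's 10% promo code interacts with the inflated GTX 580 list
# prices. The list prices below are representative examples, not quotes
# of specific listings.
MSRP = 499
PROMO_DISCOUNT = 0.10

for list_price in (549, 569, 579):
    after_code = list_price * (1 - PROMO_DISCOUNT)
    print(f"Listed ${list_price} -> ${after_code:.0f} after the code "
          f"({after_code - MSRP:+.0f} vs. MSRP)")
```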

Asus’s ENGTX580: A Second Data Point

With a second GTX 580 in hand we have a second data point for the GTX 580’s physical attributes. As we’ve noted time and time again, with the GeForce 400 (and now 500) series NVIDIA has moved to using a range of VIDs for each product instead of a single VID for every card. The result is that, much like CPUs, the power consumption and resulting cooling/noise characteristics of a product can vary from card to card.

Our reference GTX 580 shipped with a load voltage of 1.037v, notably higher than the sub-1v load voltage of our GTX 480; that NVIDIA can run GF110 at a higher voltage while still cutting power consumption is a solid example of how they have been able to reduce leakage on their GPUs. As luck would have it, our Asus GTX 580 comes with a different voltage, 1.000v, giving us some idea of what the VID range is going to be for the GTX 580 and what a card with a “good” GPU might be like.

GeForce GTX 480/580 Load Voltages
Ref GTX 480 Load: 0.959v
Ref GTX 580 Load: 1.037v
Asus GTX 580 Load: 1.000v

Not surprisingly, with a lower load voltage our Asus card consumes less power in all of our tests. We’ll jump right into the charts here and dissect things.

Under Crysis system power consumption is 20W lower, putting this GTX 580 under the Radeon HD 5970 instead of over it, but also within 10W of the 6850CF, the GTX 470 (with its fused-off SMs), and even the GTX 285. Going by power consumption this card is only slightly worse than the GTX 285, a far cry from the GTX 480 and the 421W of system power consumption we see with it.

The situation is much the same with Program X, where power consumption has dropped 26W to 426W. Here it’s not quite as close to the GTX 470, but it’s still only a dozen watts or less off of the GTX 285 and 6850CF.
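
As a rough sanity check, and only that: dynamic power scales approximately with the square of voltage at a fixed clock, so the VID difference alone should be worth a drop of about this size. The card-level load figure below is our own assumption for illustration, not a measured number:

```python
# First-order estimate: dynamic power ~ V^2 at a fixed clock. Leakage and
# board losses don't follow this exactly, so this is a sanity check, not
# a measurement.
v_reference = 1.037    # reference GTX 580 load voltage
v_asus = 1.000         # Asus ENGTX580 load voltage
card_load_watts = 245  # assumed card-level draw under load (illustrative)

scaling = (v_asus / v_reference) ** 2
savings = card_load_watts * (1 - scaling)
print(f"V^2 scaling factor: {scaling:.3f}")              # ~0.930
print(f"Estimated card-level savings: ~{savings:.0f}W")  # ~17W
```

Wall measurements also pick up power supply conversion losses, which is part of why the deltas we measure at the wall (20W and 26W) come in a bit higher than this card-level estimate.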

However it turns out the effect on temperature and noise isn’t as great as we’d assume. These two aspects are of course dependent on each other, as temperatures drive fan speeds and vice versa. For whatever reason our Asus GTX 580 gets slightly warmer and slightly louder than our reference GTX 580, even though we’ve already determined that its power consumption, and hence its heat output, is lower. Some of this may come down to BIOS programming by Asus, but we don’t really have a great explanation for why power consumption can drop while temperatures slightly rise. For the moment we’re entertaining the idea that the difference lies in assembly, and that the reference GTX 580 has a different thermal paste application than the Asus card.

In any case, from these two data points we can clearly determine that power consumption can differ from our reference card; whether temperatures and noise will meaningfully differ is still in question. Ultimately we’d like to find out the full VID range of the GTX 580, if only to get an idea of how our cards compare to the complete spectrum of possibilities.

Comments (82)

  • slick121 - Thursday, November 11, 2010 - link

    Wow so true, this is a slap in the face, off to other sites for a more unbiased review.
  • Gonemad - Wednesday, November 10, 2010 - link

    ...blow a $1100 hole in your pocket? Yes it can!

    Can it make you consider purchasing a 1200W power supply as something not THAT preposterous? Yes it can! (well, the 480 pair already did, so...)

    Considering a caseless or vapor mod case also not that insane? That too!

    I guess waiting until this card reaches the price/performance charts will take a while. On the other hand as far as performance goes...
  • iwodo - Wednesday, November 10, 2010 - link

    One of the things I never liked about SLI or Crossfire is that they need driver support for each specific game to take advantage of the 2nd graphics card.

    Have we solved this problem yet?
  • Spazweasel - Wednesday, November 10, 2010 - link

    I've never had to install game-specific drivers to take advantage of SLI in the games I play, and I've had an SLI rig for nearly three years (2x 8800GT). I just update my vid drivers once every few months. It's true that there are often performance tweaks for individual games in a given driver version, but I've never found a game that just doesn't work under SLI with whatever driver version is at hand. When did you last try?

    As for Crossfire... my "guest" PC is all-AMD (Athlon II X4 620, 4870, 785 chipset) and is a fine machine. Every time I consider going Crossfire on that rig, I check the various tech sites and game support sites, and see issues with Crossfire being reported far more frequently than SLI. This points to a situation that has existed for a while: AMD makes faster hardware for the money, but nVidia overall does a better job with drivers, particularly in multi-GPU scenarios, and from the game developers I know, seems more interested in working closely with game devs.

    Which is why when friends ask me about gaming builds, my usual answer (depending upon the products both vendors have at the time) is "Single vid card, go with AMD, dual vid card, go with nVidia". There have been exceptions: 8800GT in its day was just plain the best, and 460 GTX until very recently was also the best single-GPU solution in its price bracket. The overall trend seems pretty steady with regard to single-GPU vs. multi-GPU, though.
  • Finally - Wednesday, November 10, 2010 - link

    "Single vid card, go with AMD, dual vid card, go download a proper brain"

    Who drops in another card after 2 years, if there is a new card available that's not only about 100% faster but also brings new features to the table? (e.g. DirectX 11, tessellation, Eyefinity, etc.)
  • Sufo - Thursday, November 11, 2010 - link

    Um, I got 2 5850s for less than the price of a single GTX 580, which they consistently outperform. Dual GPU is a legitimate solution, in the short term at least.

    You're right that it becomes a less sensible option after a fair amount of time, assuming the tech has moved on significantly. However, expect PC GPU tech to stagnate for a while (as evidenced by the very marginal improvements displayed by the 6xxx and 5xx series), at least until the next round of consoles are out.

    Right now is a great time to buy a top of the line system.
  • Finally - Thursday, November 11, 2010 - link

    You DO know that the marginal improvements from 58xx to 68xx stem from the fact that the new top of the line 69xx are yet to be launched?

    Yes, GPU tech will stagnate because all they have to master are some 3rd grade console ports that only turn out so few fps because the process of porting them over to the pc is done as quickly and cheaply as possible?

    If there was such a thing as a native PC game anymore, you probably would see all those DX11 features put into practice.

    Right now it's simply ridiculous. An HD4870 or a GTX580 will play any console-ported crap you throw at it... more performance has become irrelevant as there is no game that demands it.

    Oh, there is Crysis, right.
    And how long has that been out now?
    I'm really not in the mood to pick up this vegetation benchmark in disguise and look at it again...

    And then there are games that run at 200+ fps instead of 60+ fps. *yawn* Please wake me up when you reach 500+ fps with your GTX 580 SLI so I can walk over to my bed for some real deep sleep...
  • mapesdhs - Wednesday, November 10, 2010 - link


    Spazweasel, please see my site for useful info, comparing 8800GT SLI vs.
    4890 CF vs. GTX 460 SLI:

    http://www.sgidepot.co.uk/misc/pctests.html
    http://www.sgidepot.co.uk/misc/stalkercopbench.txt

    and these new pages under construction:

    http://www.sgidepot.co.uk/misc/uniginebench.txt
    http://www.sgidepot.co.uk/misc/x3tcbench.txt

    Hope this helps! :)

    Ian.
  • Spazweasel - Wednesday, November 10, 2010 - link

    Thanks, Ian... still happy with my 8800GT SLI setup though. :) It's been nothing but amazing for me. Not looking to upgrade just yet. Let's see what the 560 GTX looks like...
  • mapesdhs - Wednesday, November 10, 2010 - link


    Yep, 8800 GT SLI does run rather well, though as my results show
    they fall behind for newer games at higher res/detail.

    Summaries I've posted elsewhere show that if one is playing older
    games at lesser resolutions, then using a newer card to replace
    an older SLI setup (or instead of adding an extra older card) will
    not give that much of a speed boost, if any (look at the 4890 data
    vs. 8800GT). For older games, newer cards only help if one also
    switches to a higher res/detail mode. Newer cards' higher performance
    is focused on newer features (e.g. SM3); performance levels
    for older features are often little changed.

    Ian.
