NVIDIA GeForce GTX 580 Video Card Performance
Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Tuesday, 09 November 2010
Table of Contents: Page Index
NVIDIA GeForce GTX 580 Video Card Performance
GeForce GTX 580 Closer Look
GeForce GTX 580 Detailed
Features and Specifications
Video Card Testing Methodology
DX10: 3DMark Vantage
DX10: Crysis Warhead
DX11: Aliens vs Predator
DX11: Battlefield Bad Company 2
DX11: BattleForge
DX11: Lost Planet 2
DX9 SSAO: Mafia II
DX11: Metro 2033
DX11: Tom Clancy's HAWX2
DX11: Unigine Heaven 2.1
Overclocking and Temperatures
VGA Power Consumption
NVIDIA APEX PhysX Enhancements
NVIDIA 3D-Vision Effects
Editor's Opinion: NVIDIA Fermi
NVIDIA GeForce GTX 580 Conclusion

NVIDIA 3D-Vision Effects

Readers familiar with Benchmark Reviews have undoubtedly heard of NVIDIA GeForce 3D Vision technology; if not from our review of the product, then from the Editor's Choice Award it's earned, or the many times I've personally mentioned it in our articles. Put simply: it changes the game. 2010 has been a break-out year for 3D technology, and PC video games are leading the way. Mafia II expands on the three-dimensional effects, and improves the 3D-Vision experience with out-of-screen effects. For readers unfamiliar with the technology, 3D-Vision is a feature available only on NVIDIA GeForce video cards.

Mafia II is absolutely phenomenal with 3D-Vision... and with its built-in multi-monitor profiles and bezel correction already factored in, this game is well suited for 3D-Vision Surround. Combining two GeForce GTX 580's in SLI allowed this game to play at 5760x1080 resolution across three monitors using upper-level settings with APEX PhysX enabled, delivering a thoroughly impressive experience. Mafia II was built with 3D Vision in mind, so owners of a 3D Vision kit and a 120Hz monitor are well served here.


The first thing gamers should be aware of is the performance penalty for using 3D-Vision with a high-demand game like Mafia II. Using a GeForce GTX 480 video card as a point of reference, we experienced frame rates up to 33 FPS with all settings configured to their highest and APEX PhysX set to high. Normally, when 3D Vision is enabled the video frame rate decreases by about 50%. This is no longer a hard-and-fast rule, thanks to '3D Vision Ready' game titles that offer performance optimizations. Mafia II proved that the 3D Vision performance penalty can be as little as 30% with a single GeForce GTX 480 video card, or a mere 11% in SLI configuration. NVIDIA Forceware drivers will guide players to make custom-recommended adjustments for each game they play, but PhysX and anti-aliasing will still reduce frame rate performance.
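The arithmetic behind those percentages can be sketched in a few lines of Python. The 33 FPS baseline and the 50%/30%/11% penalties come from the article's GTX 480 example; the function itself is just an illustration of how the numbers combine.

```python
# Estimate the frame rate after enabling 3D Vision, given a baseline FPS
# and a percentage performance penalty. Baseline and penalties are taken
# from the article's GTX 480 example; this is purely illustrative.

def fps_with_3d_vision(baseline_fps: float, penalty_pct: float) -> float:
    """Return the expected frame rate once the 3D Vision penalty is applied."""
    return baseline_fps * (1.0 - penalty_pct / 100.0)

baseline = 33.0  # GTX 480, max settings, APEX PhysX on high

print(fps_with_3d_vision(baseline, 50))  # old rule of thumb: ~16.5 FPS
print(fps_with_3d_vision(baseline, 30))  # single GTX 480 in Mafia II: ~23.1 FPS
print(fps_with_3d_vision(baseline, 11))  # GTX 480 SLI in Mafia II: ~29.4 FPS
```

As the last line shows, a '3D Vision Ready' title in SLI loses only a few frames per second, which is why the feature became far more practical in 2010.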


Of course, the out-of-screen effects are worth every dollar you spend on graphics hardware. In the image above, an explosion sends the car's wheel and door flying into the player's face, followed by metal debris and sparks. When you're playing, this certainly helps to catch your attention... and when the objects become bullets passing by you, the added depth of field helps player awareness.


Combined with APEX PhysX technology, NVIDIA's 3D-Vision brings destructible walls to life. As enemies shoot at the brick column, dirt and dust fly past the player while stones tumble out towards you. Again, the added depth of field can help players pinpoint the origin of an enemy threat and improve response time without sustaining 'confusion damage'.


NVIDIA APEX Turbulence, a new PhysX feature, already adds an impressive level of realism to games (such as Mafia II, pictured in this section). Watching plumes of smoke and flames spill out towards your camera angle helps put you right into the thick of the action.


NVIDIA 3D-Vision/3D-Vision Surround is the perfect addition to APEX PhysX technology, and capable video games prove that these features reproduce lifelike scenery and destruction when used together. Glowing embers and fiery shards shooting past you seem very real when 3D-Vision pairs with APEX PhysX technology, and there's finally a good reason to overpower the PC's graphics system.



# Your power consumption is wrong - DS 2010-11-09 06:45
Because the GTX485, sorry GTX580, throttles clocks in Furmark and OCCT. Please redo the tests with this feature turned off, or with some other stressing program.
# RE: Your power consumption is wrong - Olin Coles 2010-11-09 06:52
I can always tell when a visitor comments without reading the article, because if you had, then you'd know it will do this with every application, since it's controlled at the hardware level. Since there's circuitry limiting power consumption, how do you suggest we 'turn this off'?

I'm asking you, the empowered visitor, since you obviously know the 'right' way to do this after telling me that our way was wrong.
# GTX485 - ChrisW 2010-11-13 21:45
He may be wrong about Furmark and OCCT, but he hit the GTX580's more realistic name on the head!

If anything, at most a GTX490...
# RE: Your power consumption is wrong - Olin Coles 2010-11-13 23:45
If you rename the EXEs to something else (like Crysis.exe) they reveal full power consumption statistics, which are reported in this review.
# No Can Do.... - BruceBruce 2010-11-09 06:55
The "throttling" of the card is based on input from thermal sensors and current sensors. It is not software-specific. It doesn't matter how you stress the card; if it gets too hot, or draws too much current it will take corrective action.
# Actually - Intratech 2010-11-09 07:20
Actually, the limiter is only engaged when the driver detects Furmark or OCCT at present, so you can test it with some other stress-testing application.
# RE: Actually - Olin Coles 2010-11-09 07:25
This isn't a software function. You'll notice from the chart that was linked that the GPU isn't being throttled, and renaming furmark had no effect. Do you have evidence of it working otherwise?
# Power consumption - LED Guy 2010-11-09 12:13
Anandtech had this to say about the power consumption:

Compared to the GTX 480 NVIDIA's power consumption is down 10%...


Anand's numbers also seem to be more in line with other reviews I've looked at. So unfortunately it looks like you need to retest power consumption with another program; why not ask Anand what program they used?

Otherwise, from what I read, good review. Not planning on upgrading so I didn't read the whole article unfortunately. Been a reader for some time now, first comment so thanks for the great articles so far. BTW I have a suggestion/request for the graphics card articles: Add minimum frame rate numbers to the tests, as these are as important as average frame rates, if not more so.

# TechPowerUp - Intratech 2010-11-09 07:33
NVIDIA also explained that this is just a work in progress with more changes to come. From my own testing I can confirm that the limiter only engaged in Furmark and OCCT and not in other games I tested.

# Sorry, full quote here - Intratech 2010-11-09 07:35
At this time the limiter is only engaged when the driver detects Furmark / OCCT, it is not enabled during normal gaming. NVIDIA also explained that this is just a work in progress with more changes to come. From my own testing I can confirm that the limiter only engaged in Furmark and OCCT and not in other games I tested.

Real-time clock monitoring does not show the changed clocks, so besides the loss in performance it could be difficult to detect that state without additional testing equipment or software support.

# Thanks - RealNeil 2010-11-09 07:47
This was the first GTX-580 review to hit my in-box today. I like the fact that this card uses such a small amount of power and still achieves very nice performance. While it's true that the top-end Radeon cards bested it in some of the tests, they don't offer the same compatibility with CUDA and PhysX processing, and that shows up glaringly in some of your results. Not knowing for sure how game development will go as to which technologies each company will embrace, it makes more sense to me to buy the NVIDIA card.
# RE: Thanks - Banzai 2010-11-09 07:57
Well, it matches fairly well to a 5970, which is a dual-GPU card. Honestly, it shows some good results in my opinion; although it may not beat CrossFired 6870's, it can match/beat a 5970 for a similar price.
# Mr. Nobody - Franck 2010-11-09 08:23
The only thing I find a bit disappointing is the choice of the competing cards. For the sake of performance, it is obvious that a single 470 wouldn't do it. It would have been ten times more interesting to know how an SLI of the more popular GTX 460 1GB HAWK or FTW would perform, or even a 470 SLI, since you put a 6870 CrossFire on the stake, and those are on par with the mentioned cards. It would be relevant to know how this newcomer 580 performs against its own kind.
# RE: Mr. Nobody - Olin Coles 2010-11-09 09:03
I'm sorry to ruin your conspiracy theory, but I simply didn't have a second GTX 470 to combine and test in SLI. I don't want to deceive you, so I'll add that I might never have a second 470.

I'm also sorry that all of the work I put into this article didn't provide enough information for you to develop a decent idea of where this card fits.
# Que' - RealNeil 2010-11-09 09:19
So you did this review, and I'm not sure how long it takes to do one of these, but I really appreciate having them to look at. But I wonder if you ever just relax with some of these Wazoo cards in a PC and just game a little. Time is probably a factor.
# RE: RE: Mr. Nobody - Franck 2010-11-09 17:37
Not even two 460's?
Usually these reviews are to form an opinion of what to buy according to what you use, and to guide would-be upgraders.
Well, it doesn't take rocket science to guess the 580 is meant to topple ATI's best dog; this is the obvious part, and everybody will do it.
But what of the average Joe? Since the GTX 460 is the average card on the market and not the 470, what should he do: acquire a 580, or double up his 460? That's the second most important question on the market right now.
If I take it that the 6870 is equal to a HAWK and less than a FTW, according to your own reviews, average Joe could guess that two of those would leave the 580 on the floor panting, as did the double 6870.
But as other factors kick in, average Joe can't be sure, so the review fails him; and the average Joe drives the market bulk, and wants the not-so-obvious answers.
If the review is meant as an "en passant" look at the 580 it's excellent, but it's too obvious; every single review will do it.
# RE: RE: RE: Mr. Nobody - Franck 2010-11-09 17:42
Your reviews usually settle my doubts, and I hardly consult other sites, as they tend to say the same things you do, so you're very trustworthy. SLI or buying a new card? No answer for NVIDIA owners.
I hope you'll have two NVIDIA cards when you start reviewing the non-vanilla 580. I can assure you it's all that matters for many people.
You solved the ATI side, and that's great for them, but we on the NVIDIA side are left hanging.
# RE: RE: RE: RE: Mr. Nobody - Wayne Manor 2010-11-09 19:56
I have 2 1GB GTX 460's and unless I really wanted the brute power of 2 580's, I'm sticking to the 460's as they're probably on par with 1 580. I'll probably wait till the 680's or 7900's till I upgrade next.
# Updated with SLI GTX 460's - Olin Coles 2010-11-10 21:39
I purchased a second overclocked 1GB EVGA GeForce GTX 460 FTW video card specifically for SLI testing, and have updated all of the charts with the new results. I may write a separate article discussing value and performance, but there are no plans to re-write this article at the moment.
# man these guys - sweatshopking 2010-11-09 09:21
Man, you guys always get a tough time on video card reviews. That sucks, guys. Take it easy, quit whining about "omg no 470", etc. Look and see. Buy or don't; up to you, but stop complaining.
# Mmm competition - Haters gonna hate 2010-11-09 11:31
I think this is a great GPU; I just think the price is perhaps a little high at the moment. Plus, you may want to see what the 6970 and 6950 yield from AMD. If nothing else, I bet it forces the price lower on the GTX 580. Olin, I liked the article and appreciate all the time I'm sure it took to get all this info.
# 580 - Pawnshock 2010-11-09 12:04
I just want to say thanks a million for all the effort and time you put into this benchmark review Olin Coles. It is detailed, precise and exactly what I was looking for and I am sure many more will find it helpful as well. Much appreciated.
# Oui... - Chris H 2010-11-09 13:07
Why couldn't they have just called it the GeForce 485 or 490?! Looking at these performance numbers, it would have been more appropriate. When I see a 480 and 580, I expect the 580 to be at least 60% faster than their last model...

Are they really going to have to start their next-gen chip in the 600's?

Hey Olin, GREAT article BTW!
# RE: Oui... - Jack 2010-11-09 14:26
You mean like they did w/ the 5870 and 6870? Wait, that's slower.

Same reason they slap 2011 on car models... it works to sell things.
# RE: RE: Oui... - Franck 2010-11-09 17:43
Exactly, marketing stuff only.
# RE: Oui... - David Ramsey 2010-11-10 09:19
Because the underlying chip has actually changed. Admittedly, the changes were minor: fine-tuning the types of transistors used in various parts of the chip, moving fan control onboard, et cetera so on and such forth. Still, a case could be made for a "4xx" designation. But although NVIDIA has a track record of confusing model numbers, they're still way better than AMD, whose 6xxx series cards are slower than the existing 5xxx series.
# GTX 580 3D - Robert17 2010-11-09 15:04
Olin, having trouble posting questions in the forums; thought you'd like to know.

So here you go, both on and off topic. Nice review, thanks for all your efforts.

With the preponderance of video hardware turning to 3D enabled products I'd like to know how I can un-chart the costs of 3D enabling via hardware if that's possible (I have no depth perception, just ask the US Navy). So I get no bang for the buck with this evolution that others are likely fascinated by. Any general rule of thumb that you may have already gotten your mind around?

Thanks again.
# RE: GTX 580 3D - Olin Coles 2010-11-09 15:29
Robert: please try to start a thread in the forum, or send me a message about the problems you're having. I want to keep the comments on-topic, or else I would discuss it here.
# RE: NVIDIA GeForce GTX 580 Video Card Performance - ivor 2010-11-09 21:49
I'm inclined to think that Benchmarkreviews received a GTX580 with a lower vid core than some of the other reviewers, which explains the meager overclock but sipping power. But anyhow, it just represents the other end of the spectrum where some cards have lower vid. It may not be a good thing though, because Nvidia does not think the gpu can stand higher voltage.
# Outperformed by CrossFire Radeon HD 6870's - Tiago 2010-11-10 06:34
How can you add this as a con!?
You are comparing TWO 6870s to ONE GTX 580, and btw the 580 is a SINGLE-GPU card while the 5970 is a DUAL-GPU card!!

Of course it'll always outperform a 580, but I bet if you guys at Benchmark Reviews added another 580 in SLI you'd get very different results.
# RE: Outperformed by CrossFire Radeon HD 6870's - Olin Coles 2010-11-10 06:42
How can I compare two Radeon HD 6870's to one GeForce GTX 580? Well, two 6870's cost less and provide better FPS performance. That's how. People want to know how much performance they can get for their money, not just how one card stacks up against another single card.
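Olin's price-to-performance argument boils down to a simple cost-per-frame calculation. The sketch below shows the idea; the dollar and FPS figures are placeholders for illustration, not numbers from the review's charts.

```python
# Cost-per-frame comparison: divide purchase price by average FPS.
# The price/FPS pairs below are hypothetical example values, NOT
# figures taken from the review's benchmark charts.

def dollars_per_frame(price_usd: float, avg_fps: float) -> float:
    """Return how many dollars each average frame per second costs."""
    return price_usd / avg_fps

setups = {
    "GeForce GTX 580":          (500.0, 60.0),  # hypothetical values
    "Radeon HD 6870 CrossFire": (480.0, 70.0),  # hypothetical values
}

for name, (price, fps) in setups.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.2f} per frame")
```

With numbers like these, the cheaper two-card setup delivers each frame for less money, which is exactly the comparison Olin says readers care about.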
# Rubbish - Joesph 2010-11-10 15:15
Ive seen tests of the 6870 and there worse than the 5870 as it is a less hungry card, yeh your tests show it isw better than gtx580
what a load of crap
# RE: Rubbish - Olin Coles 2010-11-10 15:21
Joseph, please tell me that beneath your tormented grammar you're also illiterate and blind. These tests used two 6870's in CrossFireX. That means 6870 times two.
# RE: NVIDIA GeForce GTX 580 Video Card Performance - Chris 2010-11-10 15:45
A more substantial upgrade than I thought it would be. Thanks for taking the many hours it takes to make an objective review. It seems that a pair of 6870s represents a better deal for the buyer today than this card. I don't think that much has changed... yet. NVIDIA still has the fastest absolute per-GPU performance, while ATI is better on price:performance ratios. Of course, we'll have to see the high-end ATI cards before making a final judgment.
# Cost Analysis for GTX 460's in SLI - Don 2010-11-10 22:50
Great review. Just wanted to let you know that you are missing the cost analysis for the 460's in SLI, and to point out that there are better-priced 460 cards like the MSI Cyclone 1GB or the Hawk, which are only $199 and $215 respectively right now.

A pair of these in SLI would be $400 - $430, and would perform better than the EVGA cards.
# RE: Cost Analysis for GTX 460's in SLI - Olin Coles 2010-11-10 22:55
Thanks Don, but I literally JUST added the GTX 460 SLI results a few minutes ago. I may or may not update the value analysis later this week.
# RE: Cost Analysis for GTX 460's in SLI - Olin Coles 2010-11-10 23:02
How would they perform better? I think you're mistaken, because the $230 EVGA GTX 460 FTW is clocked to 850/1700/1000 MHz, and the $215 MSI HAWK is 810/1620/975 MHz.
# They perform better when overclocked - Don 2010-11-11 09:44
Sorry I didn't clear that up.

The Hawk & the Cyclone overclock very well, especially the Hawk, as it can easily reach and exceed 900 MHz. But yeah, the EVGA has pretty good clocks as well, since it comes nicely overclocked.

IMO the Hawk is the better value card for a GTX 460 right now @ $215.

Excellent job on the review!

Question - Can you confirm if NVIDIA did in fact get rid of most of their Hyper Compute Performance on the 580?
# No update on power consumption? - LED Guy 2010-11-11 10:46
You still haven't updated the article which says:

"The temperatures discussed below are absolute maximum values..."

But they are NOT absolute maximum values, nor is the power consumption. You state that loaded system power with the 580 is 191W and 315W with the 480. But you also say that you tested temperatures and power consumption with Furmark, which throttles the card. This can be checked if you run the benchmark test, where you will get a LOT lower performance compared to the 480, for example. ALL other reviews out there state the 580 gets you 15-25% better performance than the 480 for the same or SLIGHTLY less power.

But not 124 watts less. And the difference isn't just down to a lower VID like ivor said. Test again with another program, Vantage, Crysis or, like Nordichardware did, try with Kombustor and using the "Post-FX" setting.

Please update the article as soon as possible as it is hugely misleading right now.
# RE: No update on power consumption? - Olin Coles 2010-11-11 13:51
The power section has been updated with details on how we arrived at our results. Essentially, GeForce GTX 580 hits 246W maximum power while using a modified FurMark tool.
# gtx460 sli - belzazar 2010-11-14 18:30
I didn't read the whole article, but I read the FPS testing, and to me it seems that GeForce 460 SLI rocked all the other cards, or am I mistaken? I am about to buy myself a new card, and my choice is between the 580 and 460 SLI. It seems that the 460 SLI has the best performance and is cheaper, but my question is whether the 580 might perform way better after a newer driver, perhaps? Please help me make up my mind; I really would like the newest card, but why buy it when an older card in SLI performs way better?
Thanks for the great articles.
# RE: gtx460 sli - Olin Coles 2010-11-14 19:52
Hopefully you read enough of the article to realize that the GTX 460's we used had the highest factory overclock available: EVGA FTW Edition. Two stock GTX 460's in SLI do not beat a GTX 580, and instead compete with the 480.
# TNX 1680x1050 + settings - Krusher 2010-11-14 22:48
Thanks for doing these tests in 1680x1050 AND showing your settings; most of the reviews I've seen elsewhere only show 1920x1200 and leave out the valuable setup info (so I could not attempt to reproduce it here.)
# RE: TNX 1680x1050 + settings - Olin Coles 2010-11-14 23:17
Back when I was an avid hardware review reader (a little over four years ago), it would always frustrate me to find very little detail on the settings and specifications for each video card tested. I always had to guess at the speeds of a particular product, and whether it was an overclocked version (XFX 7900 GT doesn't say a lot). So when I started Benchmark Reviews, I made it a point to ensure our details made it possible to reproduce our results.
# RE: TNX 1680x1050 + settings - Krusher 2010-11-14 23:58
I have a single eVGA GTX 460 SC (OC'd to the FTW speeds) and halved your SLI numbers to give me a ball-park figure. With 3 days left on my Step-up option to the 580, your results make it really tempting...if eVGA ever catches up on the huge back order! :)
# Thank you - Jay Cee 2010-11-15 08:57
Thank you for the review and all your effort. It seems many people tend to post complaints, but those who are satisfied generally never bother to post a thanks.

Will you comment on the noise level? This has become a greater concern to me as I spend more time and money seeking out quieter components. The only reason I didn't purchase a GTX480 was the noise, so I have been making do with a GTX460. The majority of reviews point out that the GTX580 is much quieter, which seems to be a positive move on NVIDIA's part.
# RE: Thank you - Olin Coles 2010-11-15 09:09
The noise level is nearly silent at idle (as in no audible sound), and under full gaming load it becomes slightly more audible. If I turn the fan up to full power using Afterburner, it's loud enough to hear but still quieter than a GTX 460. That's right, less noise than a GTX 460. The fan noise on a GTX 580 doesn't even compare to a GTX 480, which seemed to get better later into production.
# 3d Digital Artist - Keign 2010-11-15 14:35
This card fits my needs perfectly, because not everything takes advantage of SLI; nearly 90% of the graphics-card-intensive programs I use do not support SLI configs. So I will be upgrading my two 295's to a 580 this very day ;). Thanks for the review.
# yet - Keign 2010-11-15 14:36
And yet it will still probably outperform my 295's in gaming, which is versatile and usable on every medium I need it for.
# dsfgatygun 2010-11-16 08:46
Nice review, bad luck you got such horrible clueless reply's going on here.

A 580 = a package ( it is a stable solution for every game there is )
sli/crossfire/cf = raw power + cheaper but a more unstable way of dealing with older games even new top titles like aliens of predator.

While videocard do the same, the different versions of it are ment for different people.

Even if the gtx580 cost more and doesn't provide on every little aspect on the best way, the card = still a far better choise for most people
then any sli/crossfire/cf unstable solution.

The 580 = a perfect naming for this product. its 20-25% faster / lesser usage of watt / lesser heat / lesser noise.

2x 460's or 2x other budget cards or x2 solutions are great if the card actually produces 200% faster speeds then the 1 cored version. Which it clearly isn't doing. Its even below the 580 or with a minor fps above it. ( on only "newer games")
Report Comment
# Video Card Prices - Olin Coles 2010-11-21 13:50
After the GeForce GTX 580 launch, I felt a little irritated to see only one model sell for $500 while all of the others sold for $520 or more. This throws off my Cost Analysis math on the very first day. Making matters worse, AMD's Radeon HD 5970 could be found in a few places for $500 before launch, but now costs no less than $550 or (much) more.

This game of sinking prices just ahead of launch is getting old, and it ruins my cost analysis every time. So visitors, keep this in mind when you read these reviews, and understand how prices change on a daily (and sometimes hourly) basis.
# Great review - 12Xu 2010-12-09 09:27
Nice job, thanks for the thorough review. Glad to see Nvidia coming back around after such a long drought.
# me - abla 2010-12-14 06:03
The only NVIDIA card that has ever interested me is the GTX 460, in that I can buy two of them for the price of one GTX 480 or GTX 580 and get more performance.

Otherwise, I have become disgruntled with the swift obsolescence of today's technology.
