NVIDIA GeForce GTX 580 Video Card Performance
Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Tuesday, 09 November 2010
Table of Contents: Page Index
NVIDIA GeForce GTX 580 Video Card Performance
GeForce GTX 580 Closer Look
GeForce GTX 580 Detailed
Features and Specifications
Video Card Testing Methodology
DX10: 3DMark Vantage
DX10: Crysis Warhead
DX11: Aliens vs Predator
DX11: Battlefield Bad Company 2
DX11: BattleForge
DX11: Lost Planet 2
DX9 SSAO: Mafia II
DX11: Metro 2033
DX11: Tom Clancy's HAWX2
DX11: Unigine Heaven 2.1
Overclocking and Temperatures
VGA Power Consumption
NVIDIA APEX PhysX Enhancements
NVIDIA 3D-Vision Effects
Editor's Opinion: NVIDIA Fermi
NVIDIA GeForce GTX 580 Conclusion

Editor's Opinion: NVIDIA Fermi

My opinion of NVIDIA's Fermi architecture has changed in the months since it was first announced, due in large part to refinements made to the graphics processor. Testing NVIDIA's GF100 GPU held its own set of challenges, and the video cards based on this graphics processor often seemed condemned by an inherited legacy of issues. From the flagship GeForce GTX 480 down to the GTX 465, Fermi impressed gamers with strong FPS performance... and that was about it. Thermal output and power consumption were unfashionably high, a weakness AMD consistently targeted in its marketing attacks. Then along came GF104 on the GeForce GTX 460, a video card that completely changed the game.

NVIDIA's GeForce GTX 460 not only changed the collective opinion of the Fermi architecture, it also changed the GPU landscape. AMD held the upper hand by releasing a DirectX-11 video card first, but they've painted themselves into a corner with their Evergreen GPU. Unlike NVIDIA's Fermi architecture, which can shape-shift as desired, AMD's Cedar, Redwood, and Juniper GPUs are all simply slices of the same processor: Cypress. This is where intelligent consumers will spot the flaw: AMD came to the (video) card game and showed their entire hand on the first deal, while NVIDIA kept a few spare aces up their sleeve. NVIDIA's GeForce GTX 480 is only 15/16 of the complete GF100 package, and we're just beginning to see what's possible with the 7/8-enabled GF104 GPU in the GTX 460. Now that AMD has unwrapped their Barts (Cypress refresh) GPU, NVIDIA returns to offer all 512 Fermi cores in the GF110 GPU.

Now that NVIDIA has finally cooked Fermi with the perfect blend of tessellation, shaders, and texture units, consumers are able to see what they've been missing. We poked around the inner workings for our NVIDIA GF100 GPU Fermi Graphics Architecture article, but the modular nature of this processor left a lot of uncertainty at the time. With AMD soon to launch their own counter-attack in the Cayman-equipped Radeon HD 6970, it's unclear just how long NVIDIA will keep its position on the throne of discrete graphics. AMD proved that Barts can do more with less, just as NVIDIA has done with its reworked Fermi architecture, so it now becomes a matter of finding the acceptable price point for each segment. Of course, this all depends on yield... something both vendors have struggled with as they continue to depend on TSMC for results.


NVIDIA GeForce Fermi Graphics Card Family

With Barts already on the shelf and the GTX 580 arriving this week, AMD and NVIDIA are once again even-steven in their competition for DirectX-11 supremacy. Now all they need are some highly anticipated video games to showcase tessellation like never before and, as a result, increase demand for their products. Titles such as Tom Clancy's H.A.W.X.2 certainly showcase beautifully realistic terrain, but low-end graphical processing requirements don't exactly help sell high-performance video cards. It's games like Lost Planet 2 that will crush performance and push sales of top-end products, but interest in this genre isn't nearly as strong as in FPS titles such as Call of Duty: Black Ops or Battlefield: Bad Company 2. Unfortunately for AMD and NVIDIA, most game developers are targeting the hardware inside console gaming systems, not the more powerful desktop computer.

Even if you're only after raw gaming performance and have no real-world interest in CUDA, there's reason to appreciate the Fermi GF100-series GPU. New experience-enhancing products such as NVIDIA GeForce 3D Vision double the demands on frame-rate output and, as a direct result, require more graphics processing power. This is where Fermi-based products deliver the performance necessary to enjoy the extended gaming experience. I'm a huge fan of the 3D experience, which is why 3D Vision earned our Editor's Choice Award and I've written an NVIDIA 3D-Vision Multimedia Resource Guide; at the moment, only Fermi-based GeForce video cards deliver the power necessary to drive up to three monitors for 3D-Vision Surround.

Some older game titles will also benefit from the Fermi architecture, beyond a simple increase in video frame rates. For example, Far Cry 2 (among others) will receive 32x CSAA functionality native to the game, and future NVIDIA Forceware driver updates could add further features to other existing co-developed video games. NVIDIA's R260 Forceware release introduced features enthusiasts have been wanting for quite some time; my favorite is the removal of previous driver files and extensions. Additionally, NVIDIA NEXUS technology brings CPU and GPU code development together in Microsoft Visual Studio 2008 for a shared process timeline, and NEXUS also introduces the first hardware-based shader debugger. NVIDIA's GF100-series are the first GPUs ever to offer full C++ support, the programming language of choice among game developers.

Fermi is also the first GPU to support Error Correcting Code (ECC) protection of data in memory. ECC was requested by GPU computing users to enhance data integrity in high-performance computing environments, and it is a highly desired feature in areas such as medical imaging and large-scale cluster computing. Naturally occurring radiation can cause a bit stored in memory to be altered, resulting in a soft error; ECC technology detects and corrects single-bit soft errors before they affect the system. Fermi's register files, shared memories, L1 caches, L2 cache, and DRAM memory are all ECC protected, making it not only the most powerful GPU for HPC applications, but also the most reliable. In addition, Fermi supports industry standards for checking data during transmission from chip to chip: all NVIDIA GPUs include support for the PCI Express standard CRC check with retry at the data link layer, and Fermi also supports the similar GDDR5 standard for CRC check with retry (aka "EDC") during transmission of data across the memory bus.
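The detect-and-correct behavior described above can be sketched with a toy Hamming(7,4) code, where recomputing parity yields a "syndrome" that points directly at any single flipped bit. This is only an illustration of the principle: real GPU ECC uses wider SECDED codes over whole memory words in hardware, and the function names here are invented for the example, not NVIDIA's API.

```python
# Toy Hamming(7,4) code: encodes 4 data bits with 3 parity bits so that
# any single flipped bit (a "soft error") can be located and corrected.

def encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4            # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4            # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4            # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

def correct(c):
    """Recompute parity; a nonzero syndrome is the 1-based flipped-bit index."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:                  # soft error detected: flip the bit back
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]       # recover the original data bits
```

Flipping any one of the seven stored bits still decodes to the original data, which is exactly the guarantee ECC memory provides against single-bit soft errors.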



 

Comments 

 
# Your power consumption is wrong - DS 2010-11-09 06:45
Because the GTX485, sorry, GTX580, throttles clocks in Furmark and OCCT. Please redo the tests with this feature turned off, or with some other stressing program.
 
 
# RE: Your power consumption is wrong - Olin Coles 2010-11-09 06:52
I can always tell when a visitor comments without reading the article, because if you had, you'd know that it will do this with every application; it's controlled at the hardware level. Since there's circuitry limiting power consumption, how do you suggest we 'turn this off'?

I'm asking you, the empowered visitor, since you obviously know the 'right' way to do this after telling me that our way was wrong.
 
 
# GTX485 - ChrisW 2010-11-13 21:45
He may be wrong about Furmark and OCCT, but he hit the GTX580's more realistic name on the head!

If anything, at most a GTX490...
 
 
# RE: Your power consumption is wrong - Olin Coles 2010-11-13 23:45
If you rename the executables to something else (like Crysis.exe), they reveal full power consumption statistics, which are reported in this review.
 
 
# No Can Do.... - BruceBruce 2010-11-09 06:55
The "throttling" of the card is based on input from thermal sensors and current sensors. It is not software-specific. It doesn't matter how you stress the card; if it gets too hot, or draws too much current, it will take corrective action.
 
 
# Actually - Intratech 2010-11-09 07:20
Actually, the limiter is only engaged when the driver detects Furmark or OCCT at present, so you can test it with some other stress-testing application.
 
 
# RE: Actually - Olin Coles 2010-11-09 07:25
This isn't a software function. You'll notice from the chart that was linked that the GPU isn't being throttled, and renaming FurMark had no effect. Do you have evidence of it working otherwise?
 
 
# Power consumption - LED Guy 2010-11-09 12:13
Anandtech had this to say about the power consumption:

Quote:
Compared to the GTX 480, NVIDIA's power consumption is down 10%...


##anandtech.com/show/4008/nvidias-geforce-gtx-580/17

Anand's numbers also seem to be more in line with other reviews I've looked at. So unfortunately it looks like you need to retest power consumption with another program; why not ask Anand what program they used?

Otherwise, from what I read, good review. Not planning on upgrading, so I didn't read the whole article, unfortunately. Been a reader for some time now; this is my first comment, so thanks for the great articles so far. BTW, I have a suggestion/request for the graphics card articles: add minimum frame rate numbers to the tests, as these are as important as average frame rates, if not more so.

Cheers
 
 
# TechPowerUp - Intratech 2010-11-09 07:33
Quote:
NVIDIA also explained that this is just a work in progress with more changes to come. From my own testing I can confirm that the limiter only engaged in Furmark and OCCT and not in other games I tested.

##techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/3.html
 
 
# Sorry, full quote here - Intratech 2010-11-09 07:35
Quote:
At this time the limiter is only engaged when the driver detects Furmark / OCCT, it is not enabled during normal gaming. NVIDIA also explained that this is just a work in progress with more changes to come. From my own testing I can confirm that the limiter only engaged in Furmark and OCCT and not in other games I tested.

Real-time clock monitoring does not show the changed clocks, so besides the loss in performance it could be difficult to detect that state without additional testing equipment or software support.


##techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/3.html
 
 
# Thanks - RealNeil 2010-11-09 07:47
This was the first GTX-580 review to hit my in-box today. I like the fact that this card uses such a small amount of power and still achieves very nice performance. While it's true that the top-end Radeon cards bested it in some of the tests, they don't offer the same compatibility with CUDA and PhysX processing, and that shows up glaringly in some of your results. Not knowing for sure which technologies each company will embrace as game development goes forward, it makes more sense to me to buy the NVIDIA card.
 
 
# RE: Thanks - Banzai 2010-11-09 07:57
Well, it matches fairly well to a 5970, which is a dual-GPU card. Honestly, it shows some good results in my opinion; although it may not beat CrossFired 6870's, it can match/beat a 5970 for a similar price.
 
 
# Mr. Nobody - Franck 2010-11-09 08:23
The only thing I find a bit disappointing is the choice of competing cards. For the sake of performance, it's obvious that a single 470 wouldn't do it. It would have been ten times more interesting to know how an SLI pair of the more popular GTX 460 1GB HAWK or FTW would perform, or even a 470 SLI, since you put a 6870 CrossFire on the stake and those are on par with the mentioned cards. It would be relevant to know how this newcomer 580 performs against its own kind.
 
 
# RE: Mr. Nobody - Olin Coles 2010-11-09 09:03
I'm sorry to ruin your conspiracy theory, but I simply didn't have a second GTX 470 to combine and test in SLI. I don't want to deceive you, so I'll add that I might never have a second 470.

I'm also sorry that all of the work I put into this article didn't provide enough information for you to develop a decent idea of where this card fits.
 
 
# Que' - RealNeil 2010-11-09 09:19
So you did this review, and I'm not sure how long it takes to do one of these, but I really appreciate having them to look at. I wonder, though, if you ever just relax with some of these wazoo cards in a PC and game a little. Time is probably a factor.
 
 
# RE: RE: Mr. Nobody - Franck 2010-11-09 17:37
Not even two 460s?
Usually these reviews are to help form an opinion of what to buy according to what you use, and to guide would-be upgraders.
It doesn't take rocket science to guess the 580 will topple ATI's best dog; that's the obvious part, and everybody will cover it.
But what about the average Joe? Since the GTX 460, not the 470, is the average card on the market, what should he do: acquire a 580, or double up on his 460? That's the second most important question on the market right now.
If I take it that the 6870 is equal to a HAWK and less than an FTW, according to your own reviews, the average Joe could guess that two of those would leave the 580 on the floor panting, as the double 6870 did.
But as other factors kick in, the average Joe can't be sure, and then the review fails him; the average Joe drives the bulk of the market and wants the not-so-obvious answers.
If the review is meant as an "en passant" look at the 580, it's excellent, but it's too obvious; every single review will do that.
 
 
# RE: RE: RE: Mr. Nobody - Franck 2010-11-09 17:42
Your reviews usually settle my doubts, and I hardly consult other sites, as they tend to say the same things you do, so you're very trustworthy. SLI or buying a new card? There's no answer for NVIDIA owners.
I hope you'll have two NVIDIA cards when you start reviewing the non-vanilla 580. I can assure you it's all that matters to many people.
You solved the ATI side, and that's great for them, but we on the NVIDIA side are left hanging.
 
 
# RE: RE: RE: RE: Mr. Nobody - Wayne Manor 2010-11-09 19:56
I have two 1GB GTX 460's, and unless I really wanted the brute power of two 580's, I'm sticking with the 460's, as they're probably on par with one 580. I'll probably wait till the 680's or 7900's before I upgrade next.
 
 
# Updated with SLI GTX 460's - Olin Coles 2010-11-10 21:39
I purchased a second overclocked 1GB EVGA GeForce GTX 460 FTW video card specifically for SLI testing, and have updated all of the charts with the new results. I may write a separate article discussing value and performance, but there are no plans to re-write this article at the moment.
 
 
# man these guys - sweatshopking 2010-11-09 09:21
Man, you guys always get a tough time on video card reviews. That sucks, guys. Take it easy, quit whining about "OMG, no 470," etc. Look and see. Buy or don't; up to you, but stop complaining.
 
 
# Mmm competition - Haters gonna hate 2010-11-09 11:31
I think this is a great GPU; I just think the price is perhaps a little high at the moment. Plus, you may want to see what the 6970 and 6950 yield from AMD. If nothing else, I bet it forces the price lower on the GTX 580. Olin, I liked the article and appreciate all the time I'm sure it took to get all this info.
 
 
# 580 - Pawnshock 2010-11-09 12:04
I just want to say thanks a million for all the effort and time you put into this benchmark review, Olin Coles. It is detailed, precise, and exactly what I was looking for, and I am sure many more will find it helpful as well. Much appreciated.
 
 
# Oui... - Chris H 2010-11-09 13:07
Why couldn't they have just called it the GeForce 485 or 490?! Looking at these performance numbers, it would have been more appropriate. When I see a 480 and a 580, I expect the 580 to be at least 60% faster than the last model...

Are they really going to have to start their next-gen chip in the 600's?

Hey Olin, GREAT article, BTW!
 
 
# RE: Oui... - Jack 2010-11-09 14:26
You mean like they did with the 5870 and 6870? Wait, that's slower.

Same reason they slap 2011 on car models... it works to sell things.
 
 
# RE: RE: Oui... - Franck 2010-11-09 17:43
Exactly. Marketing stuff only.
 
 
# RE: Oui... - David Ramsey 2010-11-10 09:19
Because the underlying chip has actually changed. Admittedly, the changes were minor: fine-tuning the types of transistors used in various parts of the chip, moving fan control onboard, et cetera so on and such forth. Still, a case could be made for a "4xx" designation. But although NVIDIA has a track record of confusing model numbers, they're still way better than AMD, whose 6xxx series cards are slower than the existing 5xxx series.
 
 
# GTX 580 3D - Robert17 2010-11-09 15:04
Olin, having trouble posting questions in the forums; thought you'd like to know.

So here you go, both on and off topic. Nice review; thanks for all your efforts.

With the preponderance of video hardware turning to 3D-enabled products, I'd like to know how I can un-chart the costs of 3D enabling via hardware, if that's possible (I have no depth perception; just ask the US Navy). So I get no bang for the buck with this evolution that others are likely fascinated by. Any general rule of thumb that you may have already gotten your mind around?

Thanks again.
 
 
# RE: GTX 580 3D - Olin Coles 2010-11-09 15:29
Robert: please try to start a thread in the forum, or send me a message about the problems you're having. I want to keep the comments on-topic, or else I would discuss it here.
 
 
# RE: NVIDIA GeForce GTX 580 Video Card Performance - ivor 2010-11-09 21:49
I'm inclined to think that Benchmark Reviews received a GTX580 with a lower core VID than some of the other reviewers, which explains the meager overclock but sipping power. Anyhow, it just represents the other end of the spectrum, where some cards have a lower VID. It may not be a good thing, though, because NVIDIA doesn't think the GPU can stand higher voltage.
 
 
# Outperformed by CrossFire Radeon HD 6870's - Tiago 2010-11-10 06:34
How can you add this as a con!?
You are comparing TWO 6870s to ONE GTX 580, and BTW the 580 is a SINGLE-GPU card while one 6970 is a DUAL-GPU card!!

Of course it'll always outperform a 580, but I bet if you guys at Benchmark Reviews added another 580 in SLI you'd get very different results.
 
 
# RE: Outperformed by CrossFire Radeon HD 6870's - Olin Coles 2010-11-10 06:42
How can I compare two Radeon HD 6870's to one GeForce GTX 580? Well, two 6870's cost less and provide better FPS performance. That's how. People want to know how much performance they can get for their money, not just how one card stacks up against another single card.
 
 
# Rubbish - Joesph 2010-11-10 15:15
Ive seen tests of the 6870 and there worse than the 5870 as it is a less hungry card, yeh your tests show it isw better than gtx580
what a load of crap
 
 
# RE: Rubbish - Olin Coles 2010-11-10 15:21
Joseph, please tell me that beneath your tormented grammar you're also illiterate and blind. These tests used two 6870's in CrossFireX. That means 6870 times two.
 
 
# RE: NVIDIA GeForce GTX 580 Video Card Performance - Chris 2010-11-10 15:45
A more substantial upgrade than I thought it would be. Thanks for taking the many hours it takes to make an objective review. It seems that a pair of 6870s represents a better deal for the buyer today than this card. I don't think that much has changed... yet. NVIDIA still has the fastest absolute per-GPU performance, while ATI is better on price:performance ratios. Of course, we'll have to see the high-end ATI cards before making a final judgment.
 
 
# Cost Analysis for GTX 460's in SLI - Don 2010-11-10 22:50
Great review. Just wanted to let you know that you're missing the cost analysis for the 460's in SLI, and to point out that there are much better 460 cards at a cheaper price, like the MSI Cyclone 1GB or the Hawk, which are only $199 and $215 respectively right now.

A pair of these in SLI would be $400 - $430, and would perform better than the EVGA cards.
##newegg.com/Product/Product.aspx?Item=N82E16814127534&cm_re=gtx_460_1gb-_-14-127-534-_-Product
 
 
# RE: Cost Analysis for GTX 460's in SLI - Olin Coles 2010-11-10 22:55
Thanks Don, but I literally JUST added the GTX 460 SLI results a few minutes ago. I may or may not update the value analysis later this week.
 
 
# RE: Cost Analysis for GTX 460's in SLI - Olin Coles 2010-11-10 23:02
How would they perform better? I think you're mistaken, because the $230 EVGA GTX 460 FTW is clocked to 850/1700/1000 MHz, and the $215 MSI HAWK is 810/1620/975 MHz.
 
 
# They perform better when overclocked - Don 2010-11-11 09:44
Sorry I didn't make that clear.

The Hawk and the Cyclone overclock very well, especially the Hawk, as it can easily reach and exceed 900 MHz. But yeah, the EVGA has pretty good clocks as well, as it comes nicely overclocked.

IMO the Hawk is the better-value GTX 460 card right now @ $215.

Excellent job on the review!

Question - Can you confirm whether NVIDIA did in fact get rid of most of their Hyper Compute Performance on the 580?
 
 
# No update on power consumption? - LED Guy 2010-11-11 10:46
You still haven't updated the article, which says:

"The temperatures discussed below are absolute maximum values..."

But they are NOT absolute maximum values, nor is the power consumption. You state that loaded system power with the 580 is 191W and 315W with the 480. But you also say that you tested temperatures and power consumption with Furmark, which throttles the card; this can be checked by running the benchmark test, where you will get a LOT lower performance compared to the 480, for example. ALL other reviews out there state the 580 gets you 15-25% better performance than the 480 for the same or SLIGHTLY less power.

But not 124 watts less. And the difference isn't just down to a lower VID, like ivor said. Test again with another program: Vantage, Crysis, or, like NordicHardware did, try Kombustor with the "Post-FX" setting.

Please update the article as soon as possible, as it is hugely misleading right now.
 
 
# RE: No update on power consumption? - Olin Coles 2010-11-11 13:51
The power section has been updated with details on how we arrived at our results. Essentially, the GeForce GTX 580 hits 246W maximum power while using a modified FurMark tool.
 
 
# gtx460 sli - belzazar 2010-11-14 18:30
I didn't read the whole article, but I read the FPS testing, and to me it seems that GeForce 460 SLI rocked all the other cards, or am I mistaken? I am about to buy myself a new card, and my choice is between the 580 and 460 SLI. It seems that 460 SLI is the best performance and is cheaper, but my question is whether it's possible that the 580 will perform way better after a newer driver, perhaps? Please help me make up my mind; I would really like the newest card, but why buy it when an older one in SLI performs way better?
Thanks for the great articles.
 
 
# RE: gtx460 sli - Olin Coles 2010-11-14 19:52
Hopefully you read enough of the article to realize that the GTX 460's we used had the highest factory overclock available: the EVGA FTW Edition. Two stock GTX 460's in SLI do not beat a GTX 580, and instead compete with the 480.
 
 
# TNX 1680x1050 + settings - Krusher 2010-11-14 22:48
Thanks for doing these tests at 1680x1050 AND showing your settings; most of the reviews I've seen elsewhere only show 1920x1200 and leave out the valuable setup info (so I could not attempt to reproduce it here).
 
 
# RE: TNX 1680x1050 + settings - Olin Coles 2010-11-14 23:17
Back when I was an avid hardware review reader (a little over four years ago), it would always frustrate me to find very little detail on the settings and specifications for each video card tested. I always had to guess at the speeds of a particular product, and whether it was an overclocked version ("XFX 7900 GT" doesn't say a lot). So when I started Benchmark Reviews, I made it a point to ensure our details made it possible to reproduce our results.
 
 
# RE: TNX 1680x1050 + settings - Krusher 2010-11-14 23:58
I have a single eVGA GTX 460 SC (OC'd to the FTW speeds) and halved your SLI numbers to give me a ballpark figure. With 3 days left on my Step-Up option to the 580, your results make it really tempting... if eVGA ever catches up on the huge back order! :)
 
 
# Thank you - Jay Cee 2010-11-15 08:57
Thank you for the review and all your effort. It seems many people tend to post complaints, but those who are satisfied generally never bother to post a thanks.

Will you comment on the noise level? This is a growing concern for me, as I spend more time and money seeking out quieter components. The only reason I didn't purchase a GTX480 was the noise, so I have been making do with a GTX460. The majority of reviews point out that the GTX580 is much quieter, which seems to be a positive move on NVIDIA's part.
 
 
# RE: Thank you - Olin Coles 2010-11-15 09:09
The noise level is nearly silent at idle (as in no audible sound), and under full gaming load it becomes only slightly more audible. If I turn the fan up to full power using Afterburner, it's loud enough to hear but still quieter than a GTX 460. That's right: less noise than a GTX 460. The fan noise on a GTX 580 doesn't even compare to a GTX 480, which seemed to get better later into production.
 
 
# 3d Digital Artist - Keign 2010-11-15 14:35
This card fits my needs perfectly, because not everything takes advantage of SLI; nearly 90% of the graphics-card-intensive programs I use do not support SLI configs. So I will be upgrading my two 295's to a 580 this very day ;). Thanks for the review.
 
 
# yet - Keign 2010-11-15 14:36
And yet it will still probably outperform my 295's in gaming, and it's versatile and usable for every medium I need it for.
 
 
# dsfgatygun 2010-11-16 08:46
Nice review, bad luck you got such horrible clueless reply's going on here.

A 580 = a package ( it is a stable solution for every game there is )
sli/crossfire/cf = raw power + cheaper but a more unstable way of dealing with older games even new top titles like aliens of predator.

While videocard do the same, the different versions of it are ment for different people.


Even if the gtx580 cost more and doesn't provide on every little aspect on the best way, the card = still a far better choise for most people
then any sli/crossfire/cf unstable solution.

The 580 = a perfect naming for this product. its 20-25% faster / lesser usage of watt / lesser heat / lesser noise.

2x 460's or 2x other budget cards or x2 solutions are great if the card actually produces 200% faster speeds then the 1 cored version. Which it clearly isn't doing. Its even below the 580 or with a minor fps above it. ( on only "newer games")
Report Comment
 
 
# Video Card Prices - Olin Coles 2010-11-21 13:50
After the GeForce GTX 580 launch, I felt a little irritated to see only one model sell for $500 while all of the others sold for $520 or more. This throws off my Cost Analysis math on the very first day. Making matters worse, AMD's Radeon HD 5970 could be found in a few places for $500 before launch, but now costs no less than $550 or (much) more.

This game of sinking prices just ahead of launch is getting old, and it ruins my cost analysis every time. So, visitors, keep this in mind when you read these reviews, and understand that prices change on a daily (and sometimes hourly) basis.
 
 
# Great review - 12Xu 2010-12-09 09:27
Nice job; thanks for the thorough review. Glad to see NVIDIA coming back around after such a long drought.
 
 
# me - abla 2010-12-14 06:03
The only NVIDIA card that has ever interested me is the GTX460, in that I can buy two of them for the price of one GTX480 or GTX580 and get more performance.

Otherwise, I have become disgruntled with the swift obsolescence of today's technology.
 

Comments have been disabled by the administrator.
