
NVIDIA GeForce GTX 680 Graphics Performance
Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Thursday, 22 March 2012
Table of Contents: Page Index
NVIDIA GeForce GTX 680 Graphics Performance
First Look: GeForce GTX 680
NVIDIA Kepler GPU Details
Video Card Testing Methodology
DX10: Crysis Warhead
DX11: 3DMark11
DX11: Aliens vs Predator
DX11: Batman Arkham City
DX11: Battlefield 3
DX11: Gugila GroundWiz RTS
DX11: Lost Planet 2
DX11: Metro 2033
DX11: Unigine Heaven 3.0
Temperature and Power Consumption
GeForce GTX 680 Conclusion

GeForce GTX 680 Conclusion

IMPORTANT: Although the rating and final score mentioned in this conclusion are made to be as objective as possible, please be advised that every author perceives these factors differently at various points in time. While we each do our best to ensure that all aspects of the product are considered, there are oftentimes unforeseen market conditions and manufacturer changes which occur after publication that could render our rating obsolete. Please do not base any purchase solely on our conclusion, as it represents our product rating specifically for the product tested, which may differ from future versions of the same product. Benchmark Reviews begins our conclusion with a short summary for each of the areas that we rate.

NVIDIA has designed its latest GPU with several goals: operate faster, offer more features, deliver more functionality, use less energy, and generate less heat. These days, consumers generally react favorably to any product that can deliver impressive performance gains over competing alternatives, so NVIDIA's rather large shopping list of goals could serve the company very well in the marketplace... especially since it delivered beyond most expectations. There will still be multi-GPU graphics cards to contend with, but as far as single-GPU solutions go, the GeForce GTX 680 captures star status as the best graphics card available on the market.

Fanboys often argue one brand against another based on personal attachment, but as an industry critic it's difficult to avoid agreement when our tests prove NVIDIA video cards offer a better total graphics solution than the closest competition. As of this launch, that competition comes in the shape of AMD's Radeon HD 7970, a video card that costs $50 more, consumes more electricity, produces more heat, and trails in frame rate performance. After running each video card through fifteen different benchmark tests, the FPS results almost always favored NVIDIA's GeForce GTX 680. Let's look at the breakdown:

In the DirectX 10 game Crysis Warhead, the GeForce GTX 680 and Radeon HD 7970 appear even at 1680x1050 resolution, but once the strain of 1920x1080 is added the GTX 680 pulls ahead by 7 FPS. DirectX 11 tests followed the trend, with the GeForce GTX 680 leading significantly in most of them. In one of the few exceptions, Aliens vs Predator gave a noteworthy lead to AMD Radeon products over their NVIDIA counterparts. The demanding DX11 graphics of Batman: Arkham City made use of Kepler's optimized architecture, delivering a staggering lead to the GeForce GTX 680 over every other graphics card tested. Battlefield 3 continued the run, pushing the stock GTX 680 more than 10 FPS beyond the Radeon HD 7970. Lost Planet 2 played well on all graphics cards when set to high quality with 4x AA, yet the GeForce GTX 680 still surpassed Radeon HD 7970 performance by 12 FPS before an overclock sent it another 10 FPS higher. Metro 2033 is another demanding game that requires high-end graphics to enjoy quality settings, but like AvP this game really took to the Radeon HD 7970 and helped push it 4-6 FPS ahead of the GTX 680.

Synthetic benchmark tools offered a similar read on these products, mirroring some of the results seen from our video game tests. Futuremark's 3DMark11 benchmark suite strained our high-end graphics cards with only mid-level settings displayed at 720p, forcing the less-powerful Radeon HD 7970 to trail the GeForce GTX 680 by nearly 10 FPS. Then there was the Gugila GroundWiz RTS Demo, which uses the Alpine scene to cripple graphics cards... and cripple it did: this benchmark is so demanding that we had to run tests at 1280x720 just to get somewhat decent frame rate results. NVIDIA's GeForce GTX 680 did extremely well, but it's no contest when the only card that fails the test is your competition's flagship model. Unfortunately, AMD did not consider this issue worthy of a response, even though I reported it nearly a week prior to publication. Finally, the Unigine Heaven 3.0 benchmark confirmed what we've seen in most other tests: NVIDIA's GeForce GTX 680 leading the AMD Radeon HD 7970 in stock form, and then leaping well past it once overclocked to maximum GPU Boost.


Appearance is a much more subjective matter, especially since this particular rating doesn't have any quantitative benchmark scores to fall back on. NVIDIA's GeForce GTX series has traditionally used a recognizable design over the past two years, and with the exception of more angular corners, the GTX 680 looks very similar to the GTX 580 and 570 models. Some add-in card partners may offer their own unique cooling solution design, but this might not happen with the GeForce GTX 680, since it operates so efficiently and allows nearly all of the heated air to exhaust outside of the computer case. Expect most partners to dress up the original reference design by placing exciting graphics over the fan shroud or using colored plastic components. While looks might mean a lot to some consumers, keep in mind that this product outperforms the competition while generating much less heat and producing very little noise.

Construction is the one area in which NVIDIA continually shines, and thanks in part to extremely quiet operation paired with more efficient cores that consume less energy and emit less heat, I'm confident that the GeForce GTX 680 will continue this tradition. Reducing the flagship model to two 6-pin PCI-E power connections is a step in the right direction, while tweaking heatsink and fan placement to optimize cooling performance proves there are still ways to improve on a commonplace technology. Better yet, consumers now have a single-GPU solution capable of driving three monitors in 3D Vision Surround, thanks to the inclusion of two DL-DVI ports with supplementary HDMI and DisplayPort outputs.

Defining value in the premium-priced high-end segment isn't easy, because hardware enthusiasts know that they're going to pay top dollar to own the top product. Still, rating value is like chasing a fast-moving target, so please believe me when I say that prices change by the minute in this industry. The premium-priced GeForce GTX 680 "Kepler" graphics card demonstrates NVIDIA's ability to innovate the graphics segment while maintaining a firm lead in its market, but it comes at a cost. As of launch day, 22 March 2012, the GeForce GTX 680 has been assigned a $499 MSRP. For those with an impeccable memory, back in November 2010 the GeForce GTX 580 also launched with the exact same $499 MSRP (and it is still available at Newegg for around $400). So with regard to value, the GeForce GTX 680 delivers more features and better performance than the less-powerful AMD Radeon HD 7970 that currently sells for $550, but matches frame rate performance while costing slightly more than the older, less efficient GTX 590. To compare one card's value to another based solely on video frame rate is a fool's game, because features and functionality run off the chart with the GTX 680. Furthermore, only one video card can offer multi-display 3D gaming, PhysX technology, GPU Boost, FXAA, and now TXAA.

The GeForce GTX 680 is the ultimate enthusiast graphics card intended for affluent top-end gamers, but I see this product becoming so popular that it draws more interest than previous flagship models. Our test sample took the standard 1006/1058 MHz GPU clock and easily reached a 1187/1240 MHz overclock without any additional voltage. Add this to the record-setting 6.0 GHz GDDR5 memory clock (which we also overclocked to 6.4 GHz), and vSync on everything becomes a possibility... especially with NVIDIA Adaptive VSync now available to smooth the frame rate gaps. Using just one GeForce GTX 680 video card is enough to display millions of pixels at the speed of light, so imagine the graphics quality settings possible with two combined into an SLI set.

Overall I'm quite impressed with the NVIDIA GeForce GTX 680, but it's the 28nm GK104 'Kepler' GPU that really has my attention. This article has covered many of the new product features and added functionality possible through Kepler, but imagine beyond the GTX 680. By reducing the TDP footprint to an easily manageable 175W operating range, it won't take much effort to combine two of these GPUs into the yet-to-be-announced GeForce GTX 690. I can picture it now: 4GB of GDDR5 video frame buffer memory pushed to 6.0 GHz, combined with two Kepler GPUs operating at 880 MHz before GPU Boost... and it would still run cool and quiet with a combined 300W TDP. Give it a few months, and we'll see how accurate my prediction was. EDITOR'S NOTE: As it turns out, I was extremely close: NVIDIA GeForce GTX 690 Video Card Features

So what do you think of the NVIDIA GeForce GTX 680 Kepler graphics card, and do you plan to buy one?




# RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
danwat1234 2012-03-22 07:43
I hope all of you buying these high-end GPUs put them to good use when you're not gaming! Folding@home or another distributed computing project: do your part to help science!
Peace out

# RE: RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
bob 2012-03-23 02:05
I hope you are joking. These GPUs are bad for GPGPU work like BOINC. The 7970 is great!

# RE: RE: RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
Olin Coles 2012-03-23 08:37
Based on what factual evidence? I notice the 7970 couldn't even run a DirectX 11 tessellation test, so how will it compete in GPGPU tests?

# On what Evidence! Are You Blind, Deaf, Stupid & Insane?
tophat killer 2012-03-27 09:55
Read any, and I mean any, DirectCompute review of Kepler and Tahiti: Tahiti beats GK104 by 10 to over 500%. NVIDIA castrated DirectCompute in GK104.
# Is there going to be a difference?
Christopher Fields 2012-03-22 08:04
I have 2 GTX 580's in SLI and they are water cooled. After months of rumors you guys seem to have cleared things up. I read things like "3 times more powerful than a GTX 580 OC" to "the card will be 8-9 inches long".

I guess my question is this: my monitor is a 60Hz LED 32" 1920x1080 with a 5ms response time. I play Battlefield 3 100% of the time. If I buy this card, will my experience be any different?

I can probably answer this one myself: Nvidia makes great products, and the nice thing is that when you buy a top-end card you can usually skip a generation and wait for the next knockout contender. I would say if you have a GTX 480 series card, upgrade. But if you have a single GTX 580, then put on the brakes and wait for the next show.

This of course is just my opinion. This is a great article; I always like reading your stuff, Olin. Thanks for the time and effort you guys put into these reviews.

# RE: Is there going to be a difference?
Olin Coles 2012-03-22 08:27
As a BF3 player myself (Das Capitolin), I can say that you're not going to SEE a significant difference between an overclocked GTX 580 and the new GTX 680. That being said, Adaptive VSync is something worth considering, as are the extremely low sound levels and heat output. Since you've already purchased the 580's, I see no solid reason to upgrade. But if someone's deciding between the two, the GTX 680 easily wins my vote.

# RE: RE: Is there going to be a difference?
Christopher Fields 2012-03-22 08:49
I added you Olin, S1W3A3T7
# Very good POW
ReSeRe 2012-03-22 12:35
I'm talking about skipping one GPU generation (and usually CPU too). Yes, that's my philosophy as well.
Right now I'm playing BF3 with 460 1GB SLI on a (pretty old but excellent) EIZO 1600x1200 LCD at high settings (not ultra). Actually I could go ultra, BUT with a drastic tweak down of AA and one, maybe two, other settings.
In BF3 my nick: mantuitoru.

Once again, good review.
Long live the competition, anyway. In this case nVidia vs AMD.
# gk110
godrilla 2012-03-22 08:43
I'm going to wait for the GK110 chip, which according to Fudzilla will be out by July. Currently using a Classified GTX 580 Ultra.
# RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
roy 2012-03-22 10:04
Good card and excellent review! I love the low power usage and the new tech (TXAA, adaptive v-sync, 28nm, GPU Boost, etc). A little concerned by the 256-bit memory bus.
I wanna upgrade, but until Nvidia (must have PhysX) releases a single-GPU card that can run Metro 2033 at max settings (4xAA, 16xAF, very high quality, advanced DOF on, and PhysX on) and average 60 FPS at 1920x1200, I guess I am stuck with my GTX 480 SLI for a long while.
# RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
claydough 2012-03-22 13:34
At 3x the CUDA cores of the card that was supposed to replace the 560 Ti (and was supposed to be released first), don't these benchmarks seem not to reflect the numbers of that affordable version? :(

The rumor that this was the card meant to replace the 560 Ti, but that its performance was so good they just gave it the 680 designation, is very unsettling...

What happened to the consumer-friendly mid-powered card that was supposed to be released first?

What the heck happened to the Kepler super-powered card?

Better than 480-to-580, but is this just another incremental release, or hardware without drivers?

Or worse yet...
Did we just get version screwed?

# RE: RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
Olin Coles 2012-03-22 16:16
I don't recall NVIDIA ever making a statement on Kepler, its roadmap, or what it was intended to replace. In other words, consider the source.
# RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
claydough 2012-03-22 13:50
Maybe the rumors of a GTX 780 were true?
If this is the replacement for the GTX 560 Ti..
and the GTX 780 is coming out at the end of this year...
Then that would be fine with me!
If the following GTX 780 benchmarks are real:

Like I said in the next-gen console/Xbox 720 article:
Anything without the benefit of present-day Maxwell development is sure to lead to another very console-limiting 10-year cycle!

# This..
luay 2012-03-23 10:26
AMD replaced their mid-range product names with the higher end when they released their 6000 series, so this is not the first or even second time it has been done. This is a 560 Ti replacement, and it will be marketed to death as a high-end product now and mid-range after July, with a price reduction when the 580 replacement comes around.
If this card sells well, console makers should take note and try to cut a deal with Nvidia instead of AMD. I think this is why MS and Sony were not publicizing any future commitments to delivering next-gen consoles..
# RE: NVIDIA GeForce GTX 680 Kepler Video Card Performance
Doug Dallam 2012-03-22 14:40
NVIDIA GeForce GTX 680 Reference Design

Load: 243 W

Also consider, extrapolating from your price comparison of other nVidia top-end offerings, that the GTX 295 also came in at 500+ USD. That's pretty incredible.
# awesome
cube 2012-03-22 19:50
If I buy this card, it means I will have to buy a new motherboard, LGA2011, which means I'll have to buy a new CPU and, of course, the newest RAM. So all in all, the only things I get to keep are my SSD and HDD. At the same time, I will also get a brand new case. So... the games I currently play and will play in the future don't require this much power.

This is what happens when devs dumb down the #ing games for consoles. People like me that used to upgrade every other cycle are now holding off and keeping their systems a lot longer. Means less PC part sales. =/ Hope # will get better for us.

# RE: awesome
David Ramsey 2012-03-22 19:54
You don't have to buy an LGA2011 motherboard to use this card; it will work fine in any PCI-E x16 slot in older motherboards.

# RE: RE: awesome
cube 2012-03-22 19:58
But they must have PCI Express 3.0, right?

# RE: RE: RE: awesome
Olin Coles 2012-03-22 20:02
Just like PCI-E 2.0 is backward compatible with PCI-E x16, PCI-E 3.0 is backward compatible with 2.0 and x16 as well.

# RE: RE: RE: RE: awesome
cube 2012-03-22 20:03
Yeah, but what about later on when games do take advantage of 3.0, if ever... then I'll have to buy a new mobo for that.

# RE: RE: RE: RE: RE: awesome
David Ramsey 2012-03-22 20:08
That's a long way off in the future. Current games don't come anywhere near saturating a PCI-E 2.0 x16 slot.

# RE: RE: RE: RE: RE: RE: awesome
cube 2012-03-22 20:34
Can you please go to the forums under hardware? I posted a question and need some help.

# RE: RE: RE: RE: RE: RE: awesome
claydough 2012-03-26 13:30
I like to emphasize that the only factor keeping any hardware pipe from melting away from saturation is compromise and economics. The artists and imagination exist today for any game that is possible with tech at the end of the 21st century. Creativity is both CPU and GPU limited!

The game that is possible in your lifetime is only limited by the rate of hardware acceptance. (Keep buying until cinematic levels of hyper-real realtime rendering are possible... after that, wait until holograms become a reasonable reality?)

# RE: RE: RE: RE: RE: RE: RE: awesome
Christopher Fields 2012-03-26 13:35
WTF? Holograms?..................I think we are a way off from gaming on holograms, lol. 3D gaming has just barely been breaking the surface there, Spock, lol. But it would be cool. PS beware of SUPER NERDS!!!!
# MrSteven 2012-03-24 14:50
I find it hard to believe that this is the replacement for the GTX 560 Ti when it costs 430. Ouch! Budget card my ass!

# Bought 2: Selling Both
kzinti1 2012-03-27 23:02
I realized that this is merely a replacement for mid-range cards.
Even though it beats the best of the 5xx series, it's still only a mid-range card.
The REAL high performers will be out in a couple of months.
Nvidia's hype-machine got me once again.
I should know better by now.
# Adam
Adam 2012-04-01 16:03
I have one of these on order. Had 2 GTX 480s in SLI. This one card should easily beat them. My GTX 480s ran under water @ 900/1800/1848. They would out-bench a stock 580, as they were the same GPU, just tweaked a little.

This is a whole NEW GPU, like Fermi was. The 670 has already been announced and waterblock manufacturers are shipping the blocks. Expect their next release to be a lower version. Also, laptops are shipping with mobile versions. Check nVidia's homepage.

The next release will be the 780. Time range: anything from 6-12 months. AMD could influence this.

I have my waterblock on order. I will also be SLI'ing them from about 2 months in.

# RE: NVIDIA GeForce GTX 680 Video Card Performance
stelios 2012-05-14 21:56
I did a test to see where I'm at with those settings. I got 46.7 FPS with my GTX 580 @ 962 MHz; compared to 39 FPS that's a 20% increase in performance. The GTX 680 gets 56 FPS, which compared to a stock GTX 580 is a 41% increase in performance, but compared to my overclocked GTX 580 it's a 22-23% performance increase. Pretty good if you ask me.
# RE: NVIDIA GeForce GTX 680 Graphics Performance
ramin 2012-11-11 07:42
NVIDIA's newest flagship card (680) is superior to the HD 7970 in almost every way. Whether it is performance, power consumption, noise, features, price or launch day availability, it currently owns the road and won't be looking over its shoulder for some time to come.

# RE: RE: NVIDIA GeForce GTX 680 Graphics Performance
David Ramsey 2012-11-11 09:38
Hm, no, not necessarily. The 7970 is superior in some games (as these benchmarks show), and NVIDIA still can't match the 7970's lower power usage, especially in multi-card setups where Radeons not being used are all but disabled (sub-5W power consumption, fan stopped, etc.) when they're not needed (i.e. if you're not playing a game).

Still, I'd agree GTX 680 performance is better overall, and they still own the world in GPGPU.

