Reviews - Featured Reviews: Video Cards
Written by Olin Coles   
Thursday, 03 May 2012
Table of Contents: Page Index
NVIDIA GeForce GTX 690 Benchmark Performance
DX10: Crysis Warhead
DX11: 3DMark11
DX11: Aliens vs Predator
DX11: Batman Arkham City
DX11: Battlefield 3
DX11: Gugila GroundWiz RTS
DX11: Lost Planet 2
DX11: Metro 2033
DX11: Unigine Heaven 3.0
Temperature and Power Consumption
NVIDIA Kepler Overclocking
GeForce GTX 690 Conclusion

NVIDIA GeForce GTX 690 Dual-Kepler GPU Video Card Performance

Benchmark Reviews tests performance for the world's most powerful graphics solution

Manufacturer: NVIDIA
Product Name: GeForce GTX 690
Suggested Retail Price: $999.99 (MSRP)

Full Disclosure: The product sample used in this article has been provided by NVIDIA.

NOTE: Benchmark Reviews has published our NVIDIA GeForce GTX 690 Features Overview in a separate article.

Back on 22 March 2012, the NVIDIA GeForce GTX 680 video card made headlines and became the best-performing single-GPU graphics card on the market. Only six weeks later, NVIDIA engineers have combined two 28nm GK104 GPUs to create their new GeForce GTX 690. In this article Benchmark Reviews tests game performance with the NVIDIA GeForce GTX 690, a double-slot graphics card equipped with a pair of Kepler GPUs. Featuring NVIDIA's cutting-edge GPU Boost technology, the GeForce GTX 690 video card can dynamically adjust power and clock speeds based on real-time application demands. Using EVGA Precision-X, the GeForce GTX 690 has both GPUs overclocked beyond 1200 MHz to produce ultimate graphical performance in PC video games.

NVIDIA targets top-end enthusiasts with their ultra-premium GeForce GTX 690 discrete graphics card, a product aimed squarely at the most affluent PC gamers. In order to best illustrate the GTX 690's dual-GPU performance, we use the most demanding PC video game titles and benchmark applications available. Video frame rate performance is tested against a large collection of competing desktop graphics products, such as the AMD Radeon HD 7970 (Tahiti). Crysis Warhead compares DirectX 10 performance levels, joined by newer DirectX 11 benchmarks such as 3DMark11, Batman: Arkham City, Battlefield 3, and Unigine Heaven 3.0.


VGA Testing Methodology

The Microsoft DirectX-11 graphics API is native to the Microsoft Windows 7 Operating System, which serves as the primary O/S for our test platform. DX11 is also available as a Microsoft Update for the Windows Vista O/S, so our test results apply to both versions of the Operating System. The majority of benchmark tests used in this article measure DX11 performance; however, some high-demand DX10 tests have also been included.

Each benchmark test begins with one 'cache run', followed by five recorded test runs. Results are collected at each setting, with the highest and lowest results discarded. The remaining three results are averaged and displayed in the performance charts on the following pages.
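The trimmed-average step described above can be sketched in a few lines of Python. This is a hypothetical helper written for illustration, not Benchmark Reviews' actual test tooling:

```python
# Sketch of the scoring method: one discarded cache run, five recorded
# runs, drop the highest and lowest, then average the remaining three.

def average_fps(recorded_runs):
    """Average the middle results after discarding the high and low runs."""
    if len(recorded_runs) != 5:
        raise ValueError("expected five recorded runs")
    trimmed = sorted(recorded_runs)[1:-1]  # drop lowest and highest
    return sum(trimmed) / len(trimmed)

# Example: five recorded FPS results from one benchmark setting.
runs = [118.2, 121.7, 119.5, 130.4, 120.1]
print(round(average_fps(runs), 1))  # averages 119.5, 120.1, 121.7 -> 120.4
```

Discarding the outliers this way keeps one anomalously fast or slow run (a background task, a shader-compile hitch) from skewing the charted average.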

A combination of synthetic and video game benchmark tests have been used in this article to illustrate relative performance among graphics solutions. Our benchmark frame rate results are not intended to represent real-world graphics performance, as that experience would change based on supporting hardware and the perception of the individual playing the video game.

Intel X79 Express Test System

DirectX-10 Benchmark Applications

  • Crysis Warhead v1.1 with HOC Benchmark
    • Settings: Airfield Demo, Very High Quality, 4x AA, 16x AF

DirectX-11 Benchmark Applications

  • 3DMark11 Professional Edition by Futuremark
    • Settings: Performance Level Preset, 1280x720, 1x AA, Trilinear Filtering, Tessellation Level 5
  • Aliens vs Predator Benchmark 1.0
    • Settings: Very High Quality, 4x AA, 16x AF, SSAO, Tessellation, Advanced Shadows
  • Batman: Arkham City
    • Settings: 8x AA, 16x AF, MVSS+HBAO, High Tessellation, Extreme Detail, PhysX Disabled
  • Battlefield 3
    • Settings: Ultra Graphics Quality, FOV 90, 180-second Fraps Scene
  • Gugila GroundWiz RTS 2.1 Demo: Alpine
    • Settings: DirectX 11 Renderer, 1280x720p Resolution, Tessellation Normal, Shadow Mapping 1024, CPU 1t, 60-Second Duration
  • Lost Planet 2 Benchmark 1.0
    • Settings: Benchmark B, 4x AA, Blur Off, High Shadow Detail, High Texture, High Render, High DirectX 11 Features
  • Metro 2033 Benchmark
    • Settings: Very-High Quality, 4x AA, 16x AF, Tessellation, PhysX Disabled
  • Unigine Heaven Benchmark 3.0
    • Settings: DirectX 11, High Quality, Extreme Tessellation, 16x AF, 4x AA

PCI-Express Graphics Cards

Graphics Card   | GPU Cores  | Core Clock (MHz) | Shader Clock (MHz)   | Memory Clock (MHz) | Memory Amount | Memory Interface
GeForce GTX 570 | 480        | 732              | 1464                 | 950                | 1280MB GDDR5  | 320-bit
Radeon HD 6970  | 1536       | 880              | N/A                  | 1375               | 2048MB GDDR5  | 256-bit
GeForce GTX 580 | 512        | 772              | 1544                 | 1002               | 1536MB GDDR5  | 384-bit
Radeon HD 7970  | 2048       | 925              | N/A                  | 1375               | 3072MB GDDR5  | 384-bit
GeForce GTX 680 | 1536       | 1006 (1187 OC)   | Boost 1058 (1240 OC) | 1502 (1600 OC)     | 2048MB GDDR5  | 256-bit
Radeon HD 6990  | 3072 Total | 830/880          | N/A                  | 1250               | 4096MB GDDR5  | 256-bit
GeForce GTX 590 | 1024       | 608              | 1215                 | 854                | 3072MB GDDR5  | 384-bit
GeForce GTX 690 | 3072       | 915 (1053 OC)    | Boost 1020 (1215 OC) | 1502 (1601 OC)     | 4096MB GDDR5  | 256-bit
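The memory clock and interface width together determine each card's peak memory bandwidth. A back-of-the-envelope sketch in Python, assuming GDDR5's effective data rate is four times the listed memory clock (`gddr5_bandwidth_gbps` is a hypothetical helper, not part of any benchmark tool):

```python
# Peak memory bandwidth estimate from a card's listed GDDR5 clock and bus width.

def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s: effective transfer rate times bus width in bytes."""
    effective_mts = mem_clock_mhz * 4        # GDDR5 moves 4 bits per pin per clock
    bus_bytes = bus_width_bits / 8           # bus width in bytes
    return effective_mts * bus_bytes / 1000  # MB/s -> GB/s

# GeForce GTX 690: 1502 MHz on a 256-bit bus, per GPU.
print(round(gddr5_bandwidth_gbps(1502, 256), 1))  # ~192.3 GB/s per GPU
```

By this estimate the GTX 690's 256-bit bus delivers roughly 192 GB/s per GPU, while the GTX 580's wider 384-bit bus at 1002 MHz also lands near 192 GB/s, which is why bus width alone doesn't tell the whole story.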



# Gtx 690 (ahmed alqallaf, 2012-05-03 06:31)
i think is not worth it because if i have 1000 bucks i would buy two gtx 680 and am really sure that 680sli surpasses the 690 espicailly when you over clock it. and that is that. 680 FTW

# right dude (leignheart, 2012-05-03 08:51)
well you can think 2 680s are better all ya want but atleast it wont put out twice the heat and require more power, and also if i liquid cool, i dont have to spend 300 on waterblocks for 2 680s. and also like always, everything that involves dual gpus are better, the waterblocks, the perks. everything. not saying sli 680s arent good, im just saying they arent a better solution to a 690. you just probably already own one and want to feel as if you have the better buy.

# Seriously? (Visara, 2012-05-03 10:40)
You're concerned with power? Who gives a flying crap how much power it uses? If you have a decent PSU, you're fine.
GTX 680s in SLI completely destroy a single 690.

# RE: Seriously? (Geoff, 2012-05-04 00:04)
The 690 actually manages to hold up quite well when compared with 2 680's its usually only off by 2 fps and such so in my opinion having a single gtx 690 is better taking all things into consideration including the power draw and heat etc; still for multi monitor gaming i would go with 2 680's over a single 690.. or if i had the money, 2 690's

# RE: Seriously? (leignheart, 2012-05-04 03:12)
im not worried about my power supply, im worried about my electric bill. and your completely out of your mind if you honestly think 2 680s completely destroy a 690, that obviously shows you didnt read the benchmarks and know not what you say.

# RE: Gtx 690 (Erta, 2012-05-03 11:04)
its worth it..

saves you space on you motherboard.

price of 2x gtx 680 is same as 1x gtx 690.

gtx 680 2x power consumption.

SLi/crossfire, 2x gpu setup delays because they need to communicate between each other first before making computations

everyone knows that SLi or crossfiring 2 gpu doesnt give you 2 times performance. some games doesnt fully support SLi usually you have to wait for game updates. You also need to wait gpu driver updates to make sli/crossfire work efficiently on new games especially on new release gpu cards

# RE: RE: Gtx 690 (Agent X, 2012-05-03 15:03)
How do you think a dual chip card works? It's still 2 cards in SLI they are just on the same pcb or card assembly. Why do you think there is an onbaord sli chip?

# RE: RE: RE: Gtx 690 (Erta, 2012-05-04 11:10)
@Agent X try to understand my comment clearly I was replying to the guy whos comparing sli 2x gtx680. i wasnt talking about sli on gtx 690

which by the way an sli of gtx680 needs to be plugged in two pcie which in many cases causes bottleneck because some mobo has diffrent pcie gen2, gen3 and sometimes even if on 2x pcie gen3 sli some mobo will run it as gen2 which is the bottle.. tooo much hassle..

a single gtx690 = on a single pcie running at gen3
# RE: Gtx 690 (Shane, 2012-05-04 03:36)
I think it is more than worth it. There is more power and heat from 680 SLI setup than a single 690! It also matches 680 SLI in performance is quieter as well. And looks way sexier with the LED lit logo, perfect for windowed cases. I currently run 480's in SLI and this 690 blows it away in all regards, I think these space heaters of mine are about to be replaced with a 690! ;)

# Get real. (mat, 2012-05-04 20:07)
Go buy a 680 and consider yourself "cool" for the post.

I'll buy a 690, and in less than a year I'll buy one to SLI for half the price. They've already stated that it'll clock over 1,100 on stock cooling, so as far as you being "really sure", i'm not sure what you're smoking.

I had the 7950 gx2 back in the day and it ran as well as the 8800, and it was cheaper, minus dx10. A couple months later I bought another for half the price because i HAD ROOM. It lasted another 3-4 years, and still kicked ass.

If you're going to buy anything and they're the same price, why would you buy 2 that have less performance, instead of one at the same price using the same chips and more room to expand?

I'm not sure of your reasoning on this, it makes no sense.

# Exactly the same (gajbooks, 2012-09-21 06:41)
The 690 is two 680s in one case, it's exactly the same.

# owner (godrilla, 2012-09-21 07:23)
True but uses 100 watts less.

# gtx 690 vs 2 680 (Justin, 2012-10-23 14:10)
2 680s = a little bit more power and 200$ more money u have to spend. and more power usage. So if your getting 2 680s then your wasting money and may as well just get 1 690, Its not worth the extra money to get 2 680s

# Wrong (Chris Whatsitooya, 2012-11-20 04:33)
Well you spend MORE buying two 680's. 1250 = 999.99 I'd go with the 690, due to better power efficiency and less heat. Think before you post.
And don't correct me on price, that is the price for two, actually useful 680's.

# RE: Wrong (Olin Coles, 2012-11-20 08:02)
Prices have changed in the seven months since this article was published, and they were different a month ago when the comment you responded to was published. Also, consider this your warning and drop the attitude.

# GTX 690 (Seearex, 2012-05-03 06:54)
Almost brought a 680 the other day but held off. Glad I did now. This will be my next card. The results of this review justify the price tag. Great review and I wouldnt mind seeing the results of how two of these cards perform in sli.
# RE: NVIDIA GeForce GTX 690 Benchmark Performance (Sardaukai, 2012-05-03 07:11)
The GTX 690 is great. Maybe youre able to overclock two 680 more but why? Only for a few % more? There is no need to do atm....

Ive read an article about GeForce GTX 680 3-way SLI an that didnt impress me much... i think an 690 or an 2-way SLI of 680 is a gamers dream for the next time. Maybe NVIDIA has some driver work to do for more power with more cards, dunno....

But hey maybe they will benchmark an GTX 690 Quad SLI Pack here.... that would be more than great....

# RE: NVIDIA GeForce GTX 690 Benchmark Performance (Daniel, 2012-05-03 08:22)
Too bad they focus on being the king showing off they have the fastest card. That's not want people want to buy. I mean yeah maybe a minority but they should focus on mid-range where most of us are all waiting to update!

# RE: RE: NVIDIA GeForce GTX 690 Benchmark Performance (leignheart, 2012-05-03 08:55)
sorry dude, i love mid range because i want everyone to enjoy games at what they can afford, but mid range doesnt push technology. only cards like the 690 and 680 and 590s and 6990's push technology forward. i love mid range gpus, but the high end stuff will forever be where its at. that said, i hope they release a great 670 or 660 for ya to enjoy.

# Fastest card (mat, 2012-05-05 12:46)
They really don't focus on it, but people with money want the greatest card out there, regardless of price. There's a flagship product in every manufacturer's garage, regardless of what they make. This card it top dog right now, but it keeps competition from getting stale and makes it better for us in the long run. All the lower-end cards will now make a drop cheaper, and if somone ups the ante, they'll drop the price on this too. It's good... not "too bad".

# RE: NVIDIA GeForce GTX 690 Benchmark Performance (Andreas, 2012-05-03 08:34)
My current videocard is a GTX295 and I would love to upgrade, but I have an Intel i7 920 cpu.

Would the cpu hold the GTX690 or the GTX680 back too much to be worth the investment, or will a GTX680 or GTX690 be able to deliver nevertheless?

# RE: RE: NVIDIA GeForce GTX 690 Benchmark Performance (Olin Coles, 2012-05-03 08:42)
It would deliver nevertheless. CPU-bound games are rare, and the GPU can easily compensate for most tasks... especially video games and transcoding.

# RE: RE: RE: NVIDIA GeForce GTX 690 Benchmark Performance (Andreas, 2012-05-03 09:20)
Thanks for the reply Olin Coles.

You have somewhat put my worries to rest.

I have been thinking of upgrading since GTA V was announced. My GTX 295 can handle all current demanding games with surprising ease, but it is at its limits when I want to push the GTA IV graphics. I hope the GTA V engine is better optimized for PC.

I will try to hold back my eagerness to buy a new video card until I feel it is really necessary.

# RE: RE: NVIDIA GeForce GTX 690 Benchmark Performance (leignheart, 2012-05-03 08:56)
you should be good for sure, but if i were you i would be saving some cash and get ready for ivy bridge or the next cpu after ivy bridge.

# Power usage? (Chris, 2012-05-03 09:13)
Am I crazy or did I miss the power consumption of this baby in the VGA Product Description list?

# RE: Power usage? (Olin Coles, 2012-05-03 09:20)
It's there, but it's hard to find with so many others in the table with it. Look for 'NVIDIA GeForce GTX 690 Reference Design'

# 25/321 watts (godrilla, 2012-05-03 10:28)
Sorry I see it wow that's amazing, my old gtx 480 sc in sli are 1 st in the lead, thank God I sold them.
# pci e 3 vs 2 test? for dual gpu setup (godrilla, 2012-05-03 09:33)
How come no pci e 3 vs 2 test for dual gpu setup?
And I didn't see the power usage idle/load on power chart.

# RE: pci e 3 vs 2 test? for dual gpu setup (Olin Coles, 2012-05-03 10:05)
It's there - see comment above.

This isn't a motherboard review, so there's no point in testing PCI-E 2 vs 3. If someone has $1000 to burn on a video card, they're not using it on old hardware.

# opinion (godrilla, 2012-05-03 10:18)
I have a westmere setup, PCI e. 2.0 on rampage 3 extreme mobo. I and most others who can afford a $1000 Gpu will want to know if there is a benefit to upgrade when most games are Gpu bound. I'm personally waiting for haswell toc out in a few quarters.

# Missing Pros/Cons/Scores? (WhyNotV2, 2012-05-03 09:48)
The things I could do with $1000...or buy THE most bad-ass video card currently out. I wonder how well pogo games will run on it? ;)

Seriously, good stuff, but did I miss the usual scoring in the conclusion section or is it just not present?

# RE: Missing Pros/Cons/Scores? (Olin Coles, 2012-05-03 10:06)
It's invisible. The card received a final score of one-trillion.

# RE: RE: Missing Pros/Cons/Scores? (WhyNotV2, 2012-05-04 04:01)
Nice!!!!!!!!!!!!!!! :)

# price (godrilla, 2012-05-03 10:39)
I'm thinking the price might drop once 7990 shows, depending on the pric and performance. But not by a lot if anything, just like the article concludes 2 7970 in crossfire will not beat the gtx 690 so the 7990 will probably be the same. Wishful thinking.

# Waste of money (Visara, 2012-05-03 10:46)
A single 680 gets you over 75 FPS on Arkham City. The 690 give you 127. But it makes no difference because human eyes are INCAPABLE of processing anything about 60 FPS. It could give you 1,000,000,000 FPS and it would look exactly the same at 60. Not to mention, most monitors run at 60hz anyway, so they're incapable of displaying anything about that. And if you have a 120hz monitor, you wasted you money because like I said, your eyes can't tell the difference between the two.

Sure, if you're running a multiple monitor setup, higher FPS is what you want, but STILL, dual 680s in SLI is way better than a single 690.

# RE: Waste of money (Olin Coles, 2012-05-03 11:37)
That's the average frame rate, not the minimum. My tests show that the minimum dropped to 13-28 in five tests. Having played Batman, I can tell you that there are points in the game where FPS drops and get you killed.

Also, I disagree that SLI 680s is better than one GTX 690. It's your money, so buy what you want, but there's no 'right' answer.

# RE: Waste of money (jules, 2012-05-04 02:57)
You sir are incorrect, you've obviously never used a 120Hz monitor before. When there's a lot of fast moving images on screen, hell even just looking around using the mouse in a game, there is definitely an increase in smoothness per frames with a 120Hz, the motion just looks far more fluid. It's like going from 30-40Hz to 60Hz. Even just moving the mouse curser quickly on screen your eyes can detect a less jerky curser movement.

# what? (leignheart, 2012-05-04 03:26)
your kidding me right? obviously you have never owned a 120hz monitor because you would not be saying you couldnt see a difference past 60hz, i mean thats like saying you couldnt tell the difference between 1080p and 1600p. go buy a 120hz monitor and you will know your wrong.

# RE: what? (David Ramsey, 2012-05-04 08:09)
There's no such thing as "1600p", fwiw...

# RE: RE: what? (leignheart, 2012-05-04 10:41)
there is no such thing as 1600p? well you better go tell Dell that because they seem to think those 30inch monitors they sell are 1600p. its called 2560x1600. i own the dell u3011. look it up, im kinda surprised that someone who is looking up the gtx 690 knows nothing about its max resolution. strange fwiw....

# RE: RE: RE: what? (David Ramsey, 2012-05-04 10:58)
I did look it up. Dell doesn't use the term "1600p" anywhere on the pages for the monitor that I saw, including in the detailed tech specs.

Do you know what the "p" in terms like "720p" actually stands for?

# RE: RE: RE: RE: what? (leignheart, 2012-05-04 11:09)
yes, it stands for progressive. the u3011 is definately not interlaced so it is progressive. monitors dont always use the term 1600p or 1200p or 1080p, they normally just use their real resolution suchs as 1920x1080, or 1920x1200, or 2560x1600 and so on. just because they dont throw on the p in the description doesnt mean it isnt progressive scan. resolutions are definately 1080p,1200p,1600p, they even have a new higher res right now at like 4k
# RE: NVIDIA GeForce GTX 690 Benchmark Performance (danwat1234, 2012-05-03 12:48)
I want my next laptop to have a video card to be as fast as this, for CUDA work mostly. Maybe in 10 years when we reach the end of transistor shrinkage, AMD integrated graphics will do it in a lightweight laptop.

# gtx 680 and gtx 690 (DigitalDemolition, 2012-05-03 13:21)
Obviously there would be alot more heat when overclocking two gtx 680's and let's be realistic. Almost anyone who would get two gtx 680's would most likely overclock, wich in turn would produce more heat. And with a reference design this would not be a good decision to overclock two reference gtx 680's. As you know the gtx 690 does not have chopped downed chips in it. it is two full working gtx 680's in one solution for around the same price. and obviously two gtx 680's would NOT obliterate a gtx 690. all that's really missing is a few hundred MHZ, wich is not really highly noticable in the long run. Now talking down the line when gtx 680's are Zotac highly overclocked and cheaper than a gtx 690, that might be a great alternative to one single gtx 690. I mention zotac, because i have shopped for many gtx and gt cards back in 2010 and 2011 and although i purchased a pny gtx 470, zotac still had the better design and fastest overclocks i have seen honestly. what this comes down to is, is the customer angry that they just bought a reference design gtx 680 and is stuck with this because they cannot afford a gtx 690, maybe you should have waited and saved some more cash??? and also, does the customer prefer two graphics cards over 1, for a few hundred dollars less than a gtx 690, and having the two gtx 680's be slightly more powerful than a single gtx 690 solution.

# [email protected] newegg (godrilla, 2012-05-04 21:19)
that's messed up, while you get wet just an fyi.

# frustrated (Lesfry, 2012-05-09 21:12)
Damm I just got a 580 classified, then the 680 came out, and now the 690 is out, Can I sli with different models of grafix cards, ex 580 with 560 ?? or 680 with 580 ? Id like to get a new 690, but you know I just got this 580 classified, and I dont want a $600 Paperweight.

# RE: frustrated (Olin Coles, 2012-05-09 21:15)
I don't believe so, but you might want to see if EVGA still offers their 'step up' program that gives you full value toward an upgrade.

# HD 7970 TOXIC CF (lost_, 2012-08-18 06:55)
Just get the crossfire 7970. Radeon > nVidia. nVidia is for all the fan boys...

# RE: HD 7970 TOXIC CF (Argos, 2012-08-18 08:14)
I used to have a few ATI/AMD Radeon cards, but I always had horrible driver problems and in game graphics bugs. I used to think that these problems were normal until I switched to Nvidia.
So I am afraid to buy Radeon now.

I have tried a few generations of Nvidia cards and all my driver problems have gone away.

So you can see why I would be hard pressed to ever go back to the video cards that gave me so much trouble.
I am not saying I will never ever try an AMD card again, But at the moment I decline.

# drivers (godrilla, 2012-08-18 08:41)
I own the gtx 690 and I'm using certified drivers from May because the recent beta drives are horrible, that said, certified drivers from May are good, but we need a update soon!

# or not (godrilla, 2012-08-18 08:50)
Also if one wanted to try 3d vision 1 or 2, use 100 watts less power of not more, likes EVGA products in general, and driver thing too.

# RE: HD 7970 TOXIC CF (SicNNasty, 2012-09-24 02:39)
Just made the switch from HD6990 to GTX690. The first time in nearly a decade that I've chosen Nvidia over AMD. I didn't really need an upgrade, but was sick and tired of buying games and actually completing them weeks before AMD shipped their crossfire optimisations... So far the way it feels is that AMD ships more powerful hardware, but this extra power is wasted by poor driver quality.

