|NVIDIA GeForce GTX 580 Video Card Performance|
|Reviews - Featured Reviews: Video Cards|
|Written by Olin Coles|
|Tuesday, 09 November 2010|
Fermi GF110 GPU Overclocking
AMD and NVIDIA already stretch their GPUs pretty thin in terms of overclocking headroom, but there's a difference between thin and non-existent. In this section, Benchmark Reviews overclocks the NVIDIA GeForce GTX 580 video card using MSI's free Afterburner utility. The MSI Afterburner "Graphics Card Performance Booster" application offers several adjustable variables for reaching the desired overclock, and allows voltage changes (increase or decrease). The aim of this project is to push the Fermi GF110 GPU as far as it can go without any extra power applied. Beginning at the stock GPU clock speed, I slowly increased the settings until I began to see screen artifacts or the Forceware driver crashed. Once I reached the most stable speeds for both the GPU and GDDR5 memory, I put the video card back into action with high-demand video games for additional benchmark tests. Here are the results:
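The step-by-step approach above amounts to a simple linear search for the highest stable clock. The sketch below is purely illustrative and not part of the Afterburner workflow: `is_stable` stands in for the manual check (watching for artifacts or a driver crash), and the clock values are hypothetical examples, except the 772 MHz stock core clock of the GTX 580.

```python
def find_max_stable_clock(stock_mhz, step_mhz, is_stable, limit_mhz=2000):
    """Raise the clock in small steps while the stability check passes,
    returning the last known-good setting."""
    clock = stock_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Stand-in stability check: assume anything past 850 MHz produces artifacts.
stable_limit = 850
best = find_max_stable_clock(772, 10, lambda mhz: mhz <= stable_limit)
print(best)  # 842
```

In practice the "stability check" is a round of stress testing at each step, which is why the process is slow and why small step sizes matter near the limit.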
GeForce GTX 580 Overclocking Results
Overclocking Summary: After re-testing the overclocked GeForce GTX 580 across eight different benchmarks, the increased clock speeds amounted to a 4.0-7.3% improvement in video frame rates. That may not seem like much of an overclock, and it isn't, but considering that the GeForce GTX 580 already sits at the very top of NVIDIA's food chain, it isn't surprising either. The recent ASUS ENGTX480 overclocking project used a refined GF100 GPU that yielded 12-17% improvements, while the AMD Radeon HD 6870 was limited to 6.5-9.3%. This reinforces the notion that higher-end processors have the least headroom, but every extra frame translates into an advantage over your enemy.
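The quoted gains are plain percentage improvements over the stock frame rates. As a quick illustration with made-up numbers (the review's per-benchmark figures aren't reproduced here):

```python
def pct_gain(stock_fps, oc_fps):
    """Percentage frame-rate improvement of the overclock over stock."""
    return (oc_fps - stock_fps) / stock_fps * 100

# Hypothetical benchmark result: 60.0 FPS at stock, 63.5 FPS overclocked.
print(f"{pct_gain(60.0, 63.5):.1f}%")  # 5.8%
```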
GeForce GTX 580 Temperatures
Benchmark tests are always nice, so long as you care about comparing one product to another. But when you're an overclocker, gamer, or merely a PC hardware enthusiast who likes to tweak things on occasion, there's no substitute for good information. Benchmark Reviews has a very popular guide on Overclocking Video Cards, which gives detailed instructions on how to tweak a graphics card for better performance. Of course, not every video card has overclocking headroom. Some products run so hot that they can't tolerate temperatures any higher than they already reach. This is why we measure the operating temperature of the video card products we test.
To begin testing, I use GPU-Z to measure the idle temperature as reported by the GPU. Next, I use FurMark's "Torture Test" to generate a maximum thermal load and record GPU temperatures in high-power 3D mode. The ambient room temperature remained stable at 20°C throughout testing, while the inner-case temperature hovered around 37°C.
FurMark does two things extremely well: it drives the thermal output of any graphics processor much higher than any video game realistically could, and it does so consistently every time. FurMark works great for testing the stability of a GPU as the temperature rises to its highest possible level. The temperatures discussed below are absolute maximum values, and not representative of real-world performance:
As a result of NVIDIA's new hardware power-monitoring circuitry, temperatures are kept to their lowest levels in many years. At first I suspected GPU load or power throttling, but there was no evidence of this on the GPU-Z histogram when we re-tested (at a 26°C ambient room temperature). Regardless, the near-ambient 32°C idle temperature and modestly warm 70°C loaded temperature are something NVIDIA should be proud of... and the competition should take notice of.
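Because the re-test ran at a warmer 26°C ambient than the original 20°C, the fairest comparison is the delta over ambient rather than the absolute reading. A minimal sketch, pairing the reported figures with the re-test ambient (the pairing here is illustrative):

```python
def delta_over_ambient(gpu_c, ambient_c):
    """How far above room temperature the GPU is running, in °C."""
    return gpu_c - ambient_c

print(delta_over_ambient(32, 26))  # idle: 6 °C above ambient
print(delta_over_ambient(70, 26))  # load: 44 °C above ambient
```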