|Overclocking the NVIDIA GeForce Video Card|
|Articles - Featured Guides|
|Written by Olin Coles|
|Saturday, 06 September 2008|
Alexey Nicolaychuk, also known as Unwinder, is the brilliant mastermind behind the excellent tool named RivaTuner. Much more than just an overclocking tool for video cards, RivaTuner can manage nearly every aspect of the computer's display: from multimedia and video game settings to monitor refresh rates and positioning. Once installed, the application opens to a complex assortment of tabbed menus. Even though I have used this tool for many years, the ergonomics of the program can leave the inexperienced enthusiast a little confused. For this guide, I will focus on the tools required to complete temporary overclocking on a graphics card.
Although ATITool is a past favorite because of its BIOS-level features, it lacks many of the settings RivaTuner offers for basic temporary changes. Low-level system setting changes can be completed after the hardware is manually detected or after a complete system reboot. The fan controls (illustrated in the last section) and overclocking can be completed from the system settings section, which is opened by clicking the "Customize" arrow on the far right side of the Driver settings bar.
For this article I used the ZOTAC GeForce GTX 280 AMP! Edition Video Card, which takes the NVIDIA reference values (602 MHz GPU, 1296 MHz shader, and 1107 MHz RAM) and factory overclocks them (to 700 MHz GPU, 1400 MHz shader, and 1150 MHz RAM). Most people would leave well enough alone, but I have faith in NVIDIA's chip-binning process and believe that there's plenty more performance to be squeezed from this graphics card. So after each of several very small incremental increases to the core clock (which I kept linked to the shader clock), I would return to ATITool and FurMark for artifact scanning and stress testing.
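The step-and-test loop described above can be sketched in code. This is purely illustrative: the `passes_stress_test` argument stands in for the manual FurMark/ATITool testing done by hand in this guide, and is a hypothetical placeholder, not a real API.

```python
def find_stable_clock(start_mhz, step_mhz, passes_stress_test):
    """Raise the clock in small increments until the stress test fails,
    then return the last speed that was still stable."""
    clock = start_mhz
    # Keep stepping up only while the next increment still passes testing.
    while passes_stress_test(clock + step_mhz):
        clock += step_mhz
    return clock

# Simulated stress test for illustration only: assume this particular card
# artifacts above 715 MHz, matching the result found later in this guide.
stable = find_stable_clock(700, 5, lambda mhz: mhz <= 715)
print(stable)  # 715
```

In practice each iteration of that loop is a manual cycle: apply the new clocks in RivaTuner, run the artifact scan, and back off at the first sign of trouble.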
Because this is a low-level process that modifies settings through direct access to the hardware driver, whenever my overclocking was too zealous and resulted in a crash, all I had to do was restart my computer. This happened several times, but eventually my effort paid off. After several trial-and-error experiments to find the optimal combination of GPU, shader, and memory speeds, I settled on a stable configuration. Validated with both synthetic tests and real-world gameplay, the final overclock was: 715 MHz GPU (113 MHz over stock and 15 MHz over factory OC), 1430 MHz shader (134 MHz over stock and 30 MHz over factory), and finally 1385 MHz RAM (278 MHz over stock and 235 MHz over ZOTAC's factory overclock). Was it all worth it? Wait until you see the results in the following section!
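Those deltas are simple subtraction against the reference and factory clocks quoted earlier; a quick arithmetic check confirms the figures:

```python
# Final overclock figures from this guide, checked against the
# NVIDIA reference clocks and ZOTAC's factory overclock.
stock   = {"gpu": 602, "shader": 1296, "ram": 1107}   # NVIDIA reference
factory = {"gpu": 700, "shader": 1400, "ram": 1150}   # ZOTAC AMP! Edition
final   = {"gpu": 715, "shader": 1430, "ram": 1385}   # stable overclock

for domain in final:
    over_stock = final[domain] - stock[domain]
    over_factory = final[domain] - factory[domain]
    print(f"{domain}: +{over_stock} MHz over stock, "
          f"+{over_factory} MHz over factory")
# gpu: +113 MHz over stock, +15 MHz over factory
# shader: +134 MHz over stock, +30 MHz over factory
# ram: +278 MHz over stock, +235 MHz over factory
```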
Once the maximum stable speeds for the GPU, shader, and RAM have been found and tested, it's time to make a big decision: do I keep using ATITool or RivaTuner to overclock my video card, or should I program the new settings into the video card BIOS and make the changes permanent? If I ever want to use this video card as part of an SLI set, there's only one choice: I will need to flash the settings to the video BIOS of each card. But if I only use a single video card, then the choice is purely personal.