|Gigabyte GeForce GTX 480 SOC GV-N480SO-15I|
|Reviews - Featured Reviews: Video Cards|
|Written by Bruce Normann|
|Wednesday, 22 December 2010|
Page 18 of 21
Gigabyte GTX 480 SOC Temperatures
It's hard to know exactly when the first video card got overclocked, and by whom. What we do know is that it's hard to imagine a computer enthusiast or gamer today who doesn't overclock their hardware. Of course, not every video card has the headroom. Some products run so hot that they can't tolerate any higher temperatures than they generate straight from the factory. This is why we measure the operating temperature of the video card products we test.
To begin testing, I use GPU-Z to measure the temperature at idle as reported by the GPU. Next I use FurMark 1.8.2 to generate maximum thermal load and record GPU temperatures at high-power 3D mode. The ambient room temperature remained stable at 24C throughout testing. I have a ton of airflow into the video card section of my benchmarking case, with a 200mm side fan blowing directly inward, so that helps alleviate any high ambient temps.
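GPU-Z is a GUI tool, but the same idle/load polling routine can be scripted. As a hypothetical illustration only (not part of the review's actual methodology), here is a minimal Python sketch that reads the GPU core temperature via NVIDIA's `nvidia-smi` command-line utility and records the minimum and maximum seen over a stress run; the sampling counts shown are arbitrary assumptions:

```python
import subprocess
import time

def read_gpu_temp():
    """Read the GPU core temperature in degrees C via nvidia-smi
    (a scriptable stand-in for GPU-Z, which is GUI-only)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip())

def log_temps(n_samples, interval_s, read=read_gpu_temp):
    """Poll the temperature n_samples times, interval_s seconds apart,
    and return the (min, max) observed -- e.g. log_temps(180, 10)
    covers a 30-minute FurMark run at one reading every 10 seconds."""
    samples = []
    for _ in range(n_samples):
        samples.append(read())
        time.sleep(interval_s)
    return min(samples), max(samples)
```

The `read` parameter is injectable, so the same logger could wrap any other temperature source.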
The Gigabyte GTX 480 SOC video card recorded 38C in idle mode, and increased to 77C after 30 minutes of stability testing in full 3D mode, at 1920x1200 resolution, and the maximum MSAA setting of 8X. With the fan set on Automatic, the speed rose to 57% under full load. The idle fan speed is a relatively high 48%, which is fine because the three fans are pretty much inaudible at that setting. I then did a run with manual fan control and 100% fan speed. I was rewarded with a modest increase in fan noise and a reduced load temperature of 71C.
77C may not seem like a very good result for temperature stress testing, but in comparison to a stock GTX 480 it's darn good. The first batch of GTX 480 cards got up to 93C when Benchmark Reviews tested them, and a later model from ASUS hit 82C under load with an ambient of 20C. I've become used to seeing video card manufacturers keep fan speeds low, especially with radial blower wheels, but Gigabyte takes advantage of its Windforce 3X fan design to keep the idle speed up. Unless you've got the luxury, and the maniacal streak needed, to play video games 24 hours a day, your graphics card spends a lot of time idling while you're at work. With this card, the fan controller holds the idle speed at 48% and your card stays cool during the off-hours. There is definitely some thermal benefit to running the fan harder, as you can see from the 100% fan results above, and the increase in noise is not too bad at full tilt. Most users will not have to make custom software profiles to optimize the fan speeds on this non-reference design.
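For readers curious how a fan controller might map temperature to duty cycle, here is a hypothetical sketch fitted through the two points actually measured above (48% at 38C idle, 57% at 77C load). Gigabyte does not publish the real firmware curve, so the linear shape and the numbers are illustrative assumptions only:

```python
def fan_duty(temp_c, idle=(38, 48), load=(77, 57)):
    """Hypothetical linear fan curve through two measured points:
    (38C, 48% duty) at idle and (77C, 57% duty) under load.
    Clamped to the 48% idle floor and a 100% ceiling; the card's
    actual controller logic is not published."""
    t0, d0 = idle
    t1, d1 = load
    duty = d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return int(round(max(d0, min(100, duty))))
```

The shallow slope implied by those two points (about 0.23% duty per degree) is consistent with the quiet ramp-up observed in testing.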
Load temps got up to a maximum of 73C when running continuous gaming benchmarks, with automatic fan speeds ramping up to 51% on the most challenging titles. This is fairly close to the stress-test maximums, so despite all the industry protests about using an extreme tool like FurMark for stress testing, it does a good job of emulating a real-world graphics load, IMHO. That temperature is higher than I like to see, but the chip can obviously take it.
FurMark is an OpenGL benchmark that heavily stresses and overheats the graphics card with fur rendering. The benchmark offers several options allowing the user to tweak the rendering: fullscreen / windowed mode, MSAA selection, window size, and duration. The benchmark also includes a GPU Burner mode (stability test). FurMark requires an OpenGL 2.0 compliant graphics card with lots of GPU power!
FurMark does two things extremely well: it drives the thermal output of any graphics processor higher than any other application or video game, and it does so consistently every time. While FurMark is not a true benchmark tool for comparing different video cards, it still works well for comparing one product against itself with different drivers or clock speeds, or for testing the stability of a GPU, since it raises temperatures higher than any other program. But in the end, it's a rather limited tool.
In our next section, we discuss electrical power consumption and learn how well (or poorly) each video card will impact your utility bill...