Review Websites Discover AMD Driver Reduces Image Quality
Written by NVIDIA - Nick Stam   
Friday, 19 November 2010


According to NVIDIA: PC gaming enthusiasts understand that image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality is important; if it were not, we'd all be playing at 1024x768 with no AA!

Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings uncovered recently by top technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in AMD's Catalyst 10.10 default driver settings caused an increase in performance and a decrease in image quality. These changes in AMD's default settings do not permit a fair apples-to-apples comparison to NVIDIA default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.

What Editors Discovered
Getting directly to the point, the major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the "High" Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default "Quality" setting, in order to provide image quality that comes close to NVIDIA's default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. According to ComputerBase, AMD obtains up to a 10% performance advantage by lowering its default texture filtering quality.

AMD's optimizations weren't limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA's "default" driver settings.

Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to "High", not the default "Quality" mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).

Filter Tester Observations
Readers can observe AMD GPU texture shimmering very visibly in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its "ground2" texture (located in the Program Files/3DCenter Filter Tester/Textures directory), and texture movement parameters were set to -0.7 in both X and Y directions with 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application, where the GPU under test is on the left side, and the "perfect" software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended). NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, thus the "texture movement" settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver also has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.

AF Tester Observations
ComputerBase also says that AMD drivers appear to treat games differently than the popular "AF Tester" (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.

NVIDIA's own driver team has verified specific behaviors in AMD's drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of "larger" and "smaller" varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.
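The reported window-size behavior can be sketched as a simple threshold check. This is a hypothetical illustration of the heuristic as described above; the function name is invented, and only the 500-pixel (DX10) and 1000-pixel (DX9) figures come from the paragraph itself:

```python
def filtering_optimizations_enabled(api, width, height):
    """Illustrative model of the reported behavior: texture filtering
    optimizations are disabled when the window is smaller than a
    per-API threshold on a side, and enabled otherwise."""
    threshold = 500 if api == "DX10" else 1000  # DX9 tools face the higher limit
    return min(width, height) >= threshold

# A small DX9 test window like AF Tester's falls under the limit,
# so optimizations would be off and IQ would look better than in games:
print(filtering_optimizations_enabled("DX9", 512, 512))    # False
# A typical DX10 game window exceeds 500px per side:
print(filtering_optimizations_enabled("DX10", 1024, 768))  # True
```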

FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When this was recently exposed publicly, AMD finally provided a user-visible control panel setting to enable/disable the demotion, but it remains enabled by default. Reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2 should uncheck the "Enable Surface Format Optimization" checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
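The "half the size" claim follows directly from the bit counts of the two formats. A quick back-of-the-envelope check, using the standard format definitions (nothing AMD-specific):

```python
# An FP16 render target stores four 16-bit float channels (RGBA);
# R11G11B10 packs three small floats (11+11+10 bits) into one 32-bit word.
fp16_rgba_bits = 4 * 16          # 64 bits per pixel
r11g11b10_bits = 11 + 11 + 10    # 32 bits per pixel

print(fp16_rgba_bits // 8, "bytes vs", r11g11b10_bits // 8, "bytes per pixel")
# Half the footprint and bandwidth, but the alpha channel is gone and
# each channel keeps fewer mantissa bits (6 or 5 vs FP16's 10), hence
# the reduced accuracy.
```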

A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons from GeForce FX and 3DMark03 optimizations gone bad, and vowed never again to perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance but did not alter IQ was a valid "optimization", and any optimization that improved performance but lowered IQ, without letting the user know, was a "cheat". Special-casing of testing tools should also be considered a "cheat".

Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences -- NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.

We are glad that multiple top tech sites have published their comparative IQ findings. If NVIDIA published such information on our own, without third-party validation, much of the review and technical community might just ignore it. A key goal of this blog post is not to point out cheats or "false optimizations" in our competitor's drivers. Rather, it is to get everyone to take a closer look at AMD's image quality in games, and to test our products fairly against AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as you will not get image quality results that correspond with game behavior.

AMD promotes "no compromise" enthusiast graphics, but it seems multiple reviewers beg to differ.

We have had internal discussions about whether we should abandon our policy of never reducing image quality behind your back, as AMD is doing. We believe our customers would rather we focus our resources on maximizing performance and providing an awesome, immersive gaming experience without compromising image quality, than have us engage in a race to the IQ gutter with AMD.

EDITOR'S NOTE: This is a disturbing article, and the sources here are critical for legitimacy. NVIDIA is a direct competitor to AMD and is the author of this article, which may lead some readers to ignore the message. However, it was several independent review websites that first brought this issue to the forefront and proved it exists. I personally trust these websites, particularly 3DCenter.org, and have found them to be unbiased over the years.

Benchmark Reviews can confirm that issues with filtering still exist, and we pointed this out in our Radeon HD 6850 and Radeon HD 6870 launch articles. We also made it public that certain AMD partners were sending 'juiced' video card samples to review sites, ours included, with details published in our 1120-Core "Fixed" Radeon HD 6850 Review Samples Shipped to Media article. So could this be AMD's last-ditch effort to compete with NVIDIA by manipulating performance?



Comments 

 
# What's the f@cking deal? (troll, 2010-11-21 07:18)
If there's an option to turn off the optimizations, then what's the problem? I'm sure that anyone who was unhappy with the image quality and ready to spend 2 minutes on Google would be able to figure out how to set things back to default...
 
 
# RE: What's the f@cking deal? (Max, 2010-11-22 09:44)
There's an option to turn it off, but by default it is enabled and well hidden in CCC by AMD; that's kind of a trick. Most people, if not all, don't know that the option exists or even what it is for. They just think that's how the games are supposed to look and blame the games for not being well made.
 
 
# just moar nvidia trolln' (QUINTIX, 2010-11-21 16:44)
AMD already responded to the R11G11B10 issue here
##atomicmpc.com.au/Feature/232215,ati-cheating-benchmarks-and-degrading-game-quality-says-nvidia.aspx/2
...and as for AF, well I can't find the article right now, but nvidia complained about this a long time ago, with ignorant PR folks claiming that Nvidia applied AF to every pixel.

Just look at this article
##anandtech.com/show/3988/the-use-of-evgas-geforce-gtx-460-ftw-in-last-nights-review
"Let's start with the obvious. NVIDIA is more aggressive than AMD with trying to get review sites to use certain games and even make certain GPU comparisons. When NVIDIA pushes, we push back..."

"NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates." Bull; nvidia cares FAR more about what reviewers say about them than actual end user experiences.
 
 
# Ati/AMD Caught red handed (Gamer0001, 2010-11-21 18:26)
First the extra-core-enabled cards sent to reviewers only, and now this driver fiasco, confirmed by respected independent sites. Nice job ati...
 

Comments have been disabled by the administrator.
