|Radeon HD5830 DirectX-11 Gaming Performance|
|Written by Bruce Normann|
|Friday, 12 March 2010|
Radeon HD5830 DX11 Final Thoughts
Why did ATI leave such a big hole in their product line for so long? The flagship ATI video cards made a huge splash last September, but according to Mercury Research, cards costing over $200 make up only 7% of the market, while the $100-$200 range, where the 57xx series landed, accounts for 27%. That left the huge sub-$100 segment wide open, and ATI was busy filling it with all-new, DirectX-11 capable cards before turning back to this price point. Enthusiasts may laugh at the diminutive HD55xx series and the HD5450, with its 80 shaders, but they provide a much-needed revenue stream for ATI. Don't begrudge them that; it's what pays for the entire R&D effort that produced the 58xx series in the first place.
Each of the three benchmarks I used for these tests implements the new DirectX-11 features in a unique way. For instance, there is an immense difference in the visual representation of the landscape in Unigine-Heaven when switching from DX10 to DX11. It's not too far from the truth to say that "The stones come alive..." in this benchmark when tessellation is turned on. The difference is many times more impressive than any change in Anisotropic Filtering or Multi-Sample Anti-Aliasing. If this benchmark represented an actual game, my recommendation for optimal graphics performance would be to turn on DX11 with tessellation first, then adjust the MSAA level to get a playable frame rate. Of all the current benchmarks capable of displaying DX11 graphics, this one is the most dramatic demonstration of the available technology. I believe it's the best example we have today of the level of improvement in graphics design that is just over the horizon. Based on what I see here, DirectX-11 is here to stay, and it's worth slowing down some of the other processing tasks in order to take advantage of it.
The DiRT 2 Demo benchmark is a bit of an anomaly, in that the demo was released well before the actual game, and it does not showcase the true graphics capability of the final product. Codemasters was hard at work until the very last second, incorporating the new DX11 features into the product, knowing that they had the chance to be one of the first gaming titles available with DirectX-11. Unlike Heaven, which is a pure technology demonstration, the developers of DiRT 2 had a full game to code, so they were not able to fully utilize all of the techniques available in Microsoft's new graphics API. I'm sure they wish it had been available from the beginning of the project, as DX11 has a number of new tools that make life easier for the developer. That's one of the reasons most of the studios were actively approaching Microsoft to get on board as quickly as possible, a major shift in attitude from when DX10 and DX10.1 were released. Microsoft had to go out and sell those toolsets to the industry; with DX11, people were practically begging for it.
Overall, the visual impact of DirectX-11 technology on the DiRT 2 Demo is much less than what is seen in Unigine-Heaven. There was only so much the developers could fit in, given the schedule pressures. There is no switch to select DX10 or DX11 in the benchmark; it automatically falls back to DX9 if all the pieces aren't in place to use DX11. A wide variety of rendering features, including Tessellation and Cloth, are individually selectable, and most have quality levels that you can choose. There is also a handy little FPS estimator built into the graphics configuration menu that shows the approximate impact on frame rates in real time, as you make your menu picks.
S.T.A.L.K.E.R.: Call of Pripyat has a very grainy look to it, almost like old film stock. As photographers who worked with film know, there is actually a wealth of information embedded in those jagged textures that is hard to capture in a meaningful way with the regular, uniform grid of pixels in a digital image. Because the grains in film are self-organizing, they can create an additional level of detail that can only be captured by a digital image with a resolution an order of magnitude higher. Just ask anyone who has tried to scan their old photos with a brand new flatbed scanner... For those of us who grew up watching movies in the cinema instead of our living room, that "film" look triggers some deeply ingrained thought processes in our brain. It's a trick that will lose its mojo in a couple of generations, but for now, it adds some realism to this video game. If you doubt this, ask yourself why almost all the graphic artists in the gaming industry add lens flare to sunny scenes. That oblique line-up of 10-20 pale yellow disks, arranged in groups of five or six, is an artifact of the complex zoom lenses used in filmmaking. Somehow, it looks natural to most viewers, even though it is completely artificial.
It's not in my general nature to be satisfied for long; every new answer seems to raise a new question. In this case, I met the goals I had at the beginning of the project, and the questions that remain after this exercise all relate to what-if scenarios. What if FERMI had been released in 2009? What if 40nm chips had been in plentiful supply? What if the chip pricing from TSMC didn't have to make up for the cost of all the defective wafers? What if ATI had chosen the large-die strategy? What if DirectX-11 had been buggy? What if the 5830 had been able to keep all its ROP units? All of these questions are likely to go unanswered; like I said, it's not in my nature to be satisfied. I am happy, however, in anticipation of some wonderful new graphics that are still in the works right now, in studios around the world. And I'm confident that the hardware we have today will be able to take full advantage of the latest rendering techniques that will be put on display in the very near future.