|ZOTAC GeForce GTX 280 AMP! Edition Video Card|
|Reviews - Featured Reviews: Video Cards|
|Written by Olin Coles|
|Monday, 30 June 2008|
World in Conflict Benchmark Results
The latest version of Massive's proprietary MassTech engine utilizes DX10 technology, features advanced lighting and physics effects, and allows a full 360-degree range of camera control. The MassTech engine also scales down to accommodate a wide range of PC specifications; if you've played a modern PC game within the last two years, you'll be able to play World in Conflict.
World in Conflict's FPS-like control scheme and 360-degree camera make its action-strategy gameplay accessible to strategy fans and fans of other genres alike... if you love strategy, you'll love World in Conflict. If you've never played a strategy game, World in Conflict is the one to try.
World in Conflict offers an in-game benchmark, which records the minimum, average, and maximum frame rates during the test. Very recently another hardware review website asserted that these tests are worthless, but we couldn't disagree more. When used to compare video cards that depend on the same driver and use the same GPU architecture, the in-game benchmark works very well and the comparisons are apples-to-apples.
First we tested WiC at the 1024x768 resolution, which is representative of the (very few) gamers using a 17" LCD monitor with this widescreen-preferred video game. Based on the test results charted below, it's clear that WiC doesn't cap the maximum frame rate (a cap would conserve power), which is good for full-spectrum benchmarks like ours but bad for electricity bills. The average frame rate is shown in each chart, but our initial results are so close that it becomes obvious WiC doesn't ask much from the graphics card at low resolutions. That's okay, because we've got three more resolutions to offer.
At 1024x768 the 9800 GTX was ahead of the Sapphire Radeon HD 4850 by a single FPS, but at 1280x1024 the positions and results are exactly reversed. The CrossFireX set of HD 4850s is just a step behind the average frame rate of the GeForce 9800 GX2. Ultimately the overclocked ZOTAC GTX 280 secured the lead with an average frame rate of 69 FPS, but a 3 FPS lead over the GeForce 9800 GTX is not exactly impressive.
Moving up a small step to the 1680x1050 widescreen resolution, the trends hold to the same ratios they have shown for the past two tests. The ZOTAC GeForce GTX 280 holds its ground and drops only 2 FPS, which results in a slim 2 FPS lead over the 9800 GX2. The Foxconn GeForce 9800 GTX is neck-and-neck with the Sapphire Radeon HD 4850, and the CrossFireX setup is in line with the 9800 GX2 and both GTX 280s.
With a balanced demand for CPU and GPU power, World in Conflict only begins to place demands on the graphics processor at the 1920x1200 resolution. I was expecting more of the same, and that is pretty much exactly what I got.
The performance decay hit the mid-level video cards hardest: the GeForce 9800 GTX and Radeon HD 4850, which for all intents and purposes performed exactly the same throughout our entire WiC testing. Two HD 4850s in a CrossFireX configuration yield a 46% improvement over a single card, matching the performance of our reference NVIDIA GeForce GTX 280. The GeForce 9800 GX2 barely moved two full frames per second as it worked without effort from 0.79 MP up to 2.3 MP.
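The megapixel figures above and the CrossFireX scaling claim are simple arithmetic; a quick sketch of both calculations (the 40/58.4 FPS pair is a hypothetical illustration of 46% scaling, not a charted result):

```python
# Convert the four tested resolutions to megapixels, as referenced above.
resolutions = [(1024, 768), (1280, 1024), (1680, 1050), (1920, 1200)]
for w, h in resolutions:
    print(f"{w}x{h}: {w * h / 1_000_000:.2f} MP")
# 1024x768 works out to 0.79 MP and 1920x1200 to 2.30 MP.

def scaling_percent(single_fps, dual_fps):
    """Percent improvement of a dual-GPU result over a single card."""
    return (dual_fps / single_fps - 1) * 100

# Hypothetical example: a 40 FPS single-card result scaling to 58.4 FPS
# in CrossFireX corresponds to the 46% improvement cited above.
print(round(scaling_percent(40.0, 58.4)))  # → 46
```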
Taking a broader look at the average frame rates, there appears to be a major difference between the mid-range and high-end video cards when it comes to World in Conflict. This game offers DirectX 10 functionality, which could lend itself to taxing the CrossFireX, 9800 GX2, and GTX 280 graphics cards more heavily. In our testing, only the mid-level GeForce 9800 GTX and Sapphire Radeon HD 4850 demonstrate a performance decay as the resolution is raised.
In our next section, we discuss electrical power consumption and temperature levels for these products. Learn how much (or how little) each video card will impact your utility bill...